Merge pull request #85 from alipay/dev_chongshi_2
Improvement: tweak comments in the prompt_util file
LandJerry authored Jun 14, 2024
2 parents 3bb6088 + 1d69041 commit 8276b71
Showing 1 changed file with 2 additions and 2 deletions: agentuniverse/base/util/prompt_util.py
@@ -164,8 +164,8 @@ def process_llm_token(agent_llm: LLM, lc_prompt_template, profile: dict, planner
 
     input_tokens = agent_llm.max_context_length() - agent_llm.max_tokens
     if input_tokens <= 0:
-        raise Exception("The `max_tokens` in the llm configuration is the maximum output number of tokens, "
-                        "the current `max_tokens` is greater than the context length of the LLM model.")
+        raise Exception("The current output max tokens limit is greater than the context length of the LLM model, "
+                        "please adjust it by editing the `max_tokens` parameter in the llm yaml.")
 
     if prompt_tokens <= input_tokens:
         return
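For context on the check this message guards: `max_tokens` in the llm yaml reserves *output* tokens, so the budget left for the prompt is `max_context_length() - max_tokens`, and the exception fires when that budget is non-positive. Below is a minimal, runnable sketch of that logic; `DummyLLM`, `check_input_budget`, and the numeric values are illustrative stand-ins, not part of the repository.

```python
class DummyLLM:
    """Illustrative stand-in for the LLM type passed to process_llm_token."""

    def __init__(self, max_tokens: int, context_length: int):
        # `max_tokens` caps the output tokens the model may generate.
        self.max_tokens = max_tokens
        self._context_length = context_length

    def max_context_length(self) -> int:
        # Total tokens the model can handle (input + output combined).
        return self._context_length


def check_input_budget(agent_llm: DummyLLM) -> int:
    # Tokens left for the prompt once the output reservation is subtracted.
    input_tokens = agent_llm.max_context_length() - agent_llm.max_tokens
    if input_tokens <= 0:
        raise Exception(
            "The current output max tokens limit is greater than the context "
            "length of the LLM model, please adjust it by editing the "
            "`max_tokens` parameter in the llm yaml.")
    return input_tokens


# A 4096-token context reserving 1000 output tokens leaves 3096 for the prompt.
assert check_input_budget(DummyLLM(max_tokens=1000, context_length=4096)) == 3096

# Reserving more output tokens than the whole context raises the new error.
try:
    check_input_budget(DummyLLM(max_tokens=8192, context_length=4096))
except Exception as exc:
    print(exc)
```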
