
how to reduce prompt? #2469

Closed
luoshu999 opened this issue Apr 19, 2023 · 6 comments

@luoshu999

Command browse_website returned: Error: This model's maximum context length is 8191 tokens, however you requested 12227 tokens (12227 in your prompt; 0 for the completion). Please reduce your prompt; or completion length?

@Notarin

Notarin commented Apr 19, 2023

Hi! What is your environment's configuration? Have you set up a .env file? If so, what is your token limit configured to?

@TheGreenMonkeys

I have the same issue...
openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens. However, you requested 39997 tokens (964 in the messages, 39033 in the completion). Please reduce the length of the messages or completion.
Press any key to continue...
I actually played around with these two parameters in the .env file, but increasing them doesn't work, since the limitation comes from the model itself:
FAST_TOKEN_LIMIT=4000
SMART_TOKEN_LIMIT=8000

I also tried to make Auto-GPT change the code of a cloned copy of itself, and to directly improve its own code, but it always fails with an error like luoshu999's.
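Since raising the .env limits can't get past the model's hard context window, the usual workaround is to trim the message history before sending the request. A minimal sketch of that idea (the `estimate_tokens` heuristic of ~4 characters per token and the message structure here are assumptions for illustration, not Auto-GPT's actual implementation):

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate: ~4 characters per token for English text.
    (Assumption; a real implementation would use a proper tokenizer.)"""
    return max(1, len(text) // 4)

def trim_messages(messages: list[dict], token_limit: int) -> list[dict]:
    """Keep the first (system) message plus as many of the most recent
    messages as fit within token_limit; drop the oldest middle ones."""
    if not messages:
        return []
    system, rest = messages[0], messages[1:]
    budget = token_limit - estimate_tokens(system["content"])
    kept = []
    for msg in reversed(rest):  # walk newest-first
        cost = estimate_tokens(msg["content"])
        if cost > budget:
            break  # oldest remaining messages are dropped
        kept.append(msg)
        budget -= cost
    return [system] + list(reversed(kept))
```

This keeps the request under the model's context window at the cost of forgetting the oldest turns, which is roughly what any fix on the Auto-GPT side has to do.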

@timlincool

timlincool commented Apr 19, 2023

NEXT ACTION: COMMAND = search_files ARGUMENTS = {'directory': '/home/123/Auto-GPT/auto_gpt_workspace/code'}

openai.error.InvalidRequestError: This model's maximum context length is 8191 tokens, however you requested 157096 tokens (157096 in your prompt; 0 for the completion). Please reduce your prompt; or completion length

Ubuntu 20.04 on latest master

@Jpsanchezasmalljob

Same here:
openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens. However, you requested 13997 tokens (1355 in the messages, 12642 in the completion). Please reduce the length of the messages or completion.

@Jens-AIMLX

same here

@Pwuts
Member

Pwuts commented Apr 22, 2023

This has been resolved for browse_website with #2542. search_files is still broken, and tracked here:

@Pwuts Pwuts closed this as completed Apr 22, 2023
7 participants