[BUG] When used for long periods of time, responses become truncated #519
Comments
It's because of a 4000-token cutoff.
At that point, it only has an 800-token buffer.
You can ask it to continue.
I can add the ability to define the buffer space if necessary.
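A minimal sketch of the budget math described above, assuming the numbers from this thread (a 4000-token context window with 800 tokens reserved for the reply); the function name and the crude arithmetic are illustrative, not the project's actual code:

```python
MAX_TOKENS = 4000      # assumed total context window (from the thread)
REPLY_BUFFER = 800     # assumed space reserved for the model's reply

def prompt_budget(history_tokens: int) -> int:
    """Tokens left for new prompt text after history and the reply buffer."""
    return MAX_TOKENS - REPLY_BUFFER - history_tokens

# As accumulated history grows, the budget shrinks; once it nears zero,
# replies start getting cut off unless old history is dropped.
print(prompt_budget(0))     # 3200
print(prompt_budget(3000))  # 200
```

This is why long-running sessions truncate: the history eats into the fixed window until almost nothing is left for the answer.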
Could you add a way to delete some of the previous tokens, or allow it to free up the needed token space (I guess just deleting an old token for every new one it writes)? A configurable buffer size would also be a helpful setting, as right now I'm not using much RAM at all.
Also, thanks for replying so quickly; it's 4 AM for me, so I didn't expect a response lmao.
On the latest commit, you can define the buffer space (in tokens) when initializing the Chatbot class. You can also remove history manually with
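The rolling-deletion idea from the earlier comment (drop old content as new content arrives) can be sketched as follows. The thread does not show the Chatbot class's actual method names, so `trim_history` and `count_tokens` here are hypothetical stand-ins:

```python
from collections import deque

def count_tokens(text: str) -> int:
    # Crude stand-in tokenizer: roughly 1 token per 4 characters.
    # The real project would use the model's actual tokenizer.
    return max(1, len(text) // 4)

def trim_history(history: deque, budget: int) -> None:
    """Pop the oldest messages until the conversation fits the token budget."""
    while history and sum(count_tokens(m) for m in history) > budget:
        history.popleft()

history = deque(["hello world " * 50, "short reply", "another message"])
trim_history(history, budget=100)
# Only the newer, smaller messages survive the trim.
```

Trimming from the oldest end keeps recent context intact, which is usually what a chatbot needs most.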
Thx |
Problem with loading convos: for some reason it comes up with an error when trying to ask a question. A problem with prompt loading, maybe?
I loaded the conversation from a dumped JSON, so that's probably part of it.
It seems I have made some mistakes with saving/loading conversations. I haven't tested it; will do tomorrow.
Edit: Never mind. I can't subtract if the value is None; I need more checking code. Fixing now...
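The "can't subtract if None" crash above can be illustrated with a small sketch. A conversation loaded from a dumped JSON may be missing a field, so arithmetic on it raises a `TypeError`; guarding with a default avoids the crash. The field name `used_tokens` is hypothetical, chosen only for illustration:

```python
import json

def remaining_tokens(conversation: dict, max_tokens: int = 4000) -> int:
    # A dumped/loaded convo may lack this field entirely, yielding None.
    used = conversation.get("used_tokens")
    if used is None:
        used = 0  # default to zero instead of subtracting None
    return max_tokens - used

loaded = json.loads('{"messages": []}')  # dumped convo missing the counter
print(remaining_tokens(loaded))  # 4000, instead of a TypeError
```

Without the `is None` check, `max_tokens - None` is exactly the kind of failure described in the edit above.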
Description
When using a single ChatBot object instance for a while, I notice its responses become strangely cut off.
Steps to Reproduce
I don't know what the exact issue is. I'm running it through a Discord bot for users to chat through, but after a while its responses become cut off. I am using the "official" version with the non-browser-hosted version of the AI.
Expected behavior
Its responses don't get cut off.