Limit Input Token in chat #1775
Replies: 3 comments 2 replies
-
This depends on how strictly you need to stay below 100 tokens. The cheap and easy approach is to use your model's average tokens-per-word ratio: count the words as the user types and cap the input once the estimate exceeds 100 tokens. Alternatively, debounce requests to the model's tokenizer API as the user types, check the exact token count, and cap the input once it exceeds 100 tokens.
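The first (estimate-based) approach could be sketched roughly like this. The `AVG_TOKENS_PER_WORD` constant of 1.3 is only a common rough figure for English text with GPT-style tokenizers, and the function names here are hypothetical, not part of Flowise:

```typescript
// Rough average for English text with BPE tokenizers (an assumption,
// not an exact value; measure against your actual model's tokenizer).
const AVG_TOKENS_PER_WORD = 1.3;

// Estimate how many tokens a piece of text will use, from its word count.
function estimateTokens(text: string): number {
  const words = text.trim().split(/\s+/).filter(Boolean);
  return Math.ceil(words.length * AVG_TOKENS_PER_WORD);
}

// Cap the input so its estimated token count stays at or below maxTokens,
// by keeping only as many leading words as the budget allows.
function capInputByEstimate(text: string, maxTokens: number): string {
  const words = text.trim().split(/\s+/).filter(Boolean);
  const maxWords = Math.floor(maxTokens / AVG_TOKENS_PER_WORD);
  return words.slice(0, maxWords).join(" ");
}
```

In a chat UI you would call `estimateTokens` on every input event and either block further typing or run `capInputByEstimate` once the estimate crosses the limit.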
-
I would like to at least limit input to, say, 1024 tokens or so. Right now it seems users can paste in arbitrarily large amounts of text, which is prone to abuse. I found this out after I dumped a 67k-character Wikipedia page into the chat and it happily devoured it...
-
Hi @Jaredude @Daniel-DDV, I've implemented this in pull request FlowiseAI/FlowiseChatEmbed#179. The feature lets you set a limit on the number of words a user can enter in the chatbot; please wait for it to be approved.
-
In Flowise, I want to add a feature that limits how much text users can enter. If a user enters more than 100 tokens, the input should automatically be truncated to the first 100 tokens. Does anyone have ideas on how to achieve this?