File Support For Qwen-long #10576
I couldn't find specific information about Dify's support for file-id mode with Qwen-Long for handling large token inputs. However, there is a related discussion about errors when using large language models that might offer insights or similar experiences from other users [1], and a closed issue related to token limits that might also be relevant [2]. If you're running into token limits, you may want to check the configuration settings in Dify that can be adjusted to support larger token inputs. Specifically, you can modify parameters like
There are a few things you might have misunderstood.
Self Checks
Dify version
0.11.0
Cloud or Self Hosted
Cloud
Steps to reproduce
1. Add an LLM node using Qwen-Long.
2. Pass a long text into the system prompt.
3. Trigger the node; it fails.

Qwen-Long supports a 10,000K (10M) token context, and according to the documentation, if the input text is longer than 1M tokens the call fails and the error suggests using file-id mode to enable large token inputs. However, Dify's LLM node does not appear to support this mode.
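For reference, outside of Dify the file-id workflow described in the Qwen-Long docs goes through the OpenAI-compatible DashScope endpoint: upload the document once with `purpose="file-extract"`, then reference it via a `fileid://` system message instead of pasting the raw text inline. A minimal sketch (the file name and the question are placeholders; the API key is assumed to come from the DashScope console):

```python
from __future__ import annotations


def build_fileid_messages(file_id: str, question: str) -> list[dict]:
    # Qwen-Long reads the uploaded file when the system message carries a
    # "fileid://" reference, bypassing the ~1M-token inline input limit.
    return [
        {"role": "system", "content": f"fileid://{file_id}"},
        {"role": "user", "content": question},
    ]


def main() -> None:
    # Lazy import so the helper above stays dependency-free.
    from openai import OpenAI  # OpenAI-compatible client, per the DashScope docs

    client = OpenAI(
        api_key="YOUR_DASHSCOPE_API_KEY",  # assumption: key from the DashScope console
        base_url="https://dashscope.aliyuncs.com/compatible-mode/v1",
    )
    # Upload once, then reference by id in every subsequent call.
    uploaded = client.files.create(
        file=open("long_document.txt", "rb"),  # placeholder path
        purpose="file-extract",
    )
    completion = client.chat.completions.create(
        model="qwen-long",
        messages=build_fileid_messages(uploaded.id, "Summarize the document."),
    )
    print(completion.choices[0].message.content)


if __name__ == "__main__":
    main()
```

This is what the error message points at; the issue here is that Dify's LLM node only sends the prompt text inline and exposes no way to upload a file and pass a `fileid://` reference.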
✔️ Expected Behavior
Running Qwen-Long with large token inputs succeeds.
❌ Actual Behavior
It fails with large token inputs.