'request timeout' error occurred during the text segmentation and cleaning process in the knowledge base #11694
Comments
Please provide the container logs.
I found some similar threads that might help you resolve the 'request timeout' error during the text segmentation and cleaning process. These threads suggest several steps you can take to troubleshoot and potentially resolve the timeout issue. You might want to start by checking the status of your services and ensuring that your system meets the necessary requirements for handling large file uploads.
api and worker
@crazywoola
It seems something is wrong with the Tongyi models: the requests exceed their rate limit, which we cannot fix for you.
That means you need to upgrade your machine.
That can't be it! This machine is quite high-spec, with 128GB of memory and 4 RTX 3090 GPUs, each with 24GB of VRAM. It isn't running any other programs, only Dify.
As indicated in the logs, there is no queuing mechanism, so increasing UPLOAD_FILE_BATCH_LIMIT makes the uploaded files be processed in parallel. Since UPLOAD_FILE_BATCH_LIMIT has been raised 100-fold, from 5 to 500, the model's rate limits and the available memory would also need to grow roughly 100-fold. It is recommended to register documents in the knowledge base sequentially via the API, as sketched below.
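Below is a minimal sketch of what sequential registration through the knowledge (dataset) API could look like, uploading one PDF at a time instead of 60 in a single batch. The base URL, dataset ID, API key, folder path, and pause length are placeholders, and the create-by-file endpoint path and its data/file multipart fields are assumptions that should be verified against the API reference shown in your Dify console.

```python
# Sketch: sequential (one-at-a-time) document registration via the Dify knowledge API.
# Assumptions to verify against your Dify console's API reference:
#   - endpoint path: /v1/datasets/{dataset_id}/document/create-by-file
#   - multipart fields: "data" (JSON string) and "file"
import json
import time
from pathlib import Path

import requests

DIFY_BASE_URL = "http://localhost/v1"     # self-hosted Dify API base URL (placeholder)
DATASET_ID = "your-dataset-id"            # placeholder
DATASET_API_KEY = "dataset-xxxxxxxx"      # knowledge API key (placeholder)


def upload_one(pdf_path: Path) -> dict:
    """Upload a single PDF and return the API response as a dict."""
    url = f"{DIFY_BASE_URL}/datasets/{DATASET_ID}/document/create-by-file"
    headers = {"Authorization": f"Bearer {DATASET_API_KEY}"}
    # "automatic" lets Dify apply its default segmentation/cleaning rules.
    payload = {
        "indexing_technique": "high_quality",
        "process_rule": {"mode": "automatic"},
    }
    with pdf_path.open("rb") as f:
        resp = requests.post(
            url,
            headers=headers,
            data={"data": json.dumps(payload)},
            files={"file": (pdf_path.name, f, "application/pdf")},
            timeout=600,  # large PDFs can take a while to parse
        )
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    for pdf in sorted(Path("./pdfs").glob("*.pdf")):
        print(f"Uploading {pdf.name} ...")
        result = upload_one(pdf)
        print("  created document:", result.get("document", {}).get("id"))
        # Pause between files so segmentation/embedding of one document can
        # finish before the next one starts, keeping the load sequential.
        time.sleep(5)
```

Uploading sequentially keeps only one document in the segmentation and embedding pipeline at a time, so the embedding model's rate limit and the host's memory are not hit by hundreds of parallel indexing tasks.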
Self Checks
Dify version
0.13.2
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
I changed UPLOAD_FILE_SIZE_LIMIT to 1024 and UPLOAD_FILE_BATCH_LIMIT to 500, then uploaded 60 PDF files, each between 20MB and 160MB in size. After the upload completed, a 'request timeout' error occurred during the text segmentation and cleaning process. How can I resolve this? The error is as follows:
✔️ Expected Behavior
No response
❌ Actual Behavior
No response