Multiple uploads resulting in CANCELED #25756
Please try again with the network tab open and inspect the underlying failure: Ctrl+Shift+I > Network > then retry the upload.
I checked the debugger and saw that I am triggering the timeout at https://github.com/matrix-org/matrix-js-sdk/blob/acbcb4658a5d5903dfd557e3e115241d0a6f38bb/src/http-api/index.ts#L102-L111. The upload itself only takes a few seconds, but the progress bar sits at 100% for a while before the timeout kills it. Attached: Element Desktop HAR. You can read through the linked Synapse issue to get up to speed.
That implies that there was no XHR progress for 30 seconds; https://github.com/matrix-org/matrix-js-sdk/blob/acbcb4658a5d5903dfd557e3e115241d0a6f38bb/src/http-api/index.ts#L102C24-L102C24 resets the cancellation timeout on any progress. I would argue that if there is no progress on the upload for 30s, then something is up with the connection.
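The timeout behaviour described above can be sketched as a resettable idle deadline. This is NOT the actual matrix-js-sdk code, just an illustration of the mechanism: the abort timer is pushed back on every progress event, so the request is cancelled only after the full timeout elapses with no progress at all. The class name and wiring are assumptions.

```typescript
// Illustrative value matching the 30s discussed in this thread.
const IDLE_TIMEOUT_MS = 30_000;

class IdleDeadline {
    private timer?: ReturnType<typeof setTimeout>;

    constructor(
        private readonly ms: number,
        private readonly onExpire: () => void,
    ) {}

    /** (Re)start the countdown; would be called from xhr.upload.onprogress. */
    reset(): void {
        clearTimeout(this.timer);
        this.timer = setTimeout(this.onExpire, this.ms);
    }

    /** Stop the countdown once the request completes. */
    clear(): void {
        clearTimeout(this.timer);
    }
}

// Wiring it to an XHR upload would look roughly like:
//   const deadline = new IdleDeadline(IDLE_TIMEOUT_MS, () => xhr.abort());
//   xhr.upload.onprogress = () => deadline.reset();
//   xhr.onloadend = () => deadline.clear();
//   deadline.reset();
//   xhr.send(body);
```

Under this model, an upload that reaches 100% and then waits silently for the server response will trip the deadline, because no further progress events arrive.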
I saw that the uploaded file appears in the media store which means that Synapse received the file correctly. Is there something with the communication between Synapse and Element around upload completion that would lead to this canceled issue? |
Yes, it seems like either Synapse, your reverse proxy, or any proxies in between hang for >30s at the end of the upload, neither sending any progress nor closing the connection and sending a status code.
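For reference, one proxy-side behaviour that can produce exactly this symptom is nginx request buffering: with `proxy_request_buffering on` (the default), nginx reads the whole body before forwarding it upstream, so the client's progress bar hits 100% while nginx is still streaming the file to Synapse, and no further progress events reach the client. A hedged example of directives worth checking for the media endpoint (values and upstream name are illustrative, not taken from the reporter's config):

```nginx
location /_matrix/media/ {
    client_max_body_size 100M;      # allow large uploads through nginx
    proxy_request_buffering off;    # stream the body to Synapse instead of
                                    # buffering it fully first; with buffering
                                    # on, the client sits at 100% while nginx
                                    # is still forwarding
    proxy_read_timeout 60s;         # time allowed for Synapse's response
    proxy_send_timeout 60s;
    proxy_pass http://synapse;      # assumed upstream name
}
```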
Okay, I'll do some more digging into my setup.
Okay, I have some new logs to support my position that this is an Element issue. Here's the final part of the log, where the client cancels the upload because it hits its internal timeout:
Here are the important bits. The log shows that the client prematurely closed the connection:
We already know this is happening, so nothing new here. But take a look at this line right before the
If I understand Nginx debug logging correctly, the server is still in the process of receiving the request body when the client closes the connection.
@Cyberes, while you have the logging enabled, could you compare it to a working single-file upload?
Here you go. It's the complete process, all 3000 lines. (Hopefully I removed all my personal information.)
There is one line that shows
Which I don't think matters, since Nginx sent
Could there be something weird going on where the client isn't cleanly closing the connection, and it's only causing issues with large file uploads? Here is the end of the failed 35 MB video upload:
Vs. the end of the small image file:
@t3chguy I know it's been a few months, but I'm still having this issue on version 1.94.0.
That isn't an Element Web/Desktop version. |
I'm referring to the Synapse version. Anyway, I disabled Cloudflare for the domain; I've never encountered an issue like this with CF before. Do you know of any conflicts that may be occurring between Element -> Cloudflare -> Synapse?
Steps to reproduce
I am attempting to upload two video files (each about 30 MB) to a room at the same time. This always fails with a generic error message popup (The file '[filename]' failed to upload.). There is no error in the console, and the Network tab of the debug window shows the fetch request as (canceled). Synapse logs don't show any errors, and neither does Openresty. Uploading the files one at a time works, but uploading them together always fails.
We've investigated the Synapse side of things and everything seems to be working -> matrix-org/synapse#15841
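The failing scenario above can be sketched as two uploads started at the same time. `uploadFile` is a hypothetical stand-in for the client's real upload call; the point is only that both requests are in flight concurrently, whereas sequential uploads reportedly succeed:

```typescript
// Start every upload at once and report each one's outcome, so one
// cancelled request doesn't hide the result of the other.
async function uploadConcurrently(
    files: string[],
    uploadFile: (name: string) => Promise<void>,
): Promise<Array<"fulfilled" | "rejected">> {
    return Promise.all(
        files.map((f) =>
            uploadFile(f).then(
                () => "fulfilled" as const,
                () => "rejected" as const, // e.g. the (CANCELED) fetch
            ),
        ),
    );
}
```

Running the two ~30 MB videos through something like this, versus awaiting each upload in turn, is the distinction the report describes.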
Outcome
What did you expect?
Files are uploaded.
What happened instead?
Element said the upload failed, and the request in the network console showed (CANCELED).
Operating system
Ubuntu
Application version
1.11.34
How did you install the app?
Repository
Homeserver
matrix.evulid.cc
Will you send logs?
Yes, but I didn't find any relevant logs during my investigation.