JavaScript part allocates the entire file in memory at stream creation #139
Comments
This is a regression introduced in this commit. I think I ran a memory analysis and everything, but either I failed to recognize this error, or Chrome has changed the way the buffer is allocated. In any case, it's a glaring bug and should be easy to fix. Nice catch.
@jespersh Please let me know if you have the time to test and give feedback on this. I guess it's always possible to do better, since this is a tight loop. But one thing is for sure: moving away from the model that caused this bug is a 500% slow-down. Not very noticeable on small files, but quite painful to go from 1 s to 7 s for an 800 MB file.
I'll try to dig a bit into this as soon as I can, but I am wondering if you could reuse the FileReader instance. How big are your read chunks? I'd test with ~32 KB.
Finally got the time to run some tests. My experiments show that FileReader instantiation is basically free, with no impact whatsoever; it's probably cached. Chunk size has a huge impact on speed and a small impact on RAM usage. No matter the chunk size, RAM usage stays 80-150 MB above the resting level during the process. I'm testing with an 800 MB file: a chunk size of 82 KB takes 16 seconds, 330 KB takes 5 s. So my conclusion is that what is costly here is the asynchronous callback, which to my knowledge I have no way of avoiding. I could possibly implement a second level of buffering that could be configured somehow, but I'm stopping here for now in favour of other features.
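As a stop-gap for the interop-callback cost described above, a consumer could add that second level of buffering on the .NET side. This is only a rough sketch, not part of the library: it assumes `file.OpenReadAsync()` returns a plain read-only `Stream` (as in the repro below) and picks 330 KB because that chunk size performed best in the tests above.

```csharp
// Consumer-side buffering sketch (an assumption, not a library feature):
// BufferedStream makes each underlying read against the interop stream request
// up to 330 KB, no matter how small the caller's own reads are, which
// amortizes the per-callback cost measured above.
await using var interopStream = await file.OpenReadAsync();
await using var buffered = new BufferedStream(interopStream, 330 * 1024);
// Read from 'buffered' instead of 'interopStream' from here on.
```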
Describe the bug
I'm trying to create a "chunk" stream for System.Net.Http.StreamContent without the browser allocating GBs of memory for the entire file. Sending the file natively in the browser doesn't have this behavior, and neither does testing with a console application. The native test:
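The snippet for the native test isn't included above; a minimal console-app equivalent (file path and endpoint are made up for illustration) would look roughly like this:

```csharp
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

class NativeUploadTest
{
    // Rough reconstruction of the "native test": HttpClient pulls the FileStream
    // in small chunks, so memory stays flat even for a multi-GB file.
    static async Task Main()
    {
        using var client = new HttpClient();
        await using var fileStream = File.OpenRead("large-test-file.bin"); // hypothetical file
        using var content = new StreamContent(fileStream);

        var response = await client.PostAsync("https://localhost:5001/upload", content); // hypothetical endpoint
        response.EnsureSuccessStatusCode();
    }
}
```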
To Reproduce
Any of these with a multi-GB file allocates the entire file into memory (a hedged sketch of the OpenReadAsync case follows this list):
Using CreateMemoryStreamAsync
Using OpenReadAsync
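The original repro snippets aren't shown; the OpenReadAsync variant would look roughly like this (the reference/element wiring follows the usual BlazorFileReader pattern, and the upload endpoint is made up):

```csharp
// Hedged sketch of the OpenReadAsync repro, not the reporter's exact code.
var fileReference = fileReaderService.CreateReference(inputElement);
foreach (var file in await fileReference.EnumerateFilesAsync())
{
    // Expected: this only sets up the stream.
    // Observed: the browser allocates memory for the entire file right here.
    await using var stream = await file.OpenReadAsync();

    using var content = new StreamContent(stream);
    await httpClient.PostAsync("upload", content); // hypothetical endpoint
}
```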
Expected behavior
The call to ReadAsync decides how much memory is allocated.
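In other words, a consumer loop like the following (buffer size chosen arbitrarily) should keep memory usage near the buffer size rather than the file size:

```csharp
// Expected: each ReadAsync transfers at most buffer.Length bytes across interop,
// so memory usage is bounded by the buffer, not by the file.
var buffer = new byte[32 * 1024];
await using var stream = await file.OpenReadAsync();
int bytesRead;
while ((bytesRead = await stream.ReadAsync(buffer, 0, buffer.Length)) > 0)
{
    // forward 'bytesRead' bytes to the destination (e.g. the request body)
}
```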
Project type
Client-side/CSB
Environment
Browser: new Edge with Chromium
BlazorFileReader: 1.5.0.20109
.NET SDK: 3.1.301
.NET host: 3.1.5
Additional context
A possible fix could be this: https://stackoverflow.com/a/28318964