Out of memory: Killed process xxxx (gcsf) #77
While copying a large (4 GB) file from the mounted folder to my local disk I am seeing an OOM kill. It looks like gcsf is keeping all files in memory. I tried reducing the values of cache_max_seconds, cache_max_items and cache_statfs_seconds, but no luck.

Comments
That's true, and unfortunately there's no easy fix, since this issue stems from a design decision.
Ok.
I'll take a look into the source code when I get to it. On a Raspberry Pi this is a real issue, since there isn't a lot of memory available. I have a 750 Mbit fiber line, so mounting a Google Drive would make sense for storage that's not speed-critical. (I'm paying a lot annually for that 2 TB anyway, so buying more storage seems a waste of money.)
I think rclone chunks files every 10 MB to avoid this issue. Would it be possible to implement chunking in gcsf too?
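For illustration, chunked streaming of the sort described above might look like the following in Rust (gcsf's implementation language). This is only a minimal sketch of the general technique, not gcsf's actual I/O path; copy_chunked and CHUNK_SIZE are invented names, and the 10 MB chunk size mirrors the rclone behavior mentioned in the comment.

```rust
// Minimal sketch: copy from any reader to any writer through a
// fixed-size buffer, so only one chunk is resident in memory at a time
// instead of the whole file. Not gcsf's actual code.
use std::io::{self, Read, Write};

const CHUNK_SIZE: usize = 10 * 1024 * 1024; // 10 MB, as rclone reportedly uses

fn copy_chunked<R: Read, W: Write>(reader: &mut R, writer: &mut W) -> io::Result<u64> {
    let mut buf = vec![0u8; CHUNK_SIZE]; // single reusable chunk buffer
    let mut total = 0u64;
    loop {
        let n = reader.read(&mut buf)?;
        if n == 0 {
            break; // EOF reached
        }
        writer.write_all(&buf[..n])?;
        total += n as u64;
    }
    Ok(total)
}

fn main() -> io::Result<()> {
    // Example: stream stdin to stdout without buffering the full input.
    let stdin = io::stdin();
    let stdout = io::stdout();
    let copied = copy_chunked(&mut stdin.lock(), &mut stdout.lock())?;
    eprintln!("copied {} bytes", copied);
    Ok(())
}
```

With this pattern, peak memory stays at roughly one chunk regardless of file size, which is why a 4 GB copy would no longer trigger the OOM killer.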
It might be a possible solution to explore. For now, you might be able to work around it by setting a trivial cache limit.
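Assuming gcsf reads the three cache keys named in the report above from its TOML config file, a "trivial cache limit" might be expressed like this (the key names come from this thread; the low values are illustrative guesses, not recommended settings):

```toml
# Illustrative gcsf config excerpt: shrink the cache as far as possible
# so less file data is held in memory at once.
cache_max_seconds = 1
cache_max_items = 1
cache_statfs_seconds = 1
```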