Too many open files #11
This is a general issue with lazily reading large numbers of files. It also happens with xarray on netCDF files (pydata/xarray#463). The root of the problem is that your system limits the number of files a process may have open at once. You can raise these "ulimits", which is probably the best option for you:
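A minimal sketch of inspecting and raising that limit from within Python via the standard `resource` module (the shell equivalent is `ulimit -n`); the behavior below applies to Linux/macOS:

```python
import resource

# Query the current per-process open-file limits.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"current limits: soft={soft}, hard={hard}")

# Raise the soft limit up to the hard limit. Raising the hard limit
# itself generally requires elevated privileges.
resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))
```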
Hi Ryan, thanks for the feedback. Silly as it sounds, I didn't know about ulimits. I was guessing that a Python fix would not be trivial, and you've confirmed it.
Several people are hitting this issue, so I want to leave it open. Some proposed developments in xarray might allow us to avoid opening the files until the data is requested.
Since #25, the data are loaded using …
Hi,
I am getting this issue with memmaps in a long simulation (several .data files).
The issue is known (see e.g. this question), but I don't see how to properly close the memmapped files, as I don't yet understand how the memmapping interfaces with xarray.
On the practical side, I am able to work around it by setting
use_mmap=False
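A minimal sketch of that workaround, assuming (as the comment above implies) that the use_mmap keyword can be passed to xmitgcm's open_mdsdataset; the run directory path is a placeholder, and if your version does not accept the keyword there, it may only be exposed by a lower-level reader:

```python
import xmitgcm

ds = xmitgcm.open_mdsdataset(
    "./run",          # hypothetical directory containing the .data files
    use_mmap=False,   # read each file eagerly instead of memory-mapping it
)
print(ds)
```

Reading eagerly trades memory for file handles: each file is read and closed immediately rather than kept open behind a memmap, which avoids exhausting the open-file limit at the cost of loading data up front.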