Chunks iterators #45
I am willing to help with that. How do I get started?
There haven't been any requests for this, but we can always make some API for reading a variable along a dimension. I think this needs some ideas for how best to create the iterator, and what would be most useful for the end user. Maybe you have some insight? An idea is to allow chunking along a dimension (e.g. time) and let each …
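One way to picture the per-dimension chunking suggested here is iterating a 3-D variable (time × lat × lon) one time-step at a time, so only a single 2-D slab needs to fit in memory at once. This is a minimal standalone sketch: the in-memory `Vec` stands in for data that a real implementation would read lazily from the file, and none of the names below are existing API in this crate.

```rust
fn main() {
    let (ntime, nlat, nlon) = (3, 2, 4);
    // Row-major storage, as netCDF uses: time is the slowest-varying index.
    let data: Vec<f64> = (0..ntime * nlat * nlon).map(|i| i as f64).collect();

    let slab = nlat * nlon;
    // Each chunk is one time-step; with a real file, each iteration would
    // issue one read of `slab` values instead of borrowing from `data`.
    for (t, step) in data.chunks(slab).enumerate() {
        println!("t = {t}: first value {}", step[0]);
    }
}
```

The point of the design is that the iterator drives the reads, so the caller never holds more than one slab at a time.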
Having a lazy loader both in space and time is IMO mandatory. In a lot of my own use cases the files are larger than RAM, while I only need values that are very local in the spacetime sense. Julia's NCDatasets already does this (so I know where to "steal" ideas). I want to migrate to Rust, so I will try to implement a lazy loader for myself anyway.
@krestomantsi and I have been looking at this issue. How do I build the netcdf package locally? Can you also clarify the type annotation written in this issue?
It should be as simple as cloning this repository and running …
Regarding the API, would you basically want it to be like a slicing function, so that you say from lat X => Y and long X => Y and it returns the matrix containing these values as a view? Or is it supposed to copy all the values into a matrix of the correct shape?
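To make the "copy into a matrix of the correct shape" option concrete, here is a hedged sketch that extracts a lat/lon sub-rectangle from a row-major 2-D grid. The function name, parameters, and bounds are illustrative only, not existing API in this crate:

```rust
// Copy a lat/lon sub-rectangle out of a row-major grid with `nlon` columns.
// `lat` and `lon` are half-open index ranges into the grid.
fn slice_copy(
    grid: &[f64],
    nlon: usize,
    lat: std::ops::Range<usize>,
    lon: std::ops::Range<usize>,
) -> Vec<f64> {
    let mut out = Vec::with_capacity(lat.len() * lon.len());
    for i in lat {
        // One contiguous row segment per latitude index.
        out.extend_from_slice(&grid[i * nlon + lon.start..i * nlon + lon.end]);
    }
    out
}

fn main() {
    let nlon = 4;
    let grid: Vec<f64> = (0..8).map(f64::from).collect(); // a 2 × 4 grid
    let sub = slice_copy(&grid, nlon, 0..2, 1..3);
    println!("{:?}", sub); // the 2 × 2 sub-rectangle, flattened
}
```

The view variant would instead return something borrowing the underlying buffer (or a lazily-read window of the file), which avoids the copy but ties the result's lifetime to the source.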
You will also need to pull the submodule by …
When reading values from a variable, it should be possible to get a lazy-loading iterator over chunks.
Implementation details:
New method (pseudocode):

```rust
fn values_chunked(start, chunklength, &mut buffer) -> ChunkIterator { }

struct ChunkIterator {
    start: (),
    buflen: (),
    buffer: (),
}
```
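The pseudocode above can be fleshed out into a compilable sketch. `Variable` here is a hypothetical stand-in for the crate's variable type, wrapping an in-memory `Vec<f64>` so the example runs without a real file; a real implementation would read each chunk from disk inside `next` instead of borrowing a slice:

```rust
// Stand-in for a netCDF variable; in practice this would hold a file handle.
struct Variable {
    data: Vec<f64>,
}

impl Variable {
    // Hypothetical counterpart to the proposed `values_chunked`: yields
    // successive chunks of up to `chunklen` values starting at `start`.
    fn values_chunked(&self, start: usize, chunklen: usize) -> ChunkIterator<'_> {
        ChunkIterator { var: self, pos: start, chunklen }
    }
}

struct ChunkIterator<'a> {
    var: &'a Variable,
    pos: usize,
    chunklen: usize,
}

impl<'a> Iterator for ChunkIterator<'a> {
    type Item = &'a [f64];

    fn next(&mut self) -> Option<Self::Item> {
        if self.pos >= self.var.data.len() {
            return None;
        }
        // Final chunk may be shorter than `chunklen`.
        let end = (self.pos + self.chunklen).min(self.var.data.len());
        let chunk = &self.var.data[self.pos..end];
        self.pos = end;
        Some(chunk)
    }
}

fn main() {
    let var = Variable { data: (0..10).map(f64::from).collect() };
    let lens: Vec<usize> = var.values_chunked(0, 4).map(|c| c.len()).collect();
    println!("{:?}", lens); // chunk lengths: 4, 4, then the 2 leftover values
}
```

A `&mut buffer` parameter, as in the issue's signature, would let the iterator reuse one allocation across reads; that variant trades the borrowed-slice item type for a caller-owned buffer refilled on each `next`.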