Hello,

When using `imread`, I get an image with only one chunk, even though the `.tif` image I'm reading has chunks. This can lead to very high memory usage, as we sometimes work on images of sizes up to 2 TB (as mentioned in these issues: 1, 2).

To reproduce the examples below, you can download one of these large `.tif` files. With this image of size 20 GB, even accessing a small portion of the data requires loading the full image into memory with `dask_image`, while `rioxarray` requires 40x less RAM and is 4500x faster (see the examples below).

Case 1: using `dask_image`

Max RAM usage: `34.244 GB`
Time: `22.4 s ± 247 ms per loop (mean ± std. dev. of 7 runs, 1 loop each)`

Case 2: using `rioxarray`

Max RAM usage: `0.765 GB`
Time: `5.04 ms ± 140 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)`
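A minimal sketch of the two access patterns measured above; the file name `image.tif`, the slice bounds, and the chunk mapping are illustrative placeholders rather than the original benchmark code:

```python
import dask_image.imread
import rioxarray

# Case 1: dask_image.imread returns the image as a single chunk, so even a
# small slice forces the whole file into memory when computed.
arr = dask_image.imread.imread("image.tif")  # placeholder path
print(arr.chunks)  # one chunk covering the full frame
small = arr[0, :256, :256].compute()

# Case 2: rioxarray can load the same file as a dask-backed DataArray with a
# requested chunk layout, so the same slice only reads the tiles it touches.
xarr = rioxarray.open_rasterio(
    "image.tif", chunks={"band": 1, "y": 1024, "x": 1024}  # illustrative chunking
)
print(xarr.chunks)
small = xarr.isel(band=0, y=slice(0, 256), x=slice(0, 256)).compute()
```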
From your code it seems that your file is chunked with chunk sizes of `(1, 1024, 1024)`; however, `dask_image.imread` only supports chunking along the first axis for now.
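For reference, one way to confirm a file's internal tiling is with `tifffile` (the file name below is a placeholder):

```python
import tifffile

# Inspect the TIFF's internal layout without loading pixel data.
with tifffile.TiffFile("image.tif") as tif:
    page = tif.pages[0]
    print(page.shape)                       # shape of the first page
    print(page.tilewidth, page.tilelength)  # internal tile size (0 if striped)
```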
Thanks for posting the issue you ran into; we should probably document this better. Since there are many readers that return properly chunked dask arrays (and loading strongly depends on the file format and reader), this functionality currently has rather low priority in dask-image.
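One such route, sketched here as a possible workaround rather than an official dask-image path, is `tifffile`'s zarr interface, which exposes a tiled TIFF as a zarr store that dask can wrap while preserving the file's chunking (exact behavior may vary with tifffile/zarr versions; the file name is again a placeholder):

```python
import dask.array as da
import tifffile
import zarr

# Open the TIFF as a zarr store; no pixel data is read eagerly, and the
# resulting dask array inherits the file's internal tile layout.
store = tifffile.imread("image.tif", aszarr=True)
arr = da.from_zarr(zarr.open(store, mode="r"))
print(arr.chunks)  # should mirror the TIFF tiling rather than one big chunk

# Only the tiles overlapping this slice are read from disk.
small = arr[..., :256, :256].compute()
```

Going through the zarr store keeps reads lazy at tile granularity, which is what makes small-slice access cheap on very large images.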