Default chunking in GeoTIFF images #2093
There is precedent for auto-aligning dask chunks with the underlying
dataset chunks. This is what we do with the `auto_chunk` argument in
`open_zarr`:
http://xarray.pydata.org/en/latest/generated/xarray.open_zarr.html#xarray.open_zarr
On Mon, Apr 30, 2018 at 12:21 PM, Matthew Rocklin wrote:
Given a tiled GeoTIFF image I'm looking for the best practice in reading
it as a chunked dataset. I did this in this notebook
<https://gist.github.com/mrocklin/3df315e93d4bdeccf76db93caca2a9bd> by
first opening the file with rasterio, looking at the block sizes, and then
using those to inform the argument to chunks= in xarray.open_rasterio.
This works, but is somewhat cumbersome because I also had to dive into the
rasterio API. Do we want to provide defaults here?
In dask.array every time this has come up we've always shot it down,
automatic chunking is error prone and hard to do well. However in these
cases the object we're being given usually also conveys its chunking in a
way that matches how dask.array thinks about it, so the extra cognitive
load on the user has been somewhat low. Rasterio's model and API feel much
more foreign to me though than a project like NetCDF or H5Py. I find myself
wanting a chunks=True or chunks='100MB' option.
Thoughts on this? Is this in-scope? If so then what is the right API and
what is the right policy for how to make xarray/dask.array chunks larger
than GeoTIFF chunks?
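For what it's worth, dask already ships chunk normalization that understands both `"auto"` and a byte limit, which is roughly the machinery a `chunks='100MB'` keyword could delegate to. A minimal sketch (the shape, dtype, and 100 MiB limit are made-up illustration values, not anything xarray currently does):

```python
import dask.array as da
from dask.array.core import normalize_chunks

# Hypothetical raster: 10000 x 10000 float64, ~800 MB in total.
shape = (10000, 10000)

# Ask dask for chunks of roughly 100 MiB each; a chunks='100MB' option in
# open_rasterio could plausibly delegate to this.
chunks = normalize_chunks("auto", shape=shape, dtype="float64", limit="100MiB")

arr = da.zeros(shape, chunks=chunks, dtype="float64")
```

Note this says nothing about aligning to the GeoTIFF's internal tiles; it only picks chunk sizes under a byte budget.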
My guess is that GeoTIFF chunks will be much smaller than is ideal for dask.array. We might want to expand those chunk sizes by some multiple.
#1440 is related but more focused on NetCDF.
Most standard internal chunking (or what I believe the GIS community calls 'tiling') is 256x256 (see http://www.gdal.org/frmt_gtiff.html: TILED=YES, BLOCKXSIZE=n, and BLOCKYSIZE=n). This is used when viewing images within a given region of interest or window. You can really tell the difference in speed between tiled and striped images (strips have a block height of one row). @mrocklin, I agree that we might want to aggregate some number of tiles, but we would need some automation up front and would have to sort out how to determine the expansion. Adding to the #1440 discussion mentioned above, there will likely be an advantage in increasing the block sizes along particular directions.
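The tiled-vs-striped speed difference is easy to quantify: a read window intersects far more blocks in a striped file, and each intersected block must be decoded in full. A back-of-the-envelope helper (the window and image sizes are hypothetical):

```python
import math

def blocks_touched(window_h, window_w, block_h, block_w):
    # How many internal blocks a window read intersects, assuming the
    # window is aligned to the block grid at the origin.
    return math.ceil(window_h / block_h) * math.ceil(window_w / block_w)

# A 512x512 window of interest in a 10000-pixel-wide image:
tiled = blocks_touched(512, 512, 256, 256)    # 256x256 tiles -> 4 blocks
striped = blocks_touched(512, 512, 1, 10000)  # 1-row strips  -> 512 blocks
```

So the same window costs 4 block reads from a tiled file versus 512 from a striped one, which is why tiling matters so much for windowed access.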
With the benefit of almost a year's worth of procrastination, I think the best approach is to take the heuristics from #1440, but only support `chunks=True` - if a decent default heuristic isn't good enough, the user can specify exact chunks. The underlying logic for this issue would be identical to that of #1440, so supporting both is "just" a matter of plumbing it in correctly.
That would definitely work for me.
On May 7, 2018 at 6:43 AM, Zac Hatfield-Dodds wrote:
With the benefit of almost a year's worth of procrastination, I think
the best approach is to take the heuristics from #1440, but only
support `chunks=True` - if a decent default heuristic isn't good
enough, the user can specify exact chunks.
The underlying logic for this issue would be identical to that of
#1440, so supporting both is "just" a matter of plumbing it in
correctly.
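One way such a `chunks=True` heuristic could look: start from the file's internal tile size and grow tile-aligned chunks one axis at a time until the next step would blow a byte budget. Everything here (the function name, the 128 MiB default) is a hypothetical illustration, not xarray's implementation:

```python
def tile_aligned_chunks(shape, tiles, itemsize, limit_bytes=128 * 2**20):
    """Hypothetical heuristic: pick dask chunks that are whole multiples
    of the file's internal tiles, growing one axis at a time until the
    next expansion would exceed limit_bytes."""
    chunks = [min(t, s) for t, s in zip(tiles, shape)]

    def nbytes(c):
        n = itemsize
        for v in c:
            n *= v
        return n

    grew = True
    while grew:
        grew = False
        for i, tile in enumerate(tiles):
            trial = list(chunks)
            trial[i] = min(shape[i], chunks[i] + tile)
            if trial[i] > chunks[i] and nbytes(trial) <= limit_bytes:
                chunks = trial
                grew = True
    return tuple(chunks)

# 4096x4096 float64 raster stored as 256x256 tiles, ~16 MiB chunk budget:
chunks = tile_aligned_chunks((4096, 4096), (256, 256),
                             itemsize=8, limit_bytes=16 * 2**20)
# -> (1536, 1280): whole multiples of the 256x256 tile, just under 16 MiB
```

Because every chunk edge is a multiple of the tile edge, each dask chunk reads a whole number of GeoTIFF tiles and no tile is decoded twice.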
One of the issues related to this has been closed. Has a default GeoTIFF chunk been implemented?
No, unfortunately.
On Jun 18, 2018 at 4:03 PM, Fabien Maussion wrote:
> Has a default GeoTIFF chunk been implemented?
No, unfortunately.
OK. Maybe the overall chunking issue has been sorted out. I will try to look into this and see what is working now in relation to this issue.