Description
What is your issue?
I recently tried to use xarray to open some netCDF files stored in a bucket, and was surprised how hard it was to figure out the right incantation to make this work.
The fact that passing an fsspec URL (like `"s3://bucket/path/data.zarr"`) to `open_dataset` "just works" for zarr is a little misleading, since it makes you think you could do something similar for other types of files. However, this doesn't work for netCDF, GRIB, and I assume most others.
That said, `h5netcdf` does work if you pass an fsspec file-like object (I'm not sure whether other engines support this as well). But to add to the confusion, you can't pass the `fsspec.OpenFile` you get from `fsspec.open`; you have to pass a concrete type like `S3File`, `GCSFile`, etc.:
```python
>>> import xarray as xr
>>> import fsspec
>>> url = "s3://noaa-nwm-retrospective-2-1-pds/model_output/1979/197902010100.CHRTOUT_DOMAIN1.comp"  # a netCDF file in S3
```
You can't use the URL as a string directly:
```python
>>> xr.open_dataset(url, engine='h5netcdf')
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
...
FileNotFoundError: [Errno 2] Unable to open file (unable to open file: name = 's3://noaa-nwm-retrospective-2-1-pds/model_output/1979/197902010100.CHRTOUT_DOMAIN1.comp', errno = 2, error message = 'No such file or directory', flags = 0, o_flags = 0)
```
Ok, what about `fsspec.open`?
```python
>>> f = fsspec.open(url)
... f
<OpenFile 'noaa-nwm-retrospective-2-1-pds/model_output/1979/197902010100.CHRTOUT_DOMAIN1.comp'>
```
```python
>>> xr.open_dataset(f, engine='h5netcdf')
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
...
File ~/miniconda3/envs/xarray-buckets/lib/python3.10/site-packages/xarray/backends/common.py:23, in _normalize_path(path)
     21 def _normalize_path(path):
     22     if isinstance(path, os.PathLike):
---> 23         path = os.fspath(path)
     25     if isinstance(path, str) and not is_remote_uri(path):
     26         path = os.path.abspath(os.path.expanduser(path))

File ~/miniconda3/envs/xarray-buckets/lib/python3.10/site-packages/fsspec/core.py:98, in OpenFile.__fspath__(self)
     96 def __fspath__(self):
     97     # may raise if cannot be resolved to local file
---> 98     return self.open().__fspath__()

AttributeError: 'S3File' object has no attribute '__fspath__'
```
But if you somehow know that an `fsspec.OpenFile` isn't actually a file-like object, and you double-open it, then it works! (xref #5879 (comment))
```python
>>> s3f = f.open()
... s3f
<File-like object S3FileSystem, noaa-nwm-retrospective-2-1-pds/model_output/1979/197902010100.CHRTOUT_DOMAIN1.comp>
>>> xr.open_dataset(s3f, engine='h5netcdf')
<xarray.Dataset>
Dimensions:         (time: 1, reference_time: 1, feature_id: 2776738)
Coordinates:
  * time            (time) datetime64[ns] 1979-02-01T01:00:00
  * reference_time  (reference_time) datetime64[ns] 1979-02-01
  * feature_id      (feature_id) int32 101 179 181 ... 1180001803 1180001804
    latitude        (feature_id) float32 ...
    longitude       (feature_id) float32 ...
...
```
(And even then, you have to know to use the `h5netcdf` engine, and not `netcdf4` or `scipy`.)
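Putting the pieces together, the full incantation that eventually worked for me (consolidated from the steps above) is:

```python
import fsspec
import xarray as xr

url = "s3://noaa-nwm-retrospective-2-1-pds/model_output/1979/197902010100.CHRTOUT_DOMAIN1.comp"

# fsspec.open() returns an OpenFile; calling .open() on that returns the
# concrete S3File, which is what the h5netcdf engine can actually read.
s3f = fsspec.open(url).open()
ds = xr.open_dataset(s3f, engine="h5netcdf")
```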
Some things that might be nice:
- Explicit documentation on working with data in cloud storage, perhaps broken down by file type/engine (xref improve docs on zarr + cloud storage #2712). It might be nice to have a table/quick reference of which engines support reading from cloud storage, and how to pass in the URL (string? fsspec file object?)
- An informative error linking to these docs when opening fails and `is_remote_uri(filename_or_obj)` is true.
- Either make `fsspec.OpenFile` objects work, so you don't have to do the double-open, or raise an informative error when one is passed in telling you what to do instead (a rough sketch of this follows below).
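For the last point, here is a rough sketch (not xarray's actual code; the helper name is hypothetical) of what handling `fsspec.OpenFile` in the backend machinery could look like:

```python
def _handle_fsspec_openfile(filename_or_obj):
    """Hypothetical helper: unwrap or reject fsspec.OpenFile objects."""
    try:
        from fsspec.core import OpenFile
    except ImportError:
        return filename_or_obj

    if isinstance(filename_or_obj, OpenFile):
        # Option A: unwrap to the concrete file-like object (S3File, GCSFile, ...)
        return filename_or_obj.open()
        # Option B: raise an informative error instead, e.g.
        # raise TypeError(
        #     "Got an fsspec.OpenFile; call .open() on it and pass the "
        #     "resulting file object (see the cloud-storage docs)."
        # )
    return filename_or_obj
```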
As more and more data becomes available in cloud storage, newcomers to xarray will increasingly be looking to use it with remote data. Xarray already supports this in some cases, which is great! With a few tweaks to the docs and error messages, I think we could turn what took me multiple hours of debugging and reading the source into an easy 30-second experience for new users.