Releases: xcube-dev/xcube
1.4.0
Enhancements
- Added a new `reference` filesystem data store to support "kerchunked"
  NetCDF files in object storage. (#928) See the usage sketch after this list.
- Improved xcube Server's STAC API:
  - Provide links for multiple coverages data formats
  - Add `crs` and `crs_storage` properties to STAC data
  - Add spatial and temporal grid data to collection descriptions
  - Add a schema endpoint returning a JSON schema of a dataset's data variables
  - Add links to domain set, range type, and range schema to collection
    descriptions
- Improved xcube Server's Coverages API:
  - Support scaling parameters `scale-factor`, `scale-axes`, and `scale-size`
  - Improve handling of bbox parameters
  - Handle half-open datetime intervals
  - More robust and standard-compliant parameter parsing and checking
  - More informative responses for incorrect or unsupported parameters
  - Omit unnecessary dimensions in TIFF and PNG coverages
  - Use `crs_wkt` when determining CRS, if present and needed
  - Change default subsetting and bbox CRS from EPSG:4326 to OGC:CRS84
  - Implement reprojection for bbox
  - Ensure datetime parameters match dataset's timezone awareness
  - Reimplement subsetting (better standards conformance, cleaner code)
  - Set Content-Bbox and Content-Crs headers in the HTTP response
  - Support safe CURIE syntax for CRS specification
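For the new `reference` data store, here is a minimal usage sketch. The parameter names (`refs`, `remote_options`) and the reference file path are assumptions based on common kerchunk/fsspec reference-filesystem conventions; consult the store's open parameters for the exact spelling:

    from xcube.core.store import new_data_store

    # Assumed parameters: a kerchunk reference JSON describing NetCDF files
    # that live in object storage (path and option names are illustrative).
    store = new_data_store(
        "reference",
        refs=["s3://my-bucket/my-dataset.json"],
        remote_options={"anon": True},
    )

    data_id = store.list_data_ids()[0]
    dataset = store.open_data(data_id)
    print(dataset)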
Fixes
- Fixed `KeyError: 'lon_bnds'` raised occasionally when opening
  (mostly NetCDF) datasets. (#930)
- Make S3 unit tests compatible with moto 5 server. (#922)
- Make some CLI unit tests compatible with pytest 8. (#922)
- Rename some test classes to avoid spurious warnings. (#924)
Other changes
- Require Python >=3.9 (previously >=3.8)
Full Changelog: v1.3.1...v1.4.0
1.3.1
1.3.0
Changes in 1.3.0
Enhancements
- Added a basic implementation of the draft version of OGC API - Coverages.
  (#879, #889, #900)
- Adapted the STAC implementation to additionally offer datasets as
  individual collections for better integration with OGC API - Coverages. (#889)
- Various minor improvements to STAC implementation. (#900)
Fixes
- Resolved a CRS84 error caused by the latest version of GDAL. (#869)
- Fixed incorrect additional variable data in STAC datacube properties. (#889)
- Fixed access to GeoTIFF datasets from public S3 buckets. (#893)
Other changes
- `update_dataset_attrs` can now also handle datasets with a CRS other than
  WGS84 and updates the metadata according to the
  ESIP Attribute Convention for Data Discovery.
- Removed the `xcube edit` module, which had been deprecated since
  version 0.13.0.
- Updated the "Development process" section of the developer guide.
- Updated the GitHub workflow to build the Docker image for GitHub releases
  only and not on each commit to master.
1.2.0
- Added a new, experimental `/compute` API to xcube server.
  It comprises the following endpoints:
  - `GET compute/operations` - List available operations.
  - `GET compute/operations/{opId}` - Get details of a given operation.
  - `PUT compute/jobs` - Start a new job that executes an operation.
  - `GET compute/jobs` - Get all jobs.
  - `GET compute/jobs/{jobId}` - Get details of a given job.
  - `DELETE compute/jobs/{jobId}` - Cancel a given job.

  The available operations are currently taken from module
  `xcube.webapi.compute.operations`.

  To disable the new API, use the following server configuration:

      api_spec:
        excludes: ["compute"]
      ...
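For illustration, a minimal sketch of querying the new API over HTTP; the server address is an assumption and the response is simply printed as JSON:

    import requests

    # Assumed local xcube server address; adjust to your deployment.
    base_url = "http://localhost:8080"

    # List the operations exposed by the experimental compute API.
    response = requests.get(f"{base_url}/compute/operations")
    response.raise_for_status()
    print(response.json())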
Other changes
- Added the `shutdown_on_close=True` parameter to the coiled parameters to
  ensure that the clusters are shut down on close. (#881)
- Introduced a new parameter `region` for the utility function `new_cluster`
  in `xcube.util.dask`, which ensures that coiled creates the Dask cluster in
  the preferred default region, eu-central-1. (#882) See the sketch after
  this list.
- Server offers the function `add_place_group` in `places/context.py`,
  which allows plugins to add place groups from external sources.
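A minimal sketch of the new `region` parameter of `new_cluster` (a configured Coiled account is assumed; all other keyword arguments are omitted):

    from xcube.util.dask import new_cluster

    # "eu-central-1" is now the preferred default region, so passing it
    # explicitly is only needed to make the choice visible or to override it.
    cluster = new_cluster(region="eu-central-1")
    print(cluster)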
1.1.2
Changes in 1.1.2
Fixes
- Fixed an issue where GeoTIFF access from a protected S3 bucket was denied. (#863)
Changes in 1.1.1
- Bundled a new build of xcube-viewer 1.1.0.1 that correctly respects a given
  xcube server loaded from the viewer configuration.
Changes in 1.1.0
Enhancements
- Bundled xcube-viewer 1.1.0.
- Updated installation instructions. (#859)
- Included support for the FTP filesystem by adding a new data store `ftp`.
  This enables access to data cubes (`.zarr` or `.levels`) in FTP storage,
  as shown here:

      store = new_data_store(
          "ftp",                      # FTP filesystem protocol
          root="path/to/files",       # Path on the FTP server
          storage_options={
              'host': 'ftp.xxx',      # The URL of the FTP server
              'port': 21,             # Port, defaults to 21
              # Optionally, use
              # 'username': 'xxx',
              # 'password': 'xxx',
          },
      )
      store.list_data_ids()

  Note that there is no anon parameter, as the store assumes no anonymity
  if no username and password are set.

  The same configuration for xcube Server:

      DataStores:
        - Identifier: siec
          StoreId: ftp
          StoreParams:
            root: my_path_on_the_host
            max_depth: 1
            storage_options:
              host: "ftp.xxx"
              port: xxx
              username: "xxx"
              password: "xxx"
- Updated xcube Dataset Specification. (addressing #844)
- Added xcube Data Access documentation.
Fixes
- Fixed various issues with the auto-generated Python API documentation.
- Fixed a problem where time series requests may have missed outer values
  of a requested time range. (#860)
  - Introduced the query parameter `tolerance` for the endpoint
    `/timeseries/{datasetId}/{varName}`, which is the number of seconds by
    which the given time range is expanded. Its default value is one second,
    to overcome rounding problems with microsecond fractions. (#860)
  - We now round the time dimension labels for a dataset as follows
    (rounding frequency is 1 second by default):
    - First time stamp: floor(time[0])
    - Last time stamp: ceil(time[-1])
    - In-between time stamps: round(time[1:-1])
Other changes
- Pinned the `gdal` dependency to `>=3.0, <3.6.3` due to incompatibilities.
1.1.1
Changes in 1.1.1
- Bundled a new build of xcube-viewer 1.1.0 that correctly respects a given xcube server loaded from the viewer configuration.
Full Changelog: v1.1.0...v1.1.1
Changes in 1.1.0
Enhancements
- Bundled xcube-viewer 1.1.0.
- Updated installation instructions. (#859)
- Included support for the FTP filesystem by adding a new data store `ftp`.
  This enables access to data cubes (`.zarr` or `.levels`) in FTP storage,
  as shown here:

      store = new_data_store(
          "ftp",                      # FTP filesystem protocol
          root="path/to/files",       # Path on the FTP server
          storage_options={
              'host': 'ftp.xxx',      # The URL of the FTP server
              'port': 21,             # Port, defaults to 21
              # Optionally, use
              # 'username': 'xxx',
              # 'password': 'xxx',
          },
      )
      store.list_data_ids()

  Note that there is no anon parameter, as the store assumes no anonymity
  if no username and password are set.

  The same configuration for xcube Server:

      DataStores:
        - Identifier: siec
          StoreId: ftp
          StoreParams:
            root: my_path_on_the_host
            max_depth: 1
            storage_options:
              host: "ftp.xxx"
              port: xxx
              username: "xxx"
              password: "xxx"
- Updated xcube Dataset Specification. (addressing #844)
- Added xcube Data Access documentation.
Fixes
- Fixed various issues with the auto-generated Python API documentation.
- Fixed a problem where time series requests may have missed outer values
  of a requested time range. (#860)
  - Introduced the query parameter `tolerance` for the endpoint
    `/timeseries/{datasetId}/{varName}`, which is the number of seconds by
    which the given time range is expanded. Its default value is one second,
    to overcome rounding problems with microsecond fractions. (#860)
  - We now round the time dimension labels for a dataset as follows
    (rounding frequency is 1 second by default):
    - First time stamp: floor(time[0])
    - Last time stamp: ceil(time[-1])
    - In-between time stamps: round(time[1:-1])
Other changes
- Pinned the `gdal` dependency to `>=3.0, <3.6.3` due to incompatibilities.
Full Changelog: v1.0.5...v1.1.0
1.1.0
Changes in 1.1.0
Enhancements
- Bundled xcube-viewer 1.1.0.
- Updated installation instructions. (#859)
- Included support for the FTP filesystem by adding a new data store `ftp`.
  This enables access to data cubes (`.zarr` or `.levels`) in FTP storage,
  as shown here:

      store = new_data_store(
          "ftp",                      # FTP filesystem protocol
          root="path/to/files",       # Path on the FTP server
          storage_options={
              'host': 'ftp.xxx',      # The URL of the FTP server
              'port': 21,             # Port, defaults to 21
              # Optionally, use
              # 'username': 'xxx',
              # 'password': 'xxx',
          },
      )
      store.list_data_ids()

  Note that there is no anon parameter, as the store assumes no anonymity
  if no username and password are set.

  The same configuration for xcube Server:

      DataStores:
        - Identifier: siec
          StoreId: ftp
          StoreParams:
            root: my_path_on_the_host
            max_depth: 1
            storage_options:
              host: "ftp.xxx"
              port: xxx
              username: "xxx"
              password: "xxx"
- Updated xcube Dataset Specification. (addressing #844)
- Added xcube Data Access documentation.
Fixes
- Fixed various issues with the auto-generated Python API documentation.
- Fixed a problem where time series requests may have missed outer values
  of a requested time range. (#860)
  - Introduced the query parameter `tolerance` for the endpoint
    `/timeseries/{datasetId}/{varName}`, which is the number of seconds by
    which the given time range is expanded. Its default value is one second,
    to overcome rounding problems with microsecond fractions. (#860)
  - We now round the time dimension labels for a dataset as follows
    (rounding frequency is 1 second by default; a sketch follows this list):
    - First time stamp: floor(time[0])
    - Last time stamp: ceil(time[-1])
    - In-between time stamps: round(time[1:-1])
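A minimal sketch of the rounding scheme described above, written with pandas; the function is illustrative and not the server's actual implementation:

    import pandas as pd

    def round_time_labels(times: pd.DatetimeIndex, freq: str = "1s") -> pd.DatetimeIndex:
        # Floor the first label, ceil the last, and round the ones in between,
        # so the rounded labels still cover the original time range.
        rounded = times.round(freq)
        return pd.DatetimeIndex(
            [times[0].floor(freq), *rounded[1:-1], times[-1].ceil(freq)]
        )

    times = pd.to_datetime([
        "2020-01-01T00:00:00.400",
        "2020-01-01T06:00:00.600",
        "2020-01-01T12:00:00.700",
    ])
    print(round_time_labels(times))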
Other changes
- Pinned the `gdal` dependency to `>=3.0, <3.6.3` due to incompatibilities.
Full Changelog: v1.0.5...v1.1.0
1.0.6.dev1
Changes in 1.0.6 (in development)
- Bundled xcube-viewer 1.1.0-dev.1.
1.0.5
Changes in 1.0.5
- When running xcube in JupyterLab, the class `xcube.webapi.viewer.Viewer`
  can be used to programmatically launch an xcube Viewer UI. The class now
  recognizes an environment variable `XCUBE_JUPYTER_LAB_URL` that contains
  a JupyterLab's public base URL for a given user. To work properly, the
  jupyter-server-proxy extension must be installed and enabled. See the
  sketch after this list.
- Bundled xcube-viewer 1.0.2.1.
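A minimal sketch of launching the Viewer from a notebook cell; the `add_dataset` and `show` calls reflect typical usage and should be checked against the class documentation:

    import xarray as xr
    from xcube.webapi.viewer import Viewer

    # Any gridded dataset will do; a local Zarr cube is used for illustration.
    dataset = xr.open_zarr("my_cube.zarr")

    viewer = Viewer()            # picks up XCUBE_JUPYTER_LAB_URL if set
    viewer.add_dataset(dataset)  # register the dataset with the embedded server
    viewer.show()                # display the Viewer UI in the notebook output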
1.0.4
Changes in 1.0.4
- Setting a dataset's `BoundingBox` in the server configuration
  is now recognised when requesting the dataset details. (#845)
- It is now possible to enforce the order of variables reported by
  xcube server. The new server configuration key `Variables` can be added
  to `Datasets` configurations. It is a list of wildcard patterns that
  determines the order of variables and the subset of variables to be
  reported. (#835) See the example configuration after this list.
- Pinned the Pandas dependency to lower than 2.0 because of incompatibility
  with both xarray and xcube (see pydata/xarray#7716).
  Therefore, the following xcube deprecations have been introduced:
  - The optional `--base`/`-b` option of the `xcube resample` CLI tool.
  - The keyword argument `base` of the
    `xcube.core.resample.resample_in_time` function.
- Bundled xcube-viewer 1.0.2.
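An illustrative server configuration using the new `Variables` key; the dataset identifier, path, and patterns are made up for the example:

    Datasets:
      - Identifier: my_cube
        Path: my_cube.zarr
        # Wildcard patterns determine both the order and the subset of the
        # variables reported: "chl_*" variables first, then everything else.
        Variables:
          - "chl_*"
          - "*"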
Full Changelog: v1.0.3...v1.0.4