Load HDF files with odc.stac.load #114
@muelj12 please share the full stack trace and the version of odc-stac you are using. If you can, please share the STAC item JSON as well. This is probably a dupe of #107, but there could also be other issues at play.
@Kirill888
And here is the full stack trace:

I also tried this call. I know it is a special use case, but we are facing storage problems and don't want to create COG TIFFs from the HDF file right now.

```json
{
"id": "MOD13Q1.A2022145.h35v10.061.2022168103104",
"bbox": [
172.479315908761,
-19.180676222654,
-179.856407114504,
-9.97534112170732
],
"type": "Feature",
"links": [
{
"rel": "collection",
"type": "application/json",
"href": "-MyURL-/api/collections/modis-13q1-061"
},
{
"rel": "parent",
"type": "application/json",
"href": "-MyURL-/api/collections/modis-13q1-061"
},
{
"rel": "root",
"type": "application/json",
"href": "-MyURL-/api/"
},
{
"rel": "self",
"type": "application/geo+json",
"href": "-MyURL-/api/collections/modis-13q1-061/items/MOD13Q1.A2022145.h35v10.061.2022168103104"
}
],
"assets": {
"hdf": {
"href": "-MyFilepath-/MOD13Q1.A2022145.h35v10.061.2022168103104.hdf",
"type": "application/x-hdf",
"roles": [
"data"
],
"title": "Source data containing all bands"
},
"metadata": {
"href": "-MyFilepath-/MOD13Q1.A2022145.h35v10.061.2022168103104.hdf.xml",
"type": "application/xml",
"roles": [
"metadata"
],
"title": "Federal Geographic Data Committee (FGDC) Metadata"
}
},
"geometry": {
"type": "MultiPolygon",
"coordinates": [
[
[
[
180,
-19.18040533808951
],
[
180,
-10.007120826311045
],
[
179.999502554976,
-9.97534112170732
],
[
172.622770159185,
-9.98364248250805
],
[
172.479315908761,
-19.1662177463598
],
[
180,
-19.18040533808951
]
]
],
[
[
[
-180,
-10.007120826311045
],
[
-180,
-19.18040533808951
],
[
-179.856407114504,
-19.180676222654
],
[
-180,
-10.007120826311045
]
]
]
]
},
"collection": "modis-13q1-061",
"properties": {
"created": "2023-05-03T02:02:50.232600Z",
"updated": "2022-06-17T09:38:16.263000Z",
"datetime": null,
"platform": "terra",
"instruments": [
"modis"
],
"end_datetime": "2022-06-09T23:59:59Z",
"modis:tile-id": "51035010",
"start_datetime": "2022-05-25T00:00:00Z",
"modis:vertical-tile": 10,
"modis:horizontal-tile": 35
},
"stac_extensions": [],
"stac_version": "1.0.0"
}
```
This change is not in any release yet.

EDIT: nope… separate issue.
- Add pass-list of non-image media types
- Extend extension list
@muelj12 #116 fixes detection of raster assets for HDF-like sources. Would you be able to try this code and report the next place it breaks? I suspect that your asset definition will need to be expanded to include subdataset information; I'm not sure how this is meant to be done in STAC. Most likely the data loading code will also need to be tweaked to construct an appropriate URI for GDAL when the input has subdatasets.
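For context, GDAL addresses individual HDF4-EOS layers through composite URIs rather than the plain file path, which is what the loading code would eventually have to synthesize. A minimal sketch of such a URI for this product; the local path, grid name and field name follow the conventional MOD13Q1 layout and are assumptions, not taken from this thread:

```python
import rasterio

# Hypothetical local path; grid/field names below are the conventional
# MOD13Q1 ones and are assumptions, not confirmed by this thread.
path = "MOD13Q1.A2022145.h35v10.061.2022168103104.hdf"

# GDAL HDF4-EOS subdataset URIs have the form
#   HDF4_EOS:EOS_GRID:"<file>":<grid>:<field>
uri = f'HDF4_EOS:EOS_GRID:"{path}":MODIS_Grid_16DAY_250m_500m_VI:"250m 16 days NDVI"'

with rasterio.open(uri) as src:
    print(src.crs, src.res)  # MODIS sinusoidal CRS, ~250 m pixels
    ndvi = src.read(1)       # raw digital numbers; scale factor not applied
```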
@Kirill888 how can I install an odc-stac version that includes the fix?
@muelj12 This should work:
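A typical way to try an unreleased fix is to install straight from the repository; the exact ref to use is an assumption here, not preserved from the thread:

```bash
# Illustrative only; adjust the branch/ref to whatever was suggested.
pip install git+https://github.com/opendatacube/odc-stac.git
```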
@Kirill888
Same error with the new version.
Just add an explicit resolution: when STAC is missing the projection extension we cannot guess an appropriate resolution without looking at the raster data file first.
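A hedged sketch of such a call; the CRS and resolution values below are illustrative assumptions, not values from this thread (MOD13Q1 is natively ~250 m in a sinusoidal projection):

```python
from odc.stac import stac_load

# stac_items: the items from the report above. CRS and resolution must be
# given explicitly because these items carry no projection metadata.
ds = stac_load(
    stac_items,
    crs="EPSG:4326",    # illustrative output CRS
    resolution=0.0025,  # roughly 250 m in degrees near the equator, illustrative
)
```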
It is also possible that there are further issues beyond that. Supporting hdf/netcdf/zarr inputs would be nice, but I'm certain that this will require some extra development work on the odc-stac side. Have you tried reading your data with GDAL/rasterio directly?
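As a sanity check, something along these lines (a sketch with an assumed local path) would confirm that GDAL can at least enumerate the HDF4 subdatasets:

```python
import rasterio

# Assumed local path, matching the item above.
with rasterio.open("MOD13Q1.A2022145.h35v10.061.2022168103104.hdf") as f:
    for name in f.subdatasets:  # one GDAL URI per layer (NDVI, EVI, QA, ...)
        print(name)
```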
Hello everyone,
I am currently trying to load HDF files from an unpublished STAC collection (so I can't show you the collection) with odc.stac.load, and I receive an error message:

```python
from odc.stac import stac_load  # import added; the original snippet omitted it

MODIS_13Q1_061_sampel = stac_load(stac_items)
```
So my question is: can STAC items that are available as HDF files be loaded with odc.stac.load at all? If yes, is there an example?