Merge pull request #8 from ASFHyP3/its-live
Add ITS_LIVE dataset
jhkennedy authored Jan 23, 2024
2 parents 4bd9ba2 + 73c0a87 commit 580f1a7
Showing 6 changed files with 111 additions and 8 deletions.
1 change: 0 additions & 1 deletion .github/workflows/deploy-asf-event-data-files.yml
@@ -6,7 +6,6 @@ on:
- asf-event-data/README.md
- shared/index.html


jobs:
deploy-asf-event-data-files:
runs-on: ubuntu-latest
45 changes: 45 additions & 0 deletions .github/workflows/deploy-its-live-data-files.yml
@@ -0,0 +1,45 @@
on:
push:
branches:
- main
paths:
- its-live-data/README.md
- shared/index.html

jobs:
deploy-its-live-data-files:
runs-on: ubuntu-latest
environment:
name: its-live-data
url: https://its-live-data.s3.us-west-2.amazonaws.com/README.html

steps:
- uses: actions/checkout@v4

# To test, run:
# docker run -it --rm -v ${PWD}:/github/workspace \
# -e INPUT_INPUT_PATH=its-live-data/README.md \
# -e INPUT_OUTPUT_DIR=its-live-data/ \
# -e INPUT_BUILD_PDF=false \
# ghcr.io/baileyjm02/markdown-to-pdf/markdown-to-pdf:latest
- name: Create its-live-data/README.html
uses: baileyjm02/markdown-to-pdf@v1
with:
input_path: its-live-data/README.md
output_dir: its-live-data/
build_pdf: false

- name: Create its-live-data/index.html
run: |
sed "165s/\[OPENDATA_BUCKET_NAME\]/its-live-data/" shared/index.html > its-live-data/index.html
- uses: aws-actions/configure-aws-credentials@v4
with:
aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
aws-region: ${{ secrets.AWS_REGION }}

- name: Copy README and Index to S3
run: |
aws s3 cp ./its-live-data/README.html s3://its-live-data/README.html
aws s3 cp ./its-live-data/index.html s3://its-live-data/index.html
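The `sed` templating step in the workflow above can be tried locally. This is a minimal sketch using a hypothetical one-line stand-in for `shared/index.html`, so the workflow's `165s/.../` line address is replaced with a plain `s///`:

```shell
# Local sketch of the workflow's templating step: substitute the
# [OPENDATA_BUCKET_NAME] placeholder with the dataset's bucket name.
# (index.in.html is a one-line sample, not the real shared/index.html.)
printf '<title>[OPENDATA_BUCKET_NAME]</title>\n' > index.in.html
sed "s/\[OPENDATA_BUCKET_NAME\]/its-live-data/" index.in.html > index.out.html
cat index.out.html   # <title>its-live-data</title>
```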
2 changes: 1 addition & 1 deletion .gitignore
@@ -3,7 +3,7 @@ asf-event-data/*.html
glo-30-hand/*.html
glo-30-hand/*.vrt
glo-30-hand/*.tif

its-live-data/*.html

# Created by https://www.toptal.com/developers/gitignore/api/python,vim,jetbrains,jupyternotebook
# Edit at https://www.toptal.com/developers/gitignore?templates=python,vim,jetbrains,jupyternotebook
16 changes: 14 additions & 2 deletions README.md
@@ -2,6 +2,8 @@

This is a repository to keep a record of and track how we provide and manage data on AWS OpenData.

Below is a list of the datasets we currently maintain. To add a new dataset, see: [add-a-new-dataset.md](docs/add-a-new-dataset.md).

## Datasets

### [Disaster Events](asf-event-data)
@@ -16,14 +18,24 @@ On a merge to main, [deploy-asf-event-data-files.yml](.github/workflows/deploy-a
* `asf-event-data/README.html`, which is automatically created from [`asf-event-data/README.md`](asf-event-data/README.md)
* `asf-event-data/index.html`, which is automatically created from [`shared/index.html`](shared/index.html)

### HAND
### [HAND](glo-30-hand)

Height Above Nearest Drainage (HAND) is a terrain model that normalizes topography to the relative heights along the drainage network and is used to describe the relative soil gravitational potentials or the local drainage potentials. Each pixel value represents the vertical distance to the nearest drainage. The HAND data provides near-worldwide land coverage at 30 meters and was produced from the 2021 release of the Copernicus GLO-30 Public DEM as distributed in the Registry of Open Data on AWS.

This dataset is hosted in the `s3://glo-30-hand` bucket on AWS. For more information, see: [the listing on AWS OpenData](FIXME), which is manged by [this YAML](FIXME)
This dataset is hosted in the `s3://glo-30-hand` bucket on AWS. For more information, see: [the listing on AWS OpenData](https://registry.opendata.aws/glo-30-hand/), which is managed by [this YAML](https://github.com/awslabs/open-data-registry/blob/main/datasets/glo-30-hand.yaml).

#### Management

On a merge to main, [deploy-glo-30-hand-files.yml](.github/workflows/deploy-glo30-hand-files.yml) will upload to the `s3://glo-30-hand` bucket:
* `glo-30-hand/readme.html`, which is automatically created from [`glo-30-hand/readme.md`](glo-30-hand/readme.md)
* `glo-30-hand/index.html`, which is automatically created from [`shared/index.html`](shared/index.html)

### [ITS_LIVE](its-live-data)

The Inter-mission Time Series of Land Ice Velocity and Elevation (ITS_LIVE) project has a singular mission: to accelerate ice sheet and glacier research by producing globally comprehensive, high resolution, low latency, temporally dense, multi-sensor records of land ice and ice shelf change, while minimizing barriers between the data and the user.

This dataset is hosted in the `s3://its-live-data` bucket on AWS. For more information, see: [the listing on AWS OpenData](https://registry.opendata.aws/its-live-data/), which is managed by [this YAML](https://github.com/awslabs/open-data-registry/blob/main/datasets/its-live-data.yaml).

On a merge to main, [deploy-its-live-data-files.yml](.github/workflows/deploy-its-live-data-files.yml) will upload to the `s3://its-live-data` bucket:
* `its-live-data/README.html`, which is automatically created from [`its-live-data/README.md`](its-live-data/README.md)
* `its-live-data/index.html`, which is automatically created from [`shared/index.html`](shared/index.html)
8 changes: 4 additions & 4 deletions docs/add-a-new-dataset.md
@@ -31,11 +31,11 @@ aws cloudformation deploy --profile ${AWS_PROFILE} \
--parameter-overrides OpenDataBucketName=${OPENDATA_BUCKET}
```
>[!IMPORTANT]
> Note: This stack should only be deployed once per AWS account.
> Note: This stack should only be deployed once per AWS account. It is also a good idea to enable termination protection.
After the stack is created you'll need to create an AWS CLI access key for the `github-actions` user, which you will use in the next step.

### GitGub Actions Environment
### GitHub Actions Environment

We use a GitHub Actions Environment for each dataset to store the AWS access credentials necessary to deploy the common files.

@@ -49,6 +49,6 @@ and name it the same as the S3 bucket (`${OPENDATA_BUCKET}`). When configuring t
* `AWS_SECRET_ACCESS_KEY`
* `AWS_REGION` (typically `us-west-2`)
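If you prefer the GitHub CLI over the web UI, the environment secrets above can be set from a terminal. This is a sketch, assuming `gh` is authenticated and the environment (named for the bucket) already exists on the repository:

```shell
# Set the three deployment secrets on the dataset's GitHub Actions
# environment; the names must match what the deploy workflow expects.
ENV_NAME=its-live-data   # same as the S3 bucket / ${OPENDATA_BUCKET}
gh secret set AWS_ACCESS_KEY_ID     --env "$ENV_NAME" --body "$AWS_ACCESS_KEY_ID"
gh secret set AWS_SECRET_ACCESS_KEY --env "$ENV_NAME" --body "$AWS_SECRET_ACCESS_KEY"
gh secret set AWS_REGION            --env "$ENV_NAME" --body us-west-2
```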

### Deploy action
### Deployment action

Now, using [deploy-asf-event-data-files.yml](../.github/workflows/deploy-asf-event-data-files.yml) as a template, create a deploy action for your dataset.
Now, using [deploy-asf-event-data-files.yml](../.github/workflows/deploy-asf-event-data-files.yml) as a template, create a deployment action for your dataset.
47 changes: 47 additions & 0 deletions its-live-data/README.md
@@ -0,0 +1,47 @@
# Inter-mission Time Series of Land Ice Velocity and Elevation (ITS_LIVE) dataset

The Inter-mission Time Series of Land Ice Velocity and Elevation (ITS_LIVE) project has a singular mission: to accelerate ice sheet and glacier research by producing globally comprehensive, high resolution, low latency, temporally dense, multi-sensor records of land ice and ice shelf change, while minimizing barriers between the data and the user.

ITS_LIVE data consists of:
* NetCDF Level-2 scene-pair ice flow products posted to a standard 120 m grid, derived from Landsat 4/5/7/8/9 and Sentinel-2 optical scenes and Sentinel-1 SAR scenes.
* A set of Zarr datacubes containing all scene pair data cloud-optimized for time-series analysis, with a suite of user tools available to effectively query the data.
* A suite of Level-3 products, including:
* regional and ice-sheet wide mosaics calculated monthly and annually as Cloud-Optimized GeoTIFFs

For more information about the ITS_LIVE project, please see <https://its-live.jpl.nasa.gov/>.

## Accessing the ITS_LIVE data

The ITS_LIVE data is all stored in the public `its-live-data` AWS S3 bucket, which is located in the `us-west-2` (Oregon) AWS Region, and organized under a collection of prefixes (folders) to ease access:

* `autorift_parameters/`: A collection of [autoRIFT](https://github.com/nasa-jpl/autoRIFT/) input parameter files used by the ITS_LIVE project to produce the NetCDF velocity image pairs
* `catalog_geojson/`: GeoJSON catalog of the NetCDF velocity image pairs
* `composites/`: Zarr mean annual velocities derived from the Zarr Datacubes
* `datacubes/`: Zarr DataCubes of merged image velocity data which have been cloud-optimized for time-series analysis
* `mosaics/`: NetCDF regionally compiled, mean annual surface velocities for major glacier-covered regions derived from the Zarr Datacubes
* `rgb_mosaics/`: Cloud-optimized GeoTIFF images derived from the NetCDF mosaics for easy use in GIS applications
* `vel_web_tiles/`: [Tiled web map](https://en.wikipedia.org/wiki/Tiled_web_map) PNG images derived from the NetCDF mosaics for easy use in web applications
* `velocity_image_pair/`: NetCDF velocity images derived from optical and SAR satellite image pairs using [autoRIFT](https://github.com/nasa-jpl/autoRIFT/)

To list all the top-level bucket prefixes, run:

```shell
aws s3 ls s3://its-live-data/
```

> [!CAUTION]
> There are over 1 billion objects in the `its-live-data` bucket. Recursively listing the whole bucket is **not** recommended.
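If you only need one collection, a prefix-scoped listing keeps the request cheap. A sketch, where `mosaics` is just one of the prefixes listed above:

```shell
# Scope listings to a single top-level prefix rather than the bucket root;
# a recursive listing of the whole bucket would enumerate ~1 billion keys.
PREFIX=mosaics   # any prefix from the list above
aws s3 ls "s3://its-live-data/${PREFIX}/"

# The bucket is public, so --no-sign-request works without AWS credentials.
aws s3 ls --no-sign-request "s3://its-live-data/${PREFIX}/"
```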
## Contact

If you have questions about the data itself or the processing methods used, please post in the [ITS_LIVE Community gitter](https://app.gitter.im/#/room/#its_live_community:gitter.im).

If you have questions about how the data is managed on AWS, please email the [ASF Tools Team](mailto:uaf-asf-apd@alaska.edu).

## License

The use of the ITS_LIVE data falls under the terms and conditions of the [Creative Commons Zero (CC0) 1.0 Universal](https://creativecommons.org/publicdomain/zero/1.0/) license.

---

[AWS Public Datasets](http://aws.amazon.com/public-datasets)
