Release V 0.14.2 #250

Merged · 34 commits · Mar 25, 2024

Changes from all commits:
ac20db2
Bump ASFHyP3/actions from 0.10.0 to 0.11.0
dependabot[bot] Jan 17, 2024
5cf6733
Delete .trufflehog.txt
jhkennedy Jan 17, 2024
1d60aea
Merge pull request #245 from ASFHyP3/dependabot/github_actions/ASFHyP…
jhkennedy Jan 17, 2024
4fc2594
add an option to upload the opendata
cirrusasf Mar 18, 2024
8af8a54
modify code style
cirrusasf Mar 18, 2024
9ad03cc
refactor the upload open data code
cirrusasf Mar 19, 2024
5a11809
modify code style
cirrusasf Mar 19, 2024
1ff4769
Apply suggestions from code review
forrestfwilliams Mar 19, 2024
af16ede
finish up code review additions
forrestfwilliams Mar 19, 2024
c584df1
mild refactor
forrestfwilliams Mar 19, 2024
587e2ce
correct one statement in process.py
cirrusasf Mar 19, 2024
b97aecd
add test functions in test_process.py
cirrusasf Mar 19, 2024
0787b01
add an example product file in tests/data
cirrusasf Mar 19, 2024
58c7c2d
re-organize the workflow
cirrusasf Mar 19, 2024
533eb15
Update CHANGELOG.md
cirrusasf Mar 19, 2024
cbe5d8e
clean up the code
cirrusasf Mar 19, 2024
b2527ac
Merge branch 'auto_upload_its_live_data_bucket' of github.com:ASFHyP3…
cirrusasf Mar 19, 2024
56eb37f
clean up the debug code
cirrusasf Mar 19, 2024
ce20b83
modify code style
cirrusasf Mar 19, 2024
0be1a65
modify code style
cirrusasf Mar 20, 2024
dbf5088
modify code style
cirrusasf Mar 20, 2024
deb80cf
set up access key for open data upload
forrestfwilliams Mar 20, 2024
9415247
add 4 more test cases in the test_point_to_prefix function in test_pr…
cirrusasf Mar 20, 2024
657dce3
Merge pull request #248 from ASFHyP3/access_key
forrestfwilliams Mar 20, 2024
30d6f17
change to a publish-bucket parameter to enable testing
forrestfwilliams Mar 20, 2024
cae7455
update changelog
forrestfwilliams Mar 20, 2024
61879bc
fix flake8
forrestfwilliams Mar 20, 2024
279e62f
fix flake8
forrestfwilliams Mar 20, 2024
a9260ba
Merge pull request #249 from ASFHyP3/bucket_arg
forrestfwilliams Mar 20, 2024
9ad126d
Apply suggestions from code review
cirrusasf Mar 20, 2024
ecdb419
Merge pull request #247 from ASFHyP3/auto_upload_its_live_data_bucket
cirrusasf Mar 20, 2024
553eeab
Refactor publishing open data
jhkennedy Mar 23, 2024
437fdc9
fix function name change missed in refactor
jhkennedy Mar 25, 2024
cf9b2cb
Merge pull request #251 from ASFHyP3/publish-changes
jhkennedy Mar 25, 2024
2 changes: 1 addition & 1 deletion .github/workflows/changelog.yml
@@ -13,6 +13,6 @@ on:

jobs:
  call-changelog-check-workflow:
    uses: ASFHyP3/actions/.github/workflows/reusable-changelog-check.yml@v0.10.0
    uses: ASFHyP3/actions/.github/workflows/reusable-changelog-check.yml@v0.11.0
    secrets:
      USER_TOKEN: ${{ secrets.GITHUB_TOKEN }}
2 changes: 1 addition & 1 deletion .github/workflows/create-jira-issue.yml
@@ -6,7 +6,7 @@ on:

jobs:
  call-create-jira-issue-workflow:
    uses: ASFHyP3/actions/.github/workflows/reusable-create-jira-issue.yml@v0.10.0
    uses: ASFHyP3/actions/.github/workflows/reusable-create-jira-issue.yml@v0.11.0
    secrets:
      JIRA_BASE_URL: ${{ secrets.JIRA_BASE_URL }}
      JIRA_USER_EMAIL: ${{ secrets.JIRA_USER_EMAIL }}
2 changes: 1 addition & 1 deletion .github/workflows/labeled-pr.yml
@@ -12,4 +12,4 @@ on:

jobs:
  call-labeled-pr-check-workflow:
    uses: ASFHyP3/actions/.github/workflows/reusable-labeled-pr-check.yml@v0.10.0
    uses: ASFHyP3/actions/.github/workflows/reusable-labeled-pr-check.yml@v0.11.0
2 changes: 1 addition & 1 deletion .github/workflows/release-template-comment.yml
@@ -7,6 +7,6 @@ on:

jobs:
  call-release-checklist-workflow:
    uses: ASFHyP3/actions/.github/workflows/reusable-release-checklist-comment.yml@v0.10.0
    uses: ASFHyP3/actions/.github/workflows/reusable-release-checklist-comment.yml@v0.11.0
    secrets:
      USER_TOKEN: ${{ secrets.GITHUB_TOKEN }}
2 changes: 1 addition & 1 deletion .github/workflows/release.yml
@@ -7,7 +7,7 @@ on:

jobs:
  call-release-workflow:
    uses: ASFHyP3/actions/.github/workflows/reusable-release.yml@v0.10.0
    uses: ASFHyP3/actions/.github/workflows/reusable-release.yml@v0.11.0
    with:
      release_prefix: HyP3 autoRIFT
    secrets:
4 changes: 2 additions & 2 deletions .github/workflows/static-analysis.yml
@@ -4,10 +4,10 @@ on: push

jobs:
  call-flake8-workflow:
    uses: ASFHyP3/actions/.github/workflows/reusable-flake8.yml@v0.10.0
    uses: ASFHyP3/actions/.github/workflows/reusable-flake8.yml@v0.11.0
    with:
      local_package_names: hyp3_autorift
      excludes: src/hyp3_autorift/vend

  call-secrets-analysis-workflow:
    uses: ASFHyP3/actions/.github/workflows/reusable-secrets-analysis.yml@v0.10.0
    uses: ASFHyP3/actions/.github/workflows/reusable-secrets-analysis.yml@v0.11.0
2 changes: 1 addition & 1 deletion .github/workflows/tag-version.yml
@@ -7,6 +7,6 @@ on:

jobs:
  call-bump-version-workflow:
    uses: ASFHyP3/actions/.github/workflows/reusable-bump-version.yml@v0.10.0
    uses: ASFHyP3/actions/.github/workflows/reusable-bump-version.yml@v0.11.0
    secrets:
      USER_TOKEN: ${{ secrets.TOOLS_BOT_PAK }}
6 changes: 3 additions & 3 deletions .github/workflows/test-and-build.yml
@@ -12,18 +12,18 @@ on:

jobs:
  call-pytest-workflow:
    uses: ASFHyP3/actions/.github/workflows/reusable-pytest.yml@v0.10.0
    uses: ASFHyP3/actions/.github/workflows/reusable-pytest.yml@v0.11.0
    with:
      local_package_name: hyp3_autorift
      python_versions: >-
        ["3.9"]

  call-version-info-workflow:
    uses: ASFHyP3/actions/.github/workflows/reusable-version-info.yml@v0.10.0
    uses: ASFHyP3/actions/.github/workflows/reusable-version-info.yml@v0.11.0

  call-docker-ghcr-workflow:
    needs: call-version-info-workflow
    uses: ASFHyP3/actions/.github/workflows/reusable-docker-ghcr.yml@v0.10.0
    uses: ASFHyP3/actions/.github/workflows/reusable-docker-ghcr.yml@v0.11.0
    with:
      version_tag: ${{ needs.call-version-info-workflow.outputs.version_tag }}
    secrets:
3 changes: 0 additions & 3 deletions .trufflehog.txt

This file was deleted.

4 changes: 4 additions & 0 deletions CHANGELOG.md
@@ -6,6 +6,10 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [PEP 440](https://www.python.org/dev/peps/pep-0440/)
and uses [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [0.15.0]
### Added
* The `--publish-bucket` option has been added to the HyP3 entry point to additionally publish products to an AWS bucket, such as the ITS_LIVE AWS Open Data bucket, `s3://its-live-data`.
* `upload_file_to_s3_with_publish_access_keys` to perform S3 uploads using credentials from the `PUBLISH_ACCESS_KEY_ID` and `PUBLISH_SECRET_ACCESS_KEY` environment variables.

## [0.14.1]
### Changed
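Taken together, these two additions mean each product can be mirrored to the ITS_LIVE AWS Open Data bucket under a region-based prefix. As a concrete example, the product file added to this PR's test data would land at the following key (the prefix segment is the exact value asserted by the new test in tests/test_process.py):

    s3://its-live-data/velocity_image_pair/landsatOLI/v02/S80W120/LT05_L1GS_219121_19841206_20200918_02_T2_X_LT05_L1GS_226120_19850124_20200918_02_T2_G0120V02_P000.nc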
65 changes: 63 additions & 2 deletions src/hyp3_autorift/process.py
@@ -27,7 +27,7 @@

from hyp3_autorift import geometry, image, io
from hyp3_autorift.crop import crop_netcdf_product
from hyp3_autorift.utils import get_esa_credentials
from hyp3_autorift.utils import get_esa_credentials, upload_file_to_s3_with_publish_access_keys

log = logging.getLogger(__name__)

@@ -48,6 +48,16 @@
DEFAULT_PARAMETER_FILE = '/vsicurl/http://its-live-data.s3.amazonaws.com/' \
                         'autorift_parameters/v001/autorift_landice_0120m.shp'

PLATFORM_SHORTNAME_LONGNAME_MAPPING = {
    'S1': 'sentinel1',
    'S2': 'sentinel2',
    'L4': 'landsatOLI',
    'L5': 'landsatOLI',
    'L7': 'landsatOLI',
    'L8': 'landsatOLI',
    'L9': 'landsatOLI',
}


def get_lc2_stac_json_key(scene_name: str) -> str:
    platform = get_platform(scene_name)
@@ -319,6 +329,48 @@ def apply_landsat_filtering(reference_path: str, secondary_path: str) \
    return reference_path, reference_zero_path, secondary_path, secondary_zero_path


def get_lat_lon_from_ncfile(ncfile: Path) -> Tuple[float, float]:
    with Dataset(ncfile) as ds:
        var = ds.variables['img_pair_info']
        return var.latitude, var.longitude


def point_to_region(lat: float, lon: float) -> str:
    """
    Returns a string (for example, N78W124) of a region name based on
    granule center point lat,lon
    """
    nw_hemisphere = 'N' if lat >= 0.0 else 'S'
    ew_hemisphere = 'E' if lon >= 0.0 else 'W'

    region_lat = int(10*np.trunc(np.abs(lat/10.0)))
    if region_lat == 90:  # if you are exactly at a pole, put in lat = 80 bin
        region_lat = 80

    region_lon = int(10*np.trunc(np.abs(lon/10.0)))

    if region_lon >= 180:  # if you are at the dateline, back off to the 170 bin
        region_lon = 170

    return f'{nw_hemisphere}{region_lat:02d}{ew_hemisphere}{region_lon:03d}'


def get_opendata_prefix(file: Path):
    # filenames have form GRANULE1_X_GRANULE2
    scene = file.name.split('_X_')[0]

    platform_shortname = get_platform(scene)
    lat, lon = get_lat_lon_from_ncfile(file)
    region = point_to_region(lat, lon)

    return '/'.join([
        'velocity_image_pair',
        PLATFORM_SHORTNAME_LONGNAME_MAPPING[platform_shortname],
        'v02',
        region
    ])


def process(
        reference: str,
        secondary: str,

@@ -520,6 +572,9 @@ def main():
    )
    parser.add_argument('--bucket', help='AWS bucket to upload product files to')
    parser.add_argument('--bucket-prefix', default='', help='AWS prefix (location in bucket) to add to product files')
    parser.add_argument('--publish-bucket', default='',
                        help='Additionally, publish products to this bucket. Necessary credentials must be provided '
                             'via the `PUBLISH_ACCESS_KEY_ID` and `PUBLISH_SECRET_ACCESS_KEY` environment variables.')
    parser.add_argument('--esa-username', default=None, help="Username for ESA's Copernicus Data Space Ecosystem")
    parser.add_argument('--esa-password', default=None, help="Password for ESA's Copernicus Data Space Ecosystem")
    parser.add_argument('--parameter-file', default=DEFAULT_PARAMETER_FILE,
@@ -538,9 +593,15 @@
    g1, g2 = sorted(args.granules, key=get_datetime)

    product_file, browse_file = process(g1, g2, parameter_file=args.parameter_file, naming_scheme=args.naming_scheme)
    thumbnail_file = create_thumbnail(browse_file)

    if args.bucket:
        upload_file_to_s3(product_file, args.bucket, args.bucket_prefix)
        upload_file_to_s3(browse_file, args.bucket, args.bucket_prefix)
        thumbnail_file = create_thumbnail(browse_file)
        upload_file_to_s3(thumbnail_file, args.bucket, args.bucket_prefix)

    if args.publish_bucket:
        prefix = get_opendata_prefix(product_file)
        upload_file_to_s3_with_publish_access_keys(product_file, args.publish_bucket, prefix)
        upload_file_to_s3_with_publish_access_keys(browse_file, args.publish_bucket, prefix)
        upload_file_to_s3_with_publish_access_keys(thumbnail_file, args.publish_bucket, prefix)
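To make the naming logic above concrete, here is a minimal sketch of the two new helpers in use. The expected values are exactly those asserted by the new tests further down, and the get_opendata_prefix call assumes the example netCDF product this PR adds under tests/data:

    from pathlib import Path

    from hyp3_autorift import process

    # Region names bin the granule center point into 10-degree steps, truncating toward zero
    assert process.point_to_region(63.0, -128.0) == 'N60W120'
    assert process.point_to_region(-63.0, 128.0) == 'S60E120'

    # The open-data prefix joins a fixed root, the platform long name, a version, and the region
    product = Path('tests/data/LT05_L1GS_219121_19841206_20200918_02_T2_X_'
                   'LT05_L1GS_226120_19850124_20200918_02_T2_G0120V02_P000.nc')
    assert process.get_opendata_prefix(product) == 'velocity_image_pair/landsatOLI/v02/S80W120'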
26 changes: 26 additions & 0 deletions src/hyp3_autorift/utils.py
@@ -1,9 +1,13 @@
import logging
import netrc
import os
from pathlib import Path
from platform import system
from typing import Tuple

import boto3
from hyp3lib.aws import get_content_type, get_tag_set


ESA_HOST = 'dataspace.copernicus.eu'

@@ -28,3 +32,25 @@ def get_esa_credentials() -> Tuple[str, str]:
"Please provide Copernicus Data Space Ecosystem (CDSE) credentials via the "
"ESA_USERNAME and ESA_PASSWORD environment variables, or your netrc file."
)


def upload_file_to_s3_with_publish_access_keys(path_to_file: Path, bucket: str, prefix: str = ''):
    try:
        access_key_id = os.environ['PUBLISH_ACCESS_KEY_ID']
        access_key_secret = os.environ['PUBLISH_SECRET_ACCESS_KEY']
    except KeyError:
        raise ValueError(
            'Please provide S3 Bucket upload access key credentials via the '
            'PUBLISH_ACCESS_KEY_ID and PUBLISH_SECRET_ACCESS_KEY environment variables'
        )

    s3_client = boto3.client('s3', aws_access_key_id=access_key_id, aws_secret_access_key=access_key_secret)
    key = str(Path(prefix) / path_to_file.name)
    extra_args = {'ContentType': get_content_type(key)}

    logging.info(f'Uploading s3://{bucket}/{key}')
    s3_client.upload_file(str(path_to_file), bucket, key, extra_args)

    tag_set = get_tag_set(path_to_file.name)

    s3_client.put_object_tagging(Bucket=bucket, Key=key, Tagging=tag_set)
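A hedged usage sketch for the new helper follows; the bucket name and file name are illustrative placeholders, while the environment variable names and the call signature are taken directly from the function above:

    import os
    from pathlib import Path

    from hyp3_autorift.utils import upload_file_to_s3_with_publish_access_keys

    # Both variables must be set, or the helper raises ValueError before touching S3
    os.environ['PUBLISH_ACCESS_KEY_ID'] = '...'      # placeholder credential
    os.environ['PUBLISH_SECRET_ACCESS_KEY'] = '...'  # placeholder credential

    # Uploads to s3://<bucket>/<prefix>/<file name>, setting the content type and the hyp3lib tag set
    upload_file_to_s3_with_publish_access_keys(
        Path('example_product.nc'),                  # illustrative file name
        bucket='its-live-data',
        prefix='velocity_image_pair/sentinel1/v02/N60W120',
    )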
Binary file added (contents not shown): tests/data/LT05_L1GS_219121_19841206_20200918_02_T2_X_LT05_L1GS_226120_19850124_20200918_02_T2_G0120V02_P000.nc
25 changes: 25 additions & 0 deletions tests/test_process.py
@@ -1,5 +1,6 @@
import io
from datetime import datetime
from pathlib import Path
from re import match
from unittest import mock
from unittest.mock import MagicMock, patch
@@ -415,3 +416,27 @@ def mock_apply_filter_function(scene, _):
        process.apply_landsat_filtering('LT04', 'LE07')
    assert process.apply_landsat_filtering('LT04', 'LT05') == ('LT04', None, 'LT05', None)
    assert process.apply_landsat_filtering('LT04', 'LT04') == ('LT04', None, 'LT04', None)


def test_point_to_prefix():
    assert process.point_to_region(63.0, 128.0) == 'N60E120'
    assert process.point_to_region(-63.0, 128.0) == 'S60E120'
    assert process.point_to_region(63.0, -128.0) == 'N60W120'
    assert process.point_to_region(-63.0, -128.0) == 'S60W120'
    assert process.point_to_region(0.0, 128.0) == 'N00E120'
    assert process.point_to_region(0.0, -128.0) == 'N00W120'
    assert process.point_to_region(63.0, 0.0) == 'N60E000'
    assert process.point_to_region(-63.0, 0.0) == 'S60E000'
    assert process.point_to_region(0.0, 0.0) == 'N00E000'


def test_get_lat_lon_from_ncfile():
    file = Path('tests/data/'
                'LT05_L1GS_219121_19841206_20200918_02_T2_X_LT05_L1GS_226120_19850124_20200918_02_T2_G0120V02_P000.nc')
    assert process.get_lat_lon_from_ncfile(file) == (-81.49, -128.28)


def test_get_opendata_prefix():
    file = Path('tests/data/'
                'LT05_L1GS_219121_19841206_20200918_02_T2_X_LT05_L1GS_226120_19850124_20200918_02_T2_G0120V02_P000.nc')
    assert process.get_opendata_prefix(file) == 'velocity_image_pair/landsatOLI/v02/S80W120'
18 changes: 17 additions & 1 deletion tests/test_utils.py
@@ -1,6 +1,6 @@
import pytest

from hyp3_autorift.utils import ESA_HOST, get_esa_credentials
from hyp3_autorift.utils import ESA_HOST, get_esa_credentials, upload_file_to_s3_with_publish_access_keys


def test_get_esa_credentials_env(tmp_path, monkeypatch):
@@ -45,3 +45,19 @@ def test_get_esa_credentials_missing(tmp_path, monkeypatch):
        msg = 'Please provide.*'
        with pytest.raises(ValueError, match=msg):
            get_esa_credentials()


def test_upload_file_to_s3_credentials_missing(tmp_path, monkeypatch):
    with monkeypatch.context() as m:
        m.delenv('PUBLISH_ACCESS_KEY_ID', raising=False)
        m.setenv('PUBLISH_SECRET_ACCESS_KEY', 'publish_access_key_secret')
        msg = 'Please provide.*'
        with pytest.raises(ValueError, match=msg):
            upload_file_to_s3_with_publish_access_keys('file.zip', 'myBucket')

    with monkeypatch.context() as m:
        m.setenv('PUBLISH_ACCESS_KEY_ID', 'publish_access_key_id')
        m.delenv('PUBLISH_SECRET_ACCESS_KEY', raising=False)
        msg = 'Please provide.*'
        with pytest.raises(ValueError, match=msg):
            upload_file_to_s3_with_publish_access_keys('file.zip', 'myBucket')