update QUEST and GenQuery classes for argo integration #441

Merged
100 commits merged on Sep 25, 2023

Commits
b907af5
Adding argo search and download script
kelseybisson Feb 23, 2022
fb5fc55
Create get_argo.py
kelseybisson Feb 23, 2022
715440b
begin implementing argo dataset
RomiP Feb 28, 2022
d2260d6
1st draft implementing argo dataset
RomiP Mar 8, 2022
71cfedc
implement search_data for physical argo
RomiP Apr 26, 2022
ec564fa
doctests and general cleanup for physical argo query
RomiP Jun 13, 2022
4fb5974
beginning of BGC Argo download
RomiP Jun 23, 2022
3bd2739
parse BGC profiles into DF
RomiP Jun 27, 2022
06835d5
plan to query BGC profiles
RomiP Aug 29, 2022
dac11de
validate BGC param input function
RomiP Sep 6, 2022
dd47dc5
order BGC params in order in which they should be queried
RomiP Sep 12, 2022
88722a1
fix bug in parse_into_df() - init blank df to take in union of params…
RomiP Sep 19, 2022
bf3cd70
identify profiles from initial API request containing all required pa…
RomiP Oct 3, 2022
fcb2422
creates df with only profiles that contain all user specified params
RomiP Oct 24, 2022
eb9c8ae
modified to populate prof df by querying individual profiles
RomiP Nov 21, 2022
1582f5b
finished up BGC argo download!
RomiP Nov 28, 2022
f55fd61
assert bounding box type in Argo init, begin framework for unit tests
RomiP Jan 17, 2023
d6d3872
Adding argo search and download script
kelseybisson Feb 23, 2022
7fc3b79
Create get_argo.py
kelseybisson Feb 23, 2022
195a4f1
begin implementing argo dataset
RomiP Feb 28, 2022
df34424
1st draft implementing argo dataset
RomiP Mar 8, 2022
390b7a9
implement search_data for physical argo
RomiP Apr 26, 2022
6824d27
doctests and general cleanup for physical argo query
RomiP Jun 13, 2022
58092f9
beginning of BGC Argo download
RomiP Jun 23, 2022
ae486f2
parse BGC profiles into DF
RomiP Jun 27, 2022
92f8a0d
plan to query BGC profiles
RomiP Aug 29, 2022
0285be1
validate BGC param input function
RomiP Sep 6, 2022
747af3a
order BGC params in order in which they should be queried
RomiP Sep 12, 2022
cf600c6
fix bug in parse_into_df() - init blank df to take in union of params…
RomiP Sep 19, 2022
29ee8c4
identify profiles from initial API request containing all required pa…
RomiP Oct 3, 2022
934e1a6
creates df with only profiles that contain all user specified params
RomiP Oct 24, 2022
eefcbf8
modified to populate prof df by querying individual profiles
RomiP Nov 21, 2022
55204d8
finished up BGC argo download!
RomiP Nov 28, 2022
0af53d6
assert bounding box type in Argo init, begin framework for unit tests
RomiP Jan 17, 2023
27ab9d7
need to confirm spatial extent is bbox
RomiP Feb 6, 2023
83e0e94
begin test case for available profiles
RomiP Feb 6, 2023
d23b09c
Merge remote-tracking branch 'origin/argo' into argo
RomiP Feb 6, 2023
d96c485
add tests for argo.py
JessicaS11 Feb 6, 2023
4ec53cd
add typing, add example json, and use it to test parsing
JessicaS11 Feb 13, 2023
2594153
Merge branch 'development' into argo
JessicaS11 May 22, 2023
8dfb33e
update argo to submit successful api request (update keys and values …
JessicaS11 May 22, 2023
d43da75
first pass at porting argo over to metadata+per profile download (WIP)
JessicaS11 May 30, 2023
f9c6a82
basic working argo script
JessicaS11 Jun 6, 2023
9c0de9b
simplify parameter validation (ordered list no longer needed)
JessicaS11 Jun 6, 2023
af4d8ce
add option to delete existing data before new download
JessicaS11 Jun 6, 2023
fd18b74
continue cleaning up argo.py
JessicaS11 Jun 6, 2023
df41a98
fix download_by_profile to properly store all downloaded data
JessicaS11 Jun 7, 2023
27b672b
remove old get_argo.py script
JessicaS11 Jun 7, 2023
04e392c
remove _filter_profiles function in favor of submitting data kwarg in…
JessicaS11 Jun 7, 2023
9cc0040
start filling in docstrings
JessicaS11 Jun 7, 2023
d15483b
clean up nearly duplicate functions
JessicaS11 Jun 8, 2023
d877e8b
add more docstrings
JessicaS11 Jun 8, 2023
f9b6d81
get a few minimal argo tests working
JessicaS11 Jun 12, 2023
8fcab13
add bgc argo params. begin adding merge for second download runs
JessicaS11 Jun 20, 2023
aad5053
some changes
RomiP Jun 26, 2023
8fd0083
Merge remote-tracking branch 'origin/argo' into argo
RomiP Jun 26, 2023
630415a
WIP test commit to see if can push to GH
JessicaS11 Jul 7, 2023
fe07540
WIP handling argo merge issue
JessicaS11 Jul 12, 2023
c246543
update profile to df to return df and move merging to get_dataframe
JessicaS11 Jul 20, 2023
ccb8ebf
Merge remote-tracking branch 'origin/argo' into argo
RomiP Jul 28, 2023
5851cb8
Merge remote-tracking branch 'origin/argo' into argo
RomiP Jul 28, 2023
1fd069c
merge profiles with existing df
JessicaS11 Jul 31, 2023
363dad2
clean up docstrings and code
JessicaS11 Jul 31, 2023
63d3b3b
add test_argo.py
RomiP Jul 31, 2023
a91c25f
Merge remote-tracking branch 'origin/argo' into argo
RomiP Jul 31, 2023
4602cdb
add prelim test case for adding to Argo df
RomiP Jul 31, 2023
2cdf07e
remove sandbox files
JessicaS11 Aug 14, 2023
a91c360
remove bgc argo test file
JessicaS11 Aug 14, 2023
cb367e1
update variables notebook from development
JessicaS11 Aug 14, 2023
b89840c
Merge remote-tracking branch 'origin/argo' into argo
RomiP Aug 16, 2023
381092f
simplify import statements
JessicaS11 Aug 16, 2023
283748e
quickfix for granules error
zachghiaccio Aug 18, 2023
7893307
draft subpage on available QUEST datasets
zachghiaccio Aug 18, 2023
949ffee
small reference fix in text
zachghiaccio Aug 18, 2023
7414c85
add reference to top of .rst file
zachghiaccio Aug 18, 2023
7655995
Merge remote-tracking branch 'origin/argo' into argo
RomiP Aug 19, 2023
63e1b57
test argo df merge
RomiP Aug 19, 2023
38cd46f
add functionality to Quest class to pass search criteria to all datasets
RomiP Aug 19, 2023
5b6c65b
Merge pull request #436 from zachghiaccio/argo
zachghiaccio Aug 25, 2023
37d19b6
Merge branch 'development' into argo
JessicaS11 Aug 28, 2023
f5655b4
add functionality to Quest class to pass search criteria to all datasets
RomiP Aug 19, 2023
8b94279
update dataset docstrings; reorder argo.py to match
JessicaS11 Aug 23, 2023
88f8f1f
implement quest search+download for IS2
JessicaS11 Aug 24, 2023
6820575
move spatial and temporal properties from query to genquery
JessicaS11 Aug 28, 2023
efde1f7
add query docstring test for cycles,tracks to test file
JessicaS11 Aug 30, 2023
3aced1f
add quest test module
JessicaS11 Aug 30, 2023
f1e82bd
standardize print outputs for quest search and download; is2 download…
JessicaS11 Aug 30, 2023
493adff
remove extra files from this branch
JessicaS11 Aug 30, 2023
5a65517
comment out argo portions of quest for PR
JessicaS11 Aug 30, 2023
4a984e2
remove argo-branch-only init file
JessicaS11 Aug 30, 2023
b80ccf3
remove argo script from branch
JessicaS11 Aug 30, 2023
a08f59d
remove argo test file from branch
JessicaS11 Aug 30, 2023
1f3f3da
comment out another line of argo stuff
JessicaS11 Aug 30, 2023
866cb48
Update quest.py
kelseybisson Sep 6, 2023
26d7930
Update test_quest.py
kelseybisson Sep 6, 2023
2467e4f
Update dataset.py
kelseybisson Sep 6, 2023
0a8f558
Update quest.py
kelseybisson Sep 6, 2023
cc6da09
Merge remote-tracking branch 'origin/shared_search' into shared_search
RomiP Sep 22, 2023
0de8716
catch error with downloading datasets in Quest; template test case fo…
RomiP Sep 25, 2023
0a9c81c
Merge branch 'development' into shared_search
RomiP Sep 25, 2023
25 changes: 25 additions & 0 deletions doc/source/contributing/quest-available-datasets.rst
@@ -0,0 +1,25 @@
.. _quest_supported_label:

QUEST Supported Datasets
========================

On this page, we outline the datasets that are supported by the QUEST module. Click on the links for each dataset to view information about the API and sensor/data platform used.


List of Datasets
----------------

* `Argo <https://argo.ucsd.edu/data/>`_
* The Argo mission involves a series of floats designed to capture vertical ocean profiles of temperature, salinity, and pressure down to ~2000 m. Some floats are part of BGC-Argo, which also collects data relevant to biogeochemical applications: oxygen, nitrate, chlorophyll, backscatter, and solar irradiance.
* (Link Kelsey's paper here)
* (Link to example workbook here)


Adding a Dataset to QUEST
-------------------------

Want to add a new dataset to QUEST? No problem! QUEST includes a template script (``dataset.py``) that may be used to create your own querying module for a dataset of interest.

Guidelines on how to construct your dataset module may be found here: (link to be added)

Once you have developed a script with the template, you may request that the module be added to QUEST via GitHub. Please see the How to Contribute page (:ref:`dev_guide_label`) for instructions on contributing to icepyx.
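
As a rough, hypothetical sketch of what such a module could look like (the import path, class name, and method names below are illustrative assumptions; the required interface is defined by the ``dataset.py`` template itself):

.. code-block:: python

    # Hypothetical QUEST dataset module sketch. The import path, class name,
    # and method names are assumptions, not the template's exact interface.
    from icepyx.quest.dataset_scripts.dataset import DataSet  # assumed location of the template base class


    class MyDataset(DataSet):
        """Query and download a hypothetical dataset within a QUEST workflow."""

        def __init__(self, spatial_extent, date_range):
            # Reuse QUEST's shared spatial/temporal handling from the base class.
            super().__init__(spatial_extent, date_range)

        def search_data(self):
            # Query the dataset's API for records within the stored spatial
            # extent and date range; return a summary (e.g., a DataFrame).
            raise NotImplementedError

        def download(self, out_path):
            # Retrieve the records found by search_data() and save them to out_path.
            raise NotImplementedError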
345 changes: 173 additions & 172 deletions icepyx/core/query.py
@@ -12,11 +12,9 @@
import icepyx.core.APIformatting as apifmt
from icepyx.core.auth import EarthdataAuthMixin
import icepyx.core.granules as granules
from icepyx.core.granules import Granules as Granules
# QUESTION: why doesn't from granules import Granules work, since granules=icepyx.core.granules?
from icepyx.core.granules import Granules
import icepyx.core.is2ref as is2ref

# QUESTION: why doesn't from granules import Granules as Granules work, since granules=icepyx.core.granules?
# from icepyx.core.granules import Granules
import icepyx.core.spatial as spat
import icepyx.core.temporal as tp
import icepyx.core.validate_inputs as val
@@ -148,6 +146,177 @@ def __str__(self):
)
return str

# ----------------------------------------------------------------------
# Properties

@property
def temporal(self):
"""
Return the Temporal object containing date/time range information for the query object.

See Also
--------
temporal.Temporal.start
temporal.Temporal.end
temporal.Temporal

Examples
--------
>>> reg_a = GenQuery([-55, 68, -48, 71],['2019-02-20','2019-02-28'])
>>> print(reg_a.temporal)
Start date and time: 2019-02-20 00:00:00
End date and time: 2019-02-28 23:59:59

>>> reg_a = GenQuery([-55, 68, -48, 71],cycles=['03','04','05','06','07'], tracks=['0849','0902'])
>>> print(reg_a.temporal)
['No temporal parameters set']
"""

if hasattr(self, "_temporal"):
return self._temporal
else:
return ["No temporal parameters set"]

@property
def spatial(self):
"""
Return the spatial object, which provides the underlying functionality for validating
and formatting geospatial objects. The spatial object has several properties to enable
user access to the stored spatial extent in multiple formats.

See Also
--------
spatial.Spatial.spatial_extent
spatial.Spatial.extent_type
spatial.Spatial.extent_file
spatial.Spatial

Examples
--------
>>> reg_a = ipx.GenQuery([-55, 68, -48, 71],['2019-02-20','2019-02-28'])
>>> reg_a.spatial # doctest: +SKIP
<icepyx.core.spatial.Spatial at [location]>

>>> print(reg_a.spatial)
Extent type: bounding_box
Coordinates: [-55.0, 68.0, -48.0, 71.0]

"""
return self._spatial

@property
def spatial_extent(self):
"""
Return the spatial extent of the query object.
The spatial extent is returned as the input type (which depends on how
you initially entered your spatial data) followed by the geometry data.
Bounding box data is [lower-left-longitude, lower-left-latitude, upper-right-longitude, upper-right-latitude].
Polygon data is [longitude1, latitude1, longitude2, latitude2,
... longitude_n, latitude_n, longitude1, latitude1].

Returns
-------
tuple of length 2
First tuple element is the spatial type ("bounding_box" or "polygon").
Second tuple element is the spatial extent as a list of coordinates.

Examples
--------

# Note: coordinates returned as float, not int
>>> reg_a = GenQuery([-55, 68, -48, 71],['2019-02-20','2019-02-28'])
>>> reg_a.spatial_extent
('bounding_box', [-55.0, 68.0, -48.0, 71.0])

>>> reg_a = GenQuery([(-55, 68), (-55, 71), (-48, 71), (-48, 68), (-55, 68)],['2019-02-20','2019-02-28'])
>>> reg_a.spatial_extent
('polygon', [-55.0, 68.0, -55.0, 71.0, -48.0, 71.0, -48.0, 68.0, -55.0, 68.0])

# NOTE Is this where we wanted to put the file-based test/example?
# The test file path is: examples/supporting_files/simple_test_poly.gpkg

See Also
--------
Spatial.extent
Spatial.extent_type
Spatial.extent_as_gdf

"""

return (self._spatial._ext_type, self._spatial._spatial_ext)

@property
def dates(self):
"""
Return the date range of the query object.
Dates are returned as a list containing the start and end dates, inclusive, in that order, formatted as 'YYYY-MM-DD' strings.

Examples
--------
>>> reg_a = ipx.GenQuery([-55, 68, -48, 71],['2019-02-20','2019-02-28'])
>>> reg_a.dates
['2019-02-20', '2019-02-28']

>>> reg_a = GenQuery([-55, 68, -48, 71])
>>> reg_a.dates
['No temporal parameters set']
"""
if not hasattr(self, "_temporal"):
return ["No temporal parameters set"]
else:
return [
self._temporal._start.strftime("%Y-%m-%d"),
self._temporal._end.strftime("%Y-%m-%d"),
] # could also use self._start.date()

@property
def start_time(self):
"""
Return the start time specified for the start date.

Examples
--------
>>> reg_a = ipx.GenQuery([-55, 68, -48, 71],['2019-02-20','2019-02-28'])
>>> reg_a.start_time
'00:00:00'

>>> reg_a = ipx.GenQuery([-55, 68, -48, 71],['2019-02-20','2019-02-28'], start_time='12:30:30')
>>> reg_a.start_time
'12:30:30'

>>> reg_a = GenQuery([-55, 68, -48, 71])
>>> reg_a.start_time
['No temporal parameters set']
"""
if not hasattr(self, "_temporal"):
return ["No temporal parameters set"]
else:
return self._temporal._start.strftime("%H:%M:%S")

@property
def end_time(self):
"""
Return the end time specified for the end date.

Examples
--------
>>> reg_a = ipx.GenQuery([-55, 68, -48, 71],['2019-02-20','2019-02-28'])
>>> reg_a.end_time
'23:59:59'

>>> reg_a = ipx.GenQuery([-55, 68, -48, 71],['2019-02-20','2019-02-28'], end_time='10:20:20')
>>> reg_a.end_time
'10:20:20'

>>> reg_a = GenQuery([-55, 68, -48, 71])
>>> reg_a.end_time
['No temporal parameters set']
"""
if not hasattr(self, "_temporal"):
return ["No temporal parameters set"]
else:
return self._temporal._end.strftime("%H:%M:%S")


# DevGoal: update docs throughout to allow for polygon spatial extent
# Note: add files to docstring once implemented
@@ -333,174 +502,6 @@ def product_version(self):
"""
return self._version

@property
def temporal(self):
"""
Return the Temporal object containing date/time range information for the query object.

See Also
--------
temporal.Temporal.start
temporal.Temporal.end
temporal.Temporal

Examples
--------
>>> reg_a = Query('ATL06',[-55, 68, -48, 71],['2019-02-20','2019-02-28'])
>>> print(reg_a.temporal)
Start date and time: 2019-02-20 00:00:00
End date and time: 2019-02-28 23:59:59

>>> reg_a = Query('ATL06',[-55, 68, -48, 71],cycles=['03','04','05','06','07'], tracks=['0849','0902'])
>>> print(reg_a.temporal)
['No temporal parameters set']
"""

if hasattr(self, "_temporal"):
return self._temporal
else:
return ["No temporal parameters set"]

@property
def spatial(self):
"""
Return the spatial object, which provides the underlying functionality for validating
and formatting geospatial objects. The spatial object has several properties to enable
user access to the stored spatial extent in multiple formats.

See Also
--------
spatial.Spatial.spatial_extent
spatial.Spatial.extent_type
spatial.Spatial.extent_file
spatial.Spatial

Examples
--------
>>> reg_a = ipx.Query('ATL06',[-55, 68, -48, 71],['2019-02-20','2019-02-28'])
>>> reg_a.spatial # doctest: +SKIP
<icepyx.core.spatial.Spatial at [location]>

>>> print(reg_a.spatial)
Extent type: bounding_box
Coordinates: [-55.0, 68.0, -48.0, 71.0]

"""
return self._spatial

@property
def spatial_extent(self):
"""
Return the spatial extent of the query object.
The spatial extent is returned as the input type (which depends on how
you initially entered your spatial data) followed by the geometry data.
Bounding box data is [lower-left-longitude, lower-left-latitude, upper-right-longitude, upper-right-latitude].
Polygon data is [longitude1, latitude1, longitude2, latitude2,
... longitude_n, latitude_n, longitude1, latitude1].

Returns
-------
tuple of length 2
First tuple element is the spatial type ("bounding_box" or "polygon").
Second tuple element is the spatial extent as a list of coordinates.

Examples
--------

# Note: coordinates returned as float, not int
>>> reg_a = Query('ATL06',[-55, 68, -48, 71],['2019-02-20','2019-02-28'])
>>> reg_a.spatial_extent
('bounding_box', [-55.0, 68.0, -48.0, 71.0])

>>> reg_a = Query('ATL06',[(-55, 68), (-55, 71), (-48, 71), (-48, 68), (-55, 68)],['2019-02-20','2019-02-28'])
>>> reg_a.spatial_extent
('polygon', [-55.0, 68.0, -55.0, 71.0, -48.0, 71.0, -48.0, 68.0, -55.0, 68.0])

# NOTE Is this where we wanted to put the file-based test/example?
# The test file path is: examples/supporting_files/simple_test_poly.gpkg

See Also
--------
Spatial.extent
Spatial.extent_type
Spatial.extent_as_gdf

"""

return (self._spatial._ext_type, self._spatial._spatial_ext)

@property
def dates(self):
"""
Return the date range of the query object.
Dates are returned as a list containing the start and end dates, inclusive, in that order, formatted as 'YYYY-MM-DD' strings.

Examples
--------
>>> reg_a = ipx.Query('ATL06',[-55, 68, -48, 71],['2019-02-20','2019-02-28'])
>>> reg_a.dates
['2019-02-20', '2019-02-28']

>>> reg_a = Query('ATL06',[-55, 68, -48, 71],cycles=['03','04','05','06','07'], tracks=['0849','0902'])
>>> reg_a.dates
['No temporal parameters set']
"""
if not hasattr(self, "_temporal"):
return ["No temporal parameters set"]
else:
return [
self._temporal._start.strftime("%Y-%m-%d"),
self._temporal._end.strftime("%Y-%m-%d"),
] # could also use self._start.date()

@property
def start_time(self):
"""
Return the start time specified for the start date.

Examples
--------
>>> reg_a = ipx.Query('ATL06',[-55, 68, -48, 71],['2019-02-20','2019-02-28'])
>>> reg_a.start_time
'00:00:00'

>>> reg_a = ipx.Query('ATL06',[-55, 68, -48, 71],['2019-02-20','2019-02-28'], start_time='12:30:30')
>>> reg_a.start_time
'12:30:30'

>>> reg_a = Query('ATL06',[-55, 68, -48, 71],cycles=['03','04','05','06','07'], tracks=['0849','0902'])
>>> reg_a.start_time
['No temporal parameters set']
"""
if not hasattr(self, "_temporal"):
return ["No temporal parameters set"]
else:
return self._temporal._start.strftime("%H:%M:%S")

@property
def end_time(self):
"""
Return the end time specified for the end date.

Examples
--------
>>> reg_a = ipx.Query('ATL06',[-55, 68, -48, 71],['2019-02-20','2019-02-28'])
>>> reg_a.end_time
'23:59:59'

>>> reg_a = ipx.Query('ATL06',[-55, 68, -48, 71],['2019-02-20','2019-02-28'], end_time='10:20:20')
>>> reg_a.end_time
'10:20:20'

>>> reg_a = Query('ATL06',[-55, 68, -48, 71],cycles=['03','04','05','06','07'], tracks=['0849','0902'])
>>> reg_a.end_time
['No temporal parameters set']
"""
if not hasattr(self, "_temporal"):
return ["No temporal parameters set"]
else:
return self._temporal._end.strftime("%H:%M:%S")

@property
def cycles(self):
"""
(remainder of query.py diff truncated)
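
With the spatial and temporal properties moved up to GenQuery, both GenQuery and its Query subclass expose the same accessors. A minimal usage sketch, assuming icepyx is imported as ipx (as in the doctests above):

    import icepyx as ipx

    # GenQuery now carries the spatial/temporal accessors directly
    reg = ipx.GenQuery([-55, 68, -48, 71], ['2019-02-20', '2019-02-28'])
    print(reg.spatial_extent)            # ('bounding_box', [-55.0, 68.0, -48.0, 71.0])
    print(reg.dates)                     # ['2019-02-20', '2019-02-28']
    print(reg.start_time, reg.end_time)  # 00:00:00 23:59:59

    # Query inherits from GenQuery, so ICESat-2 queries keep the same properties
    reg_a = ipx.Query('ATL06', [-55, 68, -48, 71], ['2019-02-20', '2019-02-28'])
    print(reg_a.temporal)                # start/end date and time for the query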
Empty file removed: icepyx/quest/__init__.py