Merge branch 'NCAR:stable' into develop_satellite
mlirenzhenmayi authored Sep 6, 2023
2 parents 51b7539 + 4f2efa3 commit 3da061c
Showing 22 changed files with 938 additions and 435 deletions.
14 changes: 11 additions & 3 deletions .github/workflows/ci.yml
@@ -25,21 +25,26 @@ jobs:
- uses: actions/checkout@v3

- name: Set up Python (micromamba)
uses: mamba-org/provision-with-micromamba@v13
uses: mamba-org/provision-with-micromamba@v15
with:
environment-file: environment-dev.yml
cache-env: true
extra-specs: |
python=${{ matrix.python-version }}
- name: Test with pytest
run: pytest -n auto -v
run: pytest -n auto -v -k 'not aqs'

- name: Test with pytspack installed
run: |
pip install https://github.com/noaa-oar-arl/pytspack/archive/master.zip
pytest -n auto -v -k with_pytspack
- name: Downgrade OpenSSL and test AQS
run: |
micromamba install 'openssl <3'
pytest -n auto -v -k aqs
docs:
name: Check docs build
runs-on: ubuntu-latest
@@ -52,11 +57,14 @@ jobs:
- uses: actions/checkout@v3

- name: Set up Python (micromamba)
uses: mamba-org/provision-with-micromamba@v13
uses: mamba-org/provision-with-micromamba@v15
with:
environment-file: docs/environment-docs.yml
cache-env: true

- name: Downgrade OpenSSL (for AQS URL linkcheck)
run: micromamba install 'openssl <3'

- name: linkcheck
run: sphinx-build -b linkcheck docs docs/_build/linkcheck

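The new CI steps split the AQS-dependent tests out and run them only after pinning ``openssl <3``, presumably because the EPA AQS download endpoint does not negotiate cleanly with OpenSSL 3. A minimal sketch of how the same guard could be expressed inside the test suite itself; the test name and reason string are illustrative, not taken from the repository:

.. code-block:: python

    import ssl

    import pytest

    # Skip AQS-dependent tests when the interpreter is linked against OpenSSL 3,
    # mirroring the ``-k 'not aqs'`` / ``micromamba install 'openssl <3'`` split
    # used in the workflow above.
    requires_legacy_openssl = pytest.mark.skipif(
        ssl.OPENSSL_VERSION_INFO[0] >= 3,
        reason="AQS retrieval currently needs OpenSSL < 3",
    )

    @requires_legacy_openssl
    def test_aqs_daily():  # hypothetical test name
        ...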
10 changes: 5 additions & 5 deletions .pre-commit-config.yaml
@@ -1,30 +1,30 @@
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: "v4.3.0"
rev: "v4.4.0"
hooks:
- id: trailing-whitespace
- id: end-of-file-fixer
- id: check-docstring-first
- id: check-yaml

- repo: https://github.com/asottile/pyupgrade
rev: "v2.38.0"
rev: "v3.3.1"
hooks:
- id: pyupgrade
args: [--py36-plus]

- repo: https://github.com/PyCQA/isort
rev: "5.10.1"
rev: "5.12.0"
hooks:
- id: isort

- repo: https://github.com/psf/black
rev: "22.8.0"
rev: "23.1.0"
hooks:
- id: black

- repo: https://github.com/PyCQA/flake8
rev: "5.0.4"
rev: "6.0.0"
hooks:
- id: flake8

5 changes: 5 additions & 0 deletions docs/conf.py
@@ -56,6 +56,11 @@
# The name of the Pygments (syntax highlighting) style to use.
pygments_style = "sphinx"

linkcheck_ignore = [
"https://doi.org/10.1080/10473289.2005.10464718",
"https://www.camx.com",
]

# -- Extension configuration -------------------------------------------------

extlinks = {
26 changes: 13 additions & 13 deletions docs/tutorial/improve_trends_kmeans.rst
@@ -67,7 +67,7 @@ Now we will load the data (in this case the file is
self._setitem_with_indexer(indexer, value)
Lets look at the dataframe
Let's look at the dataframe.

.. code-block:: python
@@ -438,8 +438,8 @@ Lets look at the dataframe



Now this is in the long pandas format. Lets use the
monet.util.tools.long_to_wide utility to reformat the dataframe into a
Now this is in the long pandas format. Let's use the
``monet.util.tools.long_to_wide`` utility to reformat the dataframe into a
wide format.

.. code-block:: python
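(The tutorial's code cell is collapsed in this diff view. As a hedged illustration only, the wide-format conversion with the utility named above might look roughly like this; the exact call signature and the ``df`` name are assumptions, not confirmed by the diff.)

.. code-block:: python

    from monet.util import tools

    # Pivot the long-format IMPROVE records (one row per site/time/variable)
    # into one column per measured variable.
    df_wide = tools.long_to_wide(df)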
@@ -622,7 +622,7 @@ wide format.



Lets now plot some of the different measurements with time from a site.
Let's now plot some of the different measurements with time from a site.
In this case we will look at the PHOE1 site in Phoenix, Arizona.

.. code-block:: python
@@ -828,7 +828,7 @@ Let’s look at SIf as an example from ACAD1.
.. image:: improve_trends_kmeans_files/improve_trends_kmeans_11_1.png


Now this is good but lets resample to see if we can see a trend.
Now this is good, but let's resample to see if we can see a trend.

.. code-block:: python
@@ -840,10 +840,10 @@ Now this is good but lets resample to see if we can see a trend.
.. image:: improve_trends_kmeans_files/improve_trends_kmeans_13_0.png


Simply resampling is fine but lets try to get a signal out using a
kolmogorov-zerbenko filter. See
https://www.tandfonline.com/doi/pdf/10.1080/10473289.2005.10464718 for
more information
Simply resampling is fine, but let's try to get a signal out using a
Kolmogorov--Zurbenko filter. See
https://doi.org/10.1080/10473289.2005.10464718 for
more information.

.. code-block:: python
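(The filtering cell itself is collapsed here. For reference, a Kolmogorov-Zurbenko filter is simply an m-point moving average applied k times; a minimal pandas sketch, with the window and iteration counts chosen only for illustration:)

.. code-block:: python

    import pandas as pd

    def kz_filter(series: pd.Series, window: int, iterations: int) -> pd.Series:
        """Kolmogorov-Zurbenko filter: a centered moving average applied repeatedly."""
        out = series.copy()
        for _ in range(iterations):
            out = out.rolling(window, center=True, min_periods=1).mean()
        return out

    # e.g. smooth a daily SIf series to pull out the multi-year trend
    # (``df_site`` is a hypothetical single-site dataframe):
    # sif_smooth = kz_filter(df_site["SIf"].resample("D").mean(), window=365, iterations=3)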
@@ -878,7 +878,7 @@ some tools from sklearn to use in our analysis.

.. code-block:: python
from sklearn.preprocessing import RobustScaler #to scale our data
from sklearn.preprocessing import RobustScaler # to scale our data
from sklearn.cluster import KMeans # clustering algorithm
First we want to separate out different variables that may be useful
@@ -968,7 +968,7 @@ NaN values so let us go ahead and do that.


Usually, with sklearn it is better to scale the data first before
putting it through the algorithm. We will use th RobustScaler to do
putting it through the algorithm. We will use the RobustScaler to do
this.

.. code-block:: python
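(This cell is also collapsed. A sketch of the scaling step, where ``X`` stands in for the NaN-free feature array built above:)

.. code-block:: python

    from sklearn.preprocessing import RobustScaler

    # RobustScaler centers on the median and scales by the interquartile range,
    # so a handful of extreme dust events does not dominate the clustering.
    X_scaled = RobustScaler().fit_transform(X)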
@@ -983,14 +983,14 @@ analysis.
km = KMeans(n_clusters=2).fit(X_scaled)
The clusters can be found under km.labels\_ . These are integers
The clusters can be found under ``km.labels_``. These are integers
representing the different clusters.

.. code-block:: python
clusters = km.labels_
Lets plot this so that we can see where there is dust.
Let's plot this so that we can see where there is dust.

.. code-block:: python
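(The plotting cell is collapsed in the diff. One hedged way to visualize the two clusters; the ``dfp`` dataframe and column choice are assumptions for illustration, while ``clusters`` comes from the cell above:)

.. code-block:: python

    import matplotlib.pyplot as plt

    # Color each observation by its KMeans label; high-dust days should fall
    # into one cluster if the fine-soil signal separates cleanly.
    fig, ax = plt.subplots(figsize=(10, 4))
    sc = ax.scatter(dfp.index, dfp["SIf"], c=clusters, cmap="viridis", s=10)
    ax.set_ylabel("SIf")
    fig.colorbar(sc, label="cluster")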
1 change: 1 addition & 0 deletions environment-dev.yml
@@ -16,6 +16,7 @@ dependencies:
#
# optional
- joblib
- lxml
- pyhdf
- requests
#
2 changes: 1 addition & 1 deletion monetio/__init__.py
@@ -4,7 +4,7 @@
from .profile import geoms, icartt, tolnet
from .sat import goes

__version__ = "0.2.2"
__version__ = "0.2.3"

__all__ = [
"__version__",
2 changes: 1 addition & 1 deletion monetio/models/_wrfchem_mm.py
@@ -189,7 +189,7 @@ def open_mfdataset(
if "pm25_om" in list_calc_sum:
dset = add_lazy_om_pm25(dset, dict_sum)

dset = dset.reset_index(["XTIME", "datetime"], drop=True)
dset = dset.reset_coords(["XTIME", "datetime"], drop=True)
if not surf_only_nc:
# Reset more variables
dset = dset.rename(
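The one-line change above swaps ``reset_index`` for ``reset_coords``: in xarray, ``reset_index`` acts on dimension-coordinate indexes, while ``reset_coords`` demotes (or, with ``drop=True``, removes) non-index coordinates such as WRF's ``XTIME``. A small self-contained sketch of the behavior; the variable names mirror the diff, but the data are made up:

.. code-block:: python

    import numpy as np
    import xarray as xr

    times = np.array(["2023-01-01T00", "2023-01-01T01"], dtype="datetime64[ns]")
    dset = xr.Dataset(
        {"T2": (("time", "y", "x"), np.zeros((2, 3, 4)))},
        coords={
            "time": times,
            "XTIME": ("time", [0.0, 60.0]),
            "datetime": ("time", times),
        },
    )

    # Drops the two auxiliary coordinates entirely instead of trying to reset
    # an index they do not have.
    dset = dset.reset_coords(["XTIME", "datetime"], drop=True)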
1 change: 0 additions & 1 deletion monetio/models/cdump2netcdf.py
@@ -114,7 +114,6 @@ def handle_levels(levlist):

# def cdump2awips(flist, outname, format='NETCDF4', d1=None, d2=None):
def cdump2awips(xrash1, dt, outname, mscale=1, munit="unit", format="NETCDF4", d1=None, d2=None):

# mass loading should be in g/m2 to compare to satellite.
# concentration should be in mg/m3 to compare to threshold levels.

