Use GitHub actions for unit testing #745

Merged: 68 commits, Sep 20, 2023

Commits
11a9eb8
Add GitHub Actions and Dockerfile for test environment
peanutfun Jan 12, 2023
dc37cec
Fix env var query in github workflow
peanutfun Jan 12, 2023
7a880a3
Revert "Fix env var query in github workflow"
peanutfun Jan 12, 2023
8eb75e2
Try specifying Docker tag with variables
peanutfun Jan 12, 2023
dfefe7b
Hardcode Docker Hub namespace
peanutfun Jan 12, 2023
bec8823
Conditionally run jobs and try to fix unittest job
peanutfun Jan 12, 2023
f5ef2d5
Fix syntax error in github workflow
peanutfun Jan 12, 2023
a4806c0
Do not call make in github workflow
peanutfun Jan 12, 2023
84223c4
Try fixing the shell inside Docker container for github workflow
peanutfun Jan 12, 2023
d9b3696
Call micromamba directly in github workflow
peanutfun Jan 12, 2023
9bf1054
Use 'micromamba install' because 'update' does not update pip deps
peanutfun Jan 13, 2023
b9e9c25
Split GitHub workflows
peanutfun Jan 13, 2023
9302bf2
Consolidate GitHub Workflows
peanutfun Jan 16, 2023
bb320bb
Try fixing syntax for GitHub workflow expression
peanutfun Jan 16, 2023
e43e582
Fix typo in docker workflow
peanutfun Jan 16, 2023
0022546
Fix list input for docker.yml workflow
peanutfun Jan 16, 2023
36c5cd1
Run all tests in GitHub workflow
peanutfun Jan 16, 2023
e38a835
Do not use 'defaults' in workflow
peanutfun Jan 16, 2023
9e3b15b
Fix stray 'defaults' in workflow
peanutfun Jan 16, 2023
089ba8e
Rename testing steps
peanutfun Jan 16, 2023
9be8632
Do not use multiprocessing for integration tests
peanutfun Jan 16, 2023
ad5cb0f
Revert climada.conf to 9302bf2a
peanutfun Jan 16, 2023
58601ed
Revert "Do not use multiprocessing for integration tests"
peanutfun Jan 17, 2023
ab73601
Skip failing integration tests for now
peanutfun Jan 20, 2023
3fb11cc
Add action to publish test results
peanutfun Jan 20, 2023
c6e8949
Run pipeline on branches and pull requests
peanutfun Jan 20, 2023
b68c4a3
Grant permissions from top-level job
peanutfun Jan 20, 2023
bc5499b
Use other XML test reports for displaying results
peanutfun Jan 20, 2023
d8d2bda
Merge branch 'develop' into github-actions
peanutfun Feb 9, 2023
7bf2c5e
Add new workflow using micromamba directly
peanutfun Feb 9, 2023
a2afaa7
Add data and notebook tests to new workflow
peanutfun Feb 9, 2023
fe64e65
Separately build conda environment
peanutfun Feb 24, 2023
615779d
Add integration tests and separate test result reporting
peanutfun Feb 24, 2023
ff98765
Fix missing 'runs-on'
peanutfun Feb 24, 2023
95505f1
ci: Add ipython to extra requirements
peanutfun Feb 24, 2023
b568c69
ci: Fix publish results
peanutfun Feb 24, 2023
aafd6b2
Remove all but the mamba pipelines
peanutfun Mar 28, 2023
18fb3ed
Run test matrix with different Python versions
peanutfun Mar 28, 2023
270afab
Fix Python version parsing in CI
peanutfun Mar 28, 2023
e6b8c1e
Add linting job to CI
peanutfun Mar 28, 2023
94d60f2
Skip integration tests in GitHub CI
peanutfun Mar 28, 2023
5bc9e5d
Use custom conda environment cache key in CI
peanutfun Mar 28, 2023
61d1cb4
Add matchers and custom linting step to CI
peanutfun Mar 28, 2023
1adb4bf
Ignore pylint errors in CI
peanutfun Mar 28, 2023
566b7fb
ci: Debug matcher path
peanutfun Mar 29, 2023
41bb4c2
ci: Lint with ruff
peanutfun Mar 29, 2023
60ee28f
ci: Fix Python target version for ruff
peanutfun Mar 29, 2023
dc3c556
ci: Add permissions for publishing checks to test job
peanutfun Mar 29, 2023
cda0f60
Revert "ci: Lint with ruff"
peanutfun Mar 29, 2023
38ef6ee
Merge branch 'develop' into github-actions
peanutfun Jun 23, 2023
c0ed9e1
Update CI configuration
peanutfun Jun 23, 2023
6ed83e6
Stop testing Python 3.11 due to incompatibilities with dataclass
peanutfun Jun 26, 2023
0dd27cb
Merge branch 'develop' into github-actions
peanutfun Jul 4, 2023
4807bab
Revert "Stop testing Python 3.11 due to incompatibilities with datacl…
peanutfun Jul 4, 2023
8dd46d7
Set environment caching for GitHub workflow
peanutfun Jul 4, 2023
e1e52a2
Remove unused files
peanutfun Jul 4, 2023
36f2ed4
Fix mutable default values in ImpactFreqCurve
peanutfun Aug 17, 2023
4335e22
Fix Nightlight and LitPop tests
peanutfun Aug 17, 2023
6f1da97
Merge branch 'develop' into github-actions
peanutfun Aug 17, 2023
109d994
Fix accuracy in default Emanuel impact func test
peanutfun Aug 17, 2023
c18cb76
Improve names in Github CI
peanutfun Aug 17, 2023
d55e0ea
Disallow Python 3.11 for Climada
peanutfun Aug 17, 2023
aff5f05
Merge branch 'develop' into github-actions
peanutfun Aug 28, 2023
a9a0914
Merge branch 'github-actions' of https://github.com/CLIMADA-project/c…
peanutfun Aug 28, 2023
ceab933
Try testing Python v3.11 again
peanutfun Aug 28, 2023
f42da0a
Improve GitHub Actions documentation
peanutfun Sep 19, 2023
f00bf3e
Merge branch 'develop' into github-actions
peanutfun Sep 19, 2023
d06051d
Merge branch 'develop' into github-actions
peanutfun Sep 19, 2023
Changes from all commits
70 changes: 70 additions & 0 deletions .github/workflows/ci.yml
@@ -0,0 +1,70 @@
name: GitHub CI

# Execute this for every push
on: [push]

# Use bash explicitly for being able to enter the conda environment
defaults:
  run:
    shell: bash -l {0}

jobs:
  build-and-test:
    name: Build Env, Install, Unit Tests
    runs-on: ubuntu-latest
    permissions:
      # For publishing results
      checks: write

    # Run this test for different Python versions
    strategy:
      # Do not abort other tests if only a single one fails
      fail-fast: false
      matrix:
        python-version: ["3.9", "3.10", "3.11"]

    steps:
      -
        name: Checkout Repo
        uses: actions/checkout@v3
      -
        # Store the current date to use it as cache key for the environment
        name: Get current date
        id: date
        run: echo "date=$(date +%Y-%m-%d)" >> "${GITHUB_OUTPUT}"
      -
        name: Create Environment with Mamba
        uses: mamba-org/setup-micromamba@v1
        with:
          environment-name: climada_env_${{ matrix.python-version }}
          environment-file: requirements/env_climada.yml
          create-args: >-
            python=${{ matrix.python-version }}
            make
          init-shell: >-
            bash
          # Persist environment for branch, Python version, single day
          cache-environment-key: env-${{ github.ref }}-${{ matrix.python-version }}-${{ steps.date.outputs.date }}
      -
        name: Install CLIMADA
        run: |
          python -m pip install ".[test]"
      -
        name: Run Unit Tests
        run: |
          make unit_test
      -
        name: Publish Test Results
        uses: EnricoMi/publish-unit-test-result-action@v2
        if: always()
        with:
          junit_files: tests_xml/tests.xml
          check_name: "Unit Test Results Python ${{ matrix.python-version }}"
          comment_mode: "off"
      -
        name: Upload Coverage Reports
        if: always()
        uses: actions/upload-artifact@v3
        with:
          name: coverage-report-unittests-py${{ matrix.python-version }}
          path: coverage/
6 changes: 3 additions & 3 deletions climada/engine/impact.py
@@ -21,7 +21,7 @@

__all__ = ['ImpactFreqCurve', 'Impact']

from dataclasses import dataclass
from dataclasses import dataclass, field
import logging
import copy
import csv
@@ -1785,10 +1785,10 @@ class ImpactFreqCurve():
"""Impact exceedence frequency curve.
"""

return_per : np.array = np.array([])
return_per : np.ndarray = field(default_factory=lambda: np.empty(0))
"""return period"""

impact : np.array = np.array([])
impact : np.ndarray = field(default_factory=lambda: np.empty(0))
"""impact exceeding frequency"""

unit : str = ''
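The diff above replaces plain np.array([]) defaults with field(default_factory=...). A minimal sketch of why (illustrative code, not part of the PR; the class and field names are made up): a plain mutable default is a single object shared by every instance, and Python 3.11 additionally rejects unhashable defaults such as NumPy arrays in dataclasses, while a default_factory builds a fresh array for each instance.

from dataclasses import dataclass, field
import numpy as np

@dataclass
class CurveSketch:
    # A plain default such as "values: np.ndarray = np.array([])" would be one
    # array object shared by all instances, and Python 3.11 rejects such
    # unhashable dataclass defaults outright.
    # A default_factory builds a fresh array for every instance instead:
    values: np.ndarray = field(default_factory=lambda: np.empty(0))

a, b = CurveSketch(), CurveSketch()
assert a.values is not b.values  # each instance owns its own array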
2 changes: 1 addition & 1 deletion climada/entity/exposures/test/test_litpop.py
@@ -317,7 +317,7 @@ def test_gridpoints_core_calc_offsets_exp_rescale(self):
self.assertEqual(result_array.shape, results_check.shape)
self.assertAlmostEqual(result_array.sum(), tot)
self.assertEqual(result_array[1,2], results_check[1,2])
np.testing.assert_array_almost_equal_nulp(result_array, results_check)
np.testing.assert_allclose(result_array, results_check)

def test_grp_read_pass(self):
"""test _grp_read() to pass and return either dict with admin1 values or None"""
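For context on the swap above (a hedged sketch, not part of the PR): assert_array_almost_equal_nulp compares values in units of floating point spacing and, with its default of one ULP, rejects even tiny rounding differences, whereas assert_allclose uses a relative tolerance (rtol=1e-7 by default) and therefore tolerates platform-dependent rounding noise.

import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = a * (1.0 + 1e-12)  # tiny relative deviation, e.g. from a different summation order

# Passes: the deviation is far below the default relative tolerance (rtol=1e-7).
np.testing.assert_allclose(a, b)

# Fails: a 1e-12 relative error is thousands of ULPs, well above the default nulp=1.
try:
    np.testing.assert_array_almost_equal_nulp(a, b)
except AssertionError:
    print("nulp comparison rejects what allclose accepts")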
52 changes: 0 additions & 52 deletions climada/entity/exposures/test/test_nightlight.py
@@ -56,22 +56,6 @@ def test_required_files(self):
self.assertRaises(ValueError, nightlight.get_required_nl_files,
(-90, 90))

def test_check_files_exist(self):
"""Test check_nightlight_local_file_exists"""
# If invalid directory is supplied it has to fail
try:
nightlight.check_nl_local_file_exists(
np.ones(np.count_nonzero(BM_FILENAMES)), 'Invalid/path')[0]
raise Exception("if the path is not valid, check_nl_local_file_exists should fail")
except ValueError:
pass
files_exist = nightlight.check_nl_local_file_exists(
np.ones(np.count_nonzero(BM_FILENAMES)), SYSTEM_DIR)
self.assertTrue(
files_exist.sum() > 0,
f'{files_exist} {BM_FILENAMES}'
)

def test_download_nightlight_files(self):
"""Test check_nightlight_local_file_exists"""
# Not the same length of arguments
@@ -118,42 +102,6 @@ def test_get_required_nl_files(self):
bool = np.array_equal(np.array([0, 0, 0, 0, 0, 0, 1, 0]), req_files)
self.assertTrue(bool)

def test_check_nl_local_file_exists(self):
""" Test that an array with the correct number of already existing files
is produced, the LOGGER messages logged and the ValueError raised. """

# check logger messages by giving a to short req_file
with self.assertLogs('climada.entity.exposures.litpop.nightlight', level='WARNING') as cm:
nightlight.check_nl_local_file_exists(required_files = np.array([0, 0, 1, 1]))
self.assertIn('The parameter \'required_files\' was too short and is ignored',
cm.output[0])

# check logger message: not all files are available
with self.assertLogs('climada.entity.exposures.litpop.nightlight', level='DEBUG') as cm:
nightlight.check_nl_local_file_exists()
self.assertIn('Not all satellite files available. Found ', cm.output[0])
self.assertIn(f' out of 8 required files in {Path(SYSTEM_DIR)}', cm.output[0])

# check logger message: no files found in checkpath
check_path = Path('climada/entity/exposures')
with self.assertLogs('climada.entity.exposures.litpop.nightlight', level='INFO') as cm:
# using a random path where no files are stored
nightlight.check_nl_local_file_exists(check_path=check_path)
self.assertIn(f'No satellite files found locally in {check_path}',
cm.output[0])

# test raises with wrong path
check_path = Path('/random/wrong/path')
with self.assertRaises(ValueError) as cm:
nightlight.check_nl_local_file_exists(check_path=check_path)
self.assertEqual(f'The given path does not exist: {check_path}',
str(cm.exception))

# test that files_exist is correct
files_exist = nightlight.check_nl_local_file_exists()
self.assertGreaterEqual(int(sum(files_exist)), 3)
self.assertLessEqual(int(sum(files_exist)), 8)

# Execute Tests
if __name__ == "__main__":
TESTS = unittest.TestLoader().loadTestsFromTestCase(TestNightLight)
39 changes: 24 additions & 15 deletions climada/entity/impact_funcs/test/test_tc.py
@@ -39,21 +39,30 @@ def test_default_values_pass(self):
self.assertTrue(np.array_equal(imp_fun.intensity, np.arange(0, 121, 5)))
self.assertTrue(np.array_equal(imp_fun.paa, np.ones((25,))))
self.assertTrue(np.array_equal(imp_fun.mdd[0:6], np.zeros((6,))))
self.assertTrue(np.array_equal(imp_fun.mdd[6:10],
np.array([0.0006753419543492556, 0.006790495604105169,
0.02425254393374475, 0.05758706257339458])))
self.assertTrue(np.array_equal(imp_fun.mdd[10:15],
np.array([0.10870556455111065, 0.1761433569521351,
0.2553983618763961, 0.34033822528795565,
0.4249447743109498])))
self.assertTrue(np.array_equal(imp_fun.mdd[15:20],
np.array([0.5045777092933046, 0.576424302849412,
0.6393091739184916, 0.6932203123193963,
0.7388256596555696])))
self.assertTrue(np.array_equal(imp_fun.mdd[20:25],
np.array([0.777104531116526, 0.8091124649261859,
0.8358522190681132, 0.8582150905529946,
0.8769633232141456])))
np.testing.assert_allclose(
imp_fun.mdd[6:25],
[
0.0006753419543492556,
0.006790495604105169,
0.02425254393374475,
0.05758706257339458,
0.10870556455111065,
0.1761433569521351,
0.2553983618763961,
0.34033822528795565,
0.4249447743109498,
0.5045777092933046,
0.576424302849412,
0.6393091739184916,
0.6932203123193963,
0.7388256596555696,
0.777104531116526,
0.8091124649261859,
0.8358522190681132,
0.8582150905529946,
0.8769633232141456,
],
)

def test_values_pass(self):
"""Compute mdr interpolating values."""
Expand Down
52 changes: 52 additions & 0 deletions climada/test/test_nightlight.py
@@ -254,6 +254,58 @@ def test_untar_noaa_stable_nighlight(self):
self.assertIn('found more than one potential intensity file in', cm.output[0])
path_tar.unlink()

def test_check_nl_local_file_exists(self):
""" Test that an array with the correct number of already existing files
is produced, the LOGGER messages logged and the ValueError raised. """

# check logger messages by giving a too short 'required_files'
with self.assertLogs('climada.entity.exposures.litpop.nightlight', level='WARNING') as cm:
nightlight.check_nl_local_file_exists(required_files = np.array([0, 0, 1, 1]))
self.assertIn('The parameter \'required_files\' was too short and is ignored',
cm.output[0])

# check logger message: not all files are available
with self.assertLogs('climada.entity.exposures.litpop.nightlight', level='DEBUG') as cm:
nightlight.check_nl_local_file_exists()
self.assertIn('Not all satellite files available. Found ', cm.output[0])
self.assertIn(f' out of 8 required files in {Path(SYSTEM_DIR)}', cm.output[0])

# check logger message: no files found in checkpath
check_path = Path('climada/entity/exposures')
with self.assertLogs('climada.entity.exposures.litpop.nightlight', level='INFO') as cm:
# using a random path where no files are stored
nightlight.check_nl_local_file_exists(check_path=check_path)
self.assertIn(f'No satellite files found locally in {check_path}',
cm.output[0])

# test raises with wrong path
check_path = Path('/random/wrong/path')
with self.assertRaises(ValueError) as cm:
nightlight.check_nl_local_file_exists(check_path=check_path)
self.assertEqual(f'The given path does not exist: {check_path}',
str(cm.exception))

# test that files_exist is correct
files_exist = nightlight.check_nl_local_file_exists()
self.assertGreaterEqual(int(sum(files_exist)), 3)
self.assertLessEqual(int(sum(files_exist)), 8)

def test_check_files_exist(self):
"""Test check_nightlight_local_file_exists"""
# If invalid directory is supplied it has to fail
try:
nightlight.check_nl_local_file_exists(
np.ones(np.count_nonzero(BM_FILENAMES)), 'Invalid/path')[0]
raise Exception("if the path is not valid, check_nl_local_file_exists should fail")
except ValueError:
pass
files_exist = nightlight.check_nl_local_file_exists(
np.ones(np.count_nonzero(BM_FILENAMES)), SYSTEM_DIR)
self.assertTrue(
files_exist.sum() > 0,
f'{files_exist} {BM_FILENAMES}'
)

# Execute Tests
if __name__ == "__main__":
TESTS = unittest.TestLoader().loadTestsFromTestCase(TestNightlight)
7 changes: 6 additions & 1 deletion doc/guide/Guide_Continuous_Integration_and_Testing.ipynb
@@ -299,7 +299,12 @@
"\n",
"- All tests must pass before submitting a pull request.\n",
"- Integration tests don't run on feature branches in Jenkins, therefore developers are requested to run them locally.\n",
"- After a pull request was accepted and the changes are merged to the develop branch, integration tests may still fail there and have to be addressed."
"- After a pull request was accepted and the changes are merged to the develop branch, integration tests may still fail there and have to be addressed.\n",
"\n",
"#### GitHub Actions\n",
"\n",
"We adopted test automation via GitHub Actions in an experimental state.\n",
"See [GitHub Actions CI](github-actions.rst) for details."
]
},
{
29 changes: 29 additions & 0 deletions doc/guide/github-actions.rst
@@ -0,0 +1,29 @@
=================
GitHub Actions CI
=================

CLIMADA has been using a private Jenkins instance for automated testing (Continuous Integration, CI), see :doc:`Guide_Continuous_Integration_and_Testing`.
We recently adopted `GitHub Actions <https://docs.github.com/en/actions>`_ for automated unit testing.
GitHub Actions is a service provided by GitHub, which lets you configure CI/CD pipelines based on YAML configuration files.
GitHub provides servers with ample computational resources to create software environments, install software, test it, and deploy it.
See the `GitHub Actions Overview <https://docs.github.com/en/actions/learn-github-actions/understanding-github-actions>`_ for a technical introduction, and the `Workflow Syntax <https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions>`_ for a reference of the pipeline definitions.

The CI results for each pull request can be inspected in the "Checks" tab.
For GitHub Actions, users can inspect the logs of every step for every job.

.. note::

   As of CLIMADA v4.0, the default CI technology remains Jenkins.
   GitHub Actions CI is currently considered experimental for CLIMADA development.

---------------------
Unit Testing Pipeline
---------------------

This pipeline is defined by the ``.github/workflows/ci.yml`` file.
It contains a single job which creates a CLIMADA environment with Mamba, installs CLIMADA, runs the unit tests, and reports the test coverage as well as the simplified test results.
The job has a `strategy <https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstrategy>`_ which runs it multiple times for different Python versions.
This way, we make sure that CLIMADA is compatible with all currently supported versions of Python.

The coverage reports in HTML format will be uploaded as job artifacts and can be downloaded as ZIP files.
The test results are simple testing summaries that will appear as individual checks/jobs after the respective job has completed.
1 change: 1 addition & 0 deletions doc/index.rst
@@ -103,6 +103,7 @@ Jump right in:
Performance and Best Practices <guide/Guide_Py_Performance>
Coding Conventions <guide/Guide_Miscellaneous>
Building the Documentation <README>
guide/github-actions


.. toctree::
2 changes: 1 addition & 1 deletion requirements/env_climada.yml
@@ -26,7 +26,7 @@ dependencies:
- pycountry>=22.3
- pyepsg>=0.4
- pytables>=3.7
- python=3.9
- python>=3.9,<3.12
- pyxlsb>=1.0
- rasterio>=1.3
- requests>=2.31
2 changes: 2 additions & 0 deletions setup.py
@@ -57,6 +57,8 @@

keywords='climate adaptation',

python_requires=">=3.9,<3.12",

install_requires=[
'bottleneck',
'cartopy',