Move configuration from setup.py to setup.cfg #296

Merged (11 commits, Mar 18, 2022)
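For context, the migration follows the standard setuptools pattern: static
metadata moves out of setup.py into a declarative setup.cfg, leaving only a
minimal shim behind. The sketch below shows the general shape of such a move;
the field values are illustrative placeholders, not Harmonica's actual
configuration:

    # setup.py after the move: reduced to a minimal shim (hypothetical sketch)
    from setuptools import setup

    setup()

    # setup.cfg: declarative metadata and options (values are illustrative)
    [metadata]
    name = harmonica
    license = BSD-3-Clause

    [options]
    packages = find:
    python_requires = >=3.6
    install_requires =
        numpy
        scipy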
2 changes: 1 addition & 1 deletion .coveragerc
Original file line number Diff line number Diff line change
Expand Up @@ -2,5 +2,5 @@
branch = True
omit =
*/tests/*
*/_version.py
*/_version_generated.py
**/__init__.py
.flake8 (35 deletions)

This file was deleted.

.github/workflows/docs.yml (34 additions, 43 deletions)

@@ -4,7 +4,7 @@
# token stolen if the Action is compromised. See the comments and links here:
# https://github.com/pypa/gh-action-pypi-publish/issues/27
#
name: docs
name: documentation

# Only build PRs, the main branch, and releases. Pushes to branches will only
# be built when a PR is opened. This avoids duplicated builds in PRs coming
@@ -32,7 +32,7 @@ jobs:
build:
runs-on: ubuntu-latest
env:
REQUIREMENTS: requirements.txt env/requirements-docs.txt
REQUIREMENTS: env/requirements-build.txt env/requirements-docs.txt
PYTHON: 3.9

steps:
@@ -61,45 +61,44 @@ jobs:
- name: Fetch git tags
run: git fetch origin 'refs/tags/*:refs/tags/*'

- name: Setup caching for conda packages
uses: actions/cache@v2
with:
path: ~/conda_pkgs_dir
key: conda-${{ runner.os }}-${{ env.PYTHON }}-${{ hashFiles('requirements*.txt') }}

- name: Setup Miniconda
uses: conda-incubator/setup-miniconda@v2.0.1
uses: conda-incubator/setup-miniconda@v2
with:
python-version: ${{ env.PYTHON }}
miniconda-version: "latest"
auto-update-conda: true
channels: conda-forge
show-channel-urls: true
activate-environment: testing
miniforge-variant: Mambaforge
use-mamba: true
channels: conda-forge,defaults
# Needed for caching
use-only-tar-bz2: true

- name: Install requirements
- name: Collect requirements - run-time
run: python tools/export_requirements.py > requirements-full.txt

- name: Collect requirements - other
run: |
requirements_file=requirements-full.txt
if [ ! -z "$REQUIREMENTS" ]; then
echo "Capturing dependencies from $REQUIREMENTS"
for requirement in $REQUIREMENTS
do
cat $requirement >> $requirements_file
done
fi
if [ -f $requirements_file ]; then
echo "Collected dependencies:"
cat $requirements_file
echo ""
conda install --quiet --file $requirements_file python=$PYTHON
else
echo "No requirements defined."
fi
echo "Capturing dependencies from:"
for requirement in $REQUIREMENTS
do
echo " $requirement"
cat $requirement >> requirements-full.txt
done

- name: List requirements
run: |
echo "Collected dependencies:"
cat requirements-full.txt

- name: Setup caching for conda packages
uses: actions/cache@v2
with:
path: ~/conda_pkgs_dir
key: conda-${{ runner.os }}-${{ env.PYTHON }}-${{ hashFiles('requirements-full.txt') }}

- name: Install requirements
run: mamba install --quiet --file requirements-full.txt python=$PYTHON

- name: List installed packages
run: conda list
run: mamba list

- name: Build source and wheel distributions
run: |
@@ -109,23 +108,15 @@
ls -lh dist/

- name: Install the package
run: pip install --no-deps dist/*.whl

- name: Copy test data to cache
run: |
echo "Copy data to " $HARMONICA_DATA_DIR/main
set -x -e
mkdir -p $HARMONICA_DATA_DIR/main
cp -r data/* $HARMONICA_DATA_DIR/main
env:
# Define directory where sample data will be copied
HARMONICA_DATA_DIR: ${{ runner.temp }}/cache/harmonica
run: python -m pip install --no-deps dist/*.whl

- name: Build the documentation
run: make -C doc clean all

# Store the docs as a build artifact so we can deploy it later
- name: Upload HTML documentation as an artifact
# Only if not a pull request
if: success() && github.event_name != 'pull_request'
uses: actions/upload-artifact@v2
with:
name: docs-${{ github.sha }}
…
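Both workflows now delegate the run-time dependency list to
tools/export_requirements.py, a script that is not shown in this diff. A
minimal sketch of what such a script could look like, assuming the
dependencies are declared under install_requires in setup.cfg:

    # tools/export_requirements.py (hypothetical sketch, not the actual script)
    # Print the run-time requirements from setup.cfg, one per line, so the CI
    # jobs can concatenate them with the files listed in $REQUIREMENTS.
    import configparser

    config = configparser.ConfigParser()
    config.read("setup.cfg")

    # install_requires is stored as a newline-separated string in the INI file
    for package in config["options"]["install_requires"].strip().splitlines():
        print(package.strip())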
.github/workflows/pypi.yml (39 additions, 7 deletions)

@@ -6,8 +6,10 @@
#
name: pypi

# Only run for pushes to the main branch and releases.
# Runs on these events but only publishes on pushes to main (to Test PyPI)
# and releases
on:
pull_request:
push:
branches:
- main
@@ -21,12 +23,11 @@ defaults:
shell: bash

jobs:

#############################################################################
# Publish built wheels and source archives to PyPI and test PyPI
publish:
# Build and check source and wheel distributions
build:
runs-on: ubuntu-latest
# Only publish from the origin repository, not forks
if: github.repository_owner == 'fatiando'

steps:
# Checks-out your repository under $GITHUB_WORKSPACE
@@ -63,7 +64,7 @@ jobs:
# Change setuptools-scm local_scheme to "no-local-version" so the
# local part of the version isn't included, making the version string
# compatible with Test PyPI.
sed --in-place "s/node-and-date/no-local-version/g" setup.py
sed --in-place "s/node-and-date/no-local-version/g" pyproject.toml

- name: Build source and wheel distributions
run: |
@@ -75,8 +76,39 @@
- name: Check the archives
run: twine check dist/*

# Store the archives as a build artifact so we can deploy them later
- name: Upload archives as artifacts
# Only if not a pull request
if: success() && github.event_name != 'pull_request'
uses: actions/upload-artifact@v2
with:
name: pypi-${{ github.sha }}
path: dist

#############################################################################
# Publish built wheels and source archives to PyPI and test PyPI
publish:
runs-on: ubuntu-latest
# Only publish from the origin repository, not forks
if: github.repository_owner == 'fatiando' && github.event_name != 'pull_request'

steps:
- name: Checkout
uses: actions/checkout@v2
with:
# The GitHub token is preserved by default but this job doesn't need
# to be able to push to GitHub.
persist-credentials: false

- name: Download built source and wheel packages
uses: actions/download-artifact@v2
with:
name: pypi-${{ github.sha }}
path: dist

- name: Publish to Test PyPI
if: success()
# Only publish to TestPyPI when a PR is merged (pushed to main)
if: success() && github.event_name == 'push'
uses: pypa/gh-action-pypi-publish@bce3b74dbf8cc32833ffba9d15f83425c1a736e0
with:
user: __token__
…
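The sed call above now targets pyproject.toml because the setuptools-scm
configuration moved there from setup.py. The relevant table would look roughly
like this (a sketch; the write_to path is inferred from the .coveragerc and
.gitignore changes elsewhere in this diff):

    # pyproject.toml (sketch)
    [tool.setuptools_scm]
    # "node-and-date" appends a local version segment to development versions;
    # CI rewrites it to "no-local-version" since Test PyPI rejects local versions.
    local_scheme = "node-and-date"
    write_to = "harmonica/_version_generated.py"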
.github/workflows/test.yml (39 additions, 37 deletions)

@@ -21,32 +21,34 @@ on:
# Use bash by default in all jobs
defaults:
run:
# The -l {0} is necessary for conda environments to be activated
# But this breaks on MacOS if using actions/setup-python:
# https://github.com/actions/setup-python/issues/132
shell: bash

jobs:
#############################################################################
# Run tests and upload to codecov
test:
name: ${{ matrix.os }} py${{ matrix.python }}
name: ${{ matrix.os }} python=${{ matrix.python }} dependencies=${{ matrix.dependencies }}
runs-on: ${{ matrix.os }}-latest
strategy:
# Otherwise, the workflow would stop if a single job fails. We want to
# run all of them to catch failures in different combinations.
fail-fast: false
matrix:
os: [ubuntu, macos, windows]
python: ["3.6", "3.9"]
# If "optional", will install non-required dependencies in the build
# environment. Otherwise, only required dependencies are installed.
dependencies: [""]
os:
- ubuntu
- macos
- windows
python:
- "3.6"
- "3.9"
dependencies:
- latest
env:
REQUIREMENTS: requirements.txt env/requirements-tests.txt
REQUIREMENTS: env/requirements-tests.txt
# Used to tag codecov submissions
OS: ${{ matrix.os }}
PYTHON: ${{ matrix.python }}
DEPENDENCIES: ${{ matrix.dependencies }}

steps:
# Cancel any previous run of the test job
@@ -79,6 +81,23 @@
with:
python-version: ${{ matrix.python }}

- name: Collect requirements - run-time
run: python tools/export_requirements.py > requirements-full.txt

- name: Collect requirements - other
run: |
echo "Capturing dependencies from:"
for requirement in $REQUIREMENTS
do
echo " $requirement"
cat $requirement >> requirements-full.txt
done

- name: List requirements
run: |
echo "Collected dependencies:"
cat requirements-full.txt

- name: Get the pip cache folder
id: pip-cache
run: |
@@ -88,34 +107,14 @@
uses: actions/cache@v2
with:
path: ${{ steps.pip-cache.outputs.dir }}
key: ${{ runner.os }}-pip-${{ hashFiles('**/requirements*.txt') }}
restore-keys: |
${{ runner.os }}-pip-${{ hashFiles('**/requirements*.txt') }}
key: ${{ runner.os }}-pip-${{ hashFiles('requirements-full.txt') }}

- name: Install requirements
run: |
requirements_file=requirements-full.txt
if [ ! -z "$REQUIREMENTS" ]; then
echo "Capturing dependencies from $REQUIREMENTS"
for requirement in $REQUIREMENTS
do
cat $requirement >> $requirements_file
done
fi
if [ -f $requirements_file ]; then
echo "Collected dependencies:"
cat $requirements_file
echo ""
# Install wheel before anything else so pip can use wheels for
# other packages.
python -m pip install -r env/requirements-build.txt
python -m pip install -r $requirements_file
else
echo "No requirements defined."
fi

- name: List installed packages
run: python -m pip freeze
# Install the build requirements before anything else so pip can use
# wheels for other packages.
python -m pip install --requirement env/requirements-build.txt
python -m pip install --requirement requirements-full.txt

- name: Build source and wheel distributions
run: |
@@ -125,7 +124,10 @@
ls -lh dist/

- name: Install the package
run: pip install --no-deps dist/*.whl
run: python -m pip install --no-deps dist/*.whl

- name: List installed packages
run: python -m pip freeze

- name: Copy test data to cache
run: |
@@ -159,7 +161,7 @@
with:
token: ${{ secrets.CODECOV_TOKEN }}
file: ./coverage.xml
env_vars: OS,PYTHON
env_vars: OS,PYTHON,DEPENDENCIES
# Don't mark the job as failed if the upload fails for some reason.
# It does sometimes but shouldn't be the reason for running
# everything again unless something else is broken.
…
2 changes: 1 addition & 1 deletion .gitignore
Original file line number Diff line number Diff line change
Expand Up @@ -18,4 +18,4 @@ doc/sample_data
MANIFEST
dask-worker-space
.coverage.*
harmonica/_version.py
harmonica/_version_generated.py
.isort.cfg (3 deletions)

This file was deleted.