Squashed commit of the following:
commit 54c3cbe52bd33a94ccbb5bb44f2958bb3afc9330
Author: LL-Geo <54405391+LL-Geo@users.noreply.github.com>
Date:   Thu Jun 23 21:36:23 2022 +0800

    Update filter

    Add more filters

commit 4a5d6f1
Author: Santiago Soler <santiago.r.soler@gmail.com>
Date:   Thu Jun 16 18:00:32 2022 -0300

    Avoid checking floats in tesseroid doctests (fatiando#326)

    Remove expected results for tesseroid calculations in docstring examples.
    Printing floats in forward modelling examples isn't very meaningful and often
    creates failures when running doctests: small differences between the expected
    and the obtained values can occur under some dependency and OS combinations.
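
    For context, a sketch of the fragile pattern that was removed (the call and
    the printed number are illustrative, not taken from the actual docstrings):

        >>> g_z = tesseroid_gravity(coordinates, tesseroid, density, field="g_z")
        >>> print(g_z)
        112.54539933

    The trailing digits of the printed float can differ across platforms and
    dependency versions, so the exact-match comparison fails even when the
    result is correct.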

commit cc697af
Author: Matt Tankersley <81199856+mdtanker@users.noreply.github.com>
Date:   Fri Jun 17 08:24:32 2022 +1200

    Add progressbar to prism forward gravity calculations (fatiando#315)

    Add an optional `progressbar` flag to the `prism_gravity` function and to the
    `gravity` method of the prism layer accessor to print a progress bar using
    `numba_progress`. Add `numba_progress` as an optional dependency. Add test
    functions for the new feature.
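
    A minimal sketch of the new flag in use (assuming `numba_progress` is
    installed; the prism, density and observation values are illustrative):

        import harmonica as hm

        # One prism: (west, east, south, north, bottom, top), in meters
        prisms = [[-5e3, 5e3, -5e3, 5e3, -10e3, -5e3]]
        density = [2670]  # kg/m3
        coordinates = ([0], [0], [1e3])  # (easting, northing, upward)

        # progressbar=True prints a progress bar while looping over prisms
        g_z = hm.prism_gravity(
            coordinates, prisms, density, field="g_z", progressbar=True
        )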

commit 5a1c895
Author: Santiago Soler <santiago.r.soler@gmail.com>
Date:   Tue Jun 14 13:20:39 2022 -0300

    Specify spherical latitude in point sources guide (fatiando#325)

    Replace "latitude" with "spherical latitude" in another place in
    `point.rst`. Fix a typo in "Alternatively".

commit cb476b2
Author: Federico Esteban <federico.esteban@gmail.com>
Date:   Tue Jun 14 11:33:22 2022 -0300

    Note that spherical and geodetic latitudes are equal in spherical ellipsoids (fatiando#324)

    Add a sentence to the Coordinate Systems section of the User Guide noting
    that if the reference ellipsoid were a sphere, the spherical latitude and
    the geodetic latitude would be equivalent.
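
    For reference, a point on the ellipsoid surface satisfies
    tan(latitude_spherical) = (1 - e^2) * tan(latitude_geodetic), where e is
    the first eccentricity of the ellipsoid. For a sphere e = 0, so the two
    latitudes coincide.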

commit 1256ff6
Author: Federico Esteban <federico.esteban@gmail.com>
Date:   Mon Jun 13 11:35:02 2022 -0300

    Add Federico Esteban to AUTHORS.md (fatiando#323)

    Add his name, a link to his GitHub account, his affiliation, and his ORCID.

commit 32de6e0
Author: Federico Esteban <federico.esteban@gmail.com>
Date:   Thu Jun 9 15:44:48 2022 -0300

    Specify "spherical latitude" when describing coordinates of point masses (fatiando#321)

    Add "spherical" when describing the spherical latitude coordinate of point
    masses in the user guide. This way we differentiate it from the "latitude"
    geodetic coordinate.

commit 9667fab
Author: Santiago Soler <santiago.r.soler@gmail.com>
Date:   Mon Jun 6 11:05:17 2022 -0300

    Fix small format errors in the user guide (fatiando#319)

    Fix link to EquivalentSources.predict method and fix superscripts in the docs.

commit 2f7fcb6
Author: Santiago Soler <santiago.r.soler@gmail.com>
Date:   Fri Jun 3 11:17:51 2022 -0300

    Update docs and create a proper user guide (fatiando#305)

    Update the Sphinx docs using sphinx-panels.
    Add a proper User Guide that will ultimately replace the gallery examples.
    Each page of the new User Guide is a .rst file that uses jupyter-sphinx to run example code blocks.
    Add pages for: Coordinate Systems, Forward Modelling, Gravity Corrections and Equivalent Sources.
    Add a new doc/versions.rst file with links to the documentation of previous versions.

commit cf4080c
Author: Santiago Soler <santiago.r.soler@gmail.com>
Date:   Tue May 31 15:58:25 2022 -0300

    Compute upward derivative of a grid in the frequency domain (fatiando#238)

    Define a new derivative_upward function for computing the spatial upward
    derivative of a 2D grid in the frequency domain. The function uses xrft to
    handle Fourier transforms of xarray objects. Add a new filters subpackage
    that includes FFT filters: functions that take grids in the frequency
    domain and return the desired filter, also in the frequency domain. Add fft
    and ifft wrappers around xrft.fft and xrft.ifft. Add a new apply_filter
    function that takes a grid in the spatial domain, applies the fft, the
    filter, and the ifft, and returns the filtered grid in the spatial domain.
    Add tests for the new features and a gallery example for the upward
    derivative. Add netcdf4 as a requirement for testing.
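
    A minimal sketch of the new function on a synthetic grid (the grid values
    are illustrative; derivative_upward expects an evenly spaced 2D
    xarray.DataArray):

        import numpy as np
        import xarray as xr
        import harmonica as hm

        easting = np.linspace(-50e3, 50e3, 101)
        northing = np.linspace(-50e3, 50e3, 101)
        e, n = np.meshgrid(easting, northing)
        gravity = xr.DataArray(
            np.exp(-(e**2 + n**2) / (2 * (20e3) ** 2)),
            coords={"northing": northing, "easting": easting},
            dims=("northing", "easting"),
        )

        # First-order upward derivative, computed in the frequency domain
        deriv = hm.derivative_upward(gravity, order=1)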

commit 6a30797
Author: Santiago Soler <santiago.r.soler@gmail.com>
Date:   Fri May 27 16:22:18 2022 -0300

    Ditch soon-to-be deprecated args of equivalent sources grid method (fatiando#311)

    The grid() method of Verde gridders now takes a coordinates argument with
    the coordinates of the target grid. The previous region, shape and
    spacing arguments will be deprecated in Verde v2.0.0. This change makes
    things easier for our equivalent sources classes: we don't need the extra
    upward argument, since users can create the coordinates of the target grid
    with verde.grid_coordinates and pass them via the coordinates argument.
    Ditch the upward, shape, spacing and region arguments from the equivalent
    sources gridders. Replace them with the new coordinates argument: users
    need to provide the coordinates of the target grid instead of building them
    through the grid method. Raise errors if any of those old arguments are
    passed. Raise warnings if any kwargs are passed: they are ignored and not
    forwarded to the BaseGridder.grid() method.
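
    A sketch of the new calling pattern (assuming `eqs` is an already fitted
    equivalent sources gridder; the region, spacing and height are
    illustrative):

        import verde as vd

        # Build the target grid coordinates, including a constant upward height
        coordinates = vd.grid_coordinates(
            region=(0, 10e3, -5e3, 5e3), spacing=500, extra_coords=1000
        )
        # Before: eqs.grid(upward=1000, region=..., spacing=...)
        # Now: pass the ready-made coordinates instead
        grid = eqs.grid(coordinates=coordinates)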

commit 51ceb7e
Author: Agustina <pesce.agustina@gmail.com>
Date:   Mon May 23 11:03:54 2022 -0300

    Remove deprecated point_mass_gravity function (fatiando#310)

    Remove the point_mass_gravity function from Harmonica because it was
    deprecated in PR fatiando#280. Remove the related test functions.
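
    For migration, a sketch of the replacement call introduced by
    fatiando#280 (the coordinates, points and masses arrays are placeholders):

        import harmonica as hm

        # point_mass_gravity(...) is gone; point_gravity takes the same data
        g_z = hm.point_gravity(
            coordinates, points, masses, field="g_z",
            coordinate_system="cartesian",
        )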

commit f336aa8
Author: Santiago Soler <santiago.r.soler@gmail.com>
Date:   Thu May 5 14:52:08 2022 -0300

    Drop support for Python 3.6 (fatiando#309)

    Remove the compatibility metadata, remove 3.6 from the CI matrix, and bump
    python_requires to 3.7+.

commit d132abb
Author: Santiago Soler <santiago.r.soler@gmail.com>
Date:   Tue May 3 12:47:22 2022 -0300

    Add computation of gravitational tensor components for point sources (fatiando#288)

    Add new kernel functions to compute gravity tensor components generated by
    point sources. Add test functions for the new feature: check that the diagonal
    elements satisfy the Laplace equation, compare all components against finite
    difference computations from the gravity acceleration. Add test class for
    checking the symmetry of tensor components. Refactor old test functions for
    point gravity: merge some functions into single ones through pytest
    parametrizations. Avoid using "gradient" to refer to the gravity
    acceleration vector: the term "gravity gradient" usually refers to the
    tensor.
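
    A sketch of computing the diagonal tensor components and checking the
    Laplace equation (the field names follow the pattern of the new kernels
    described above and should be treated as assumptions; the source and
    observation values are illustrative):

        import numpy as np
        import harmonica as hm

        points = ([0.0], [0.0], [-1e3])  # (easting, northing, upward) source
        masses = [1e9]
        coordinates = ([200.0], [300.0], [500.0])  # observation point

        g_ee = hm.point_gravity(coordinates, points, masses, field="g_ee")
        g_nn = hm.point_gravity(coordinates, points, masses, field="g_nn")
        g_zz = hm.point_gravity(coordinates, points, masses, field="g_zz")

        # Outside the sources the trace of the tensor should vanish
        np.testing.assert_allclose(g_ee + g_nn + g_zz, 0, atol=1e-12)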

commit eb71d54
Author: Santiago Soler <santiago.r.soler@gmail.com>
Date:   Fri Apr 22 17:23:00 2022 -0300

    Add deprecations to datasets and synthetic modules (fatiando#304)

    Add FutureWarnings to the public functions of the synthetic and datasets
    modules. Add tests for the new warnings. Both modules will be deprecated in
    Harmonica v0.6.0. Instead of providing sample datasets, Harmonica will depend
    on Ensaio for that. The synthetic surveys depend on some of the sample
    datasets, but those functions are intended to be used in methodology articles,
    so they should live somewhere else.
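
    A minimal sketch of the warning pattern (the function name and message are
    illustrative, not the exact code added):

        import warnings

        def fetch_gravity_earth():
            warnings.warn(
                "The harmonica.datasets module will be deprecated in "
                "Harmonica v0.6.0. Use Ensaio to download sample datasets "
                "instead.",
                FutureWarning,
            )
            ...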

commit a4598ef
Author: Santiago Soler <santiago.r.soler@gmail.com>
Date:   Fri Apr 22 17:06:43 2022 -0300

    Add conversion of prisms or a prism layer to PyVista objects (fatiando#291)

    Add a new visualization module that hosts prism_to_pyvista: a function to
    convert a set of prisms into a pyvista.UnstructuredGrid. Include the new
    module and this function in the API Reference. Add a new to_pyvista() method
    to the PrismLayer accessor that converts a prism layer into a pyvista grid,
    making it easier to plot in 3D. The UnstructuredGrid stores each prism as
    a hexahedron, along with its physical properties as cell data. Add tests
    for the new features. Add pyvista and vtk as optional
    dependencies to environment.yml and setup.cfg. Add a new example for
    plotting a PrismLayer. Configure Sphinx to show pyvista plots in the gallery
    and to use the pyvista-plot directive in docstrings.
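
    A sketch of the new method (assuming pyvista and vtk are installed; the
    layer geometry and density are illustrative):

        import numpy as np
        import verde as vd
        import harmonica as hm

        easting, northing = vd.grid_coordinates(
            region=(0, 10e3, 0, 10e3), spacing=500
        )
        surface = 100 * np.exp(
            -((easting - 5e3) ** 2 + (northing - 5e3) ** 2) / 1e7
        )
        layer = hm.prism_layer(
            (easting[0, :], northing[:, 0]),
            surface=surface,
            reference=0,
            properties={"density": 2670 * np.ones_like(surface)},
        )

        # Convert the layer to a pyvista.UnstructuredGrid and plot it in 3D
        pv_grid = layer.prism_layer.to_pyvista()
        pv_grid.plot()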

commit 762d210
Author: Santiago Soler <santiago.r.soler@gmail.com>
Date:   Mon Apr 4 14:09:45 2022 -0300

    Update Black to its stable version (fatiando#301)

    Black has released a stable version: 22.3.0. The style check tests now use
    this version. This fixes a bug on CI in which Black tried to import a
    private module of click that no longer exists. Rerun Black: it now hugs
    simple power operators (no spaces around **).

commit 10577fa
Author: Santiago Soler <santiago.r.soler@gmail.com>
Date:   Mon Apr 4 14:00:42 2022 -0300

    Update Sphinx version to 4.5.0 (fatiando#302)

    Also update sphinx-gallery and sphinx-book-theme.
    This fixes an issue between the latest jinja2 and Sphinx 3.5.*.

commit f880065
Author: Leonardo Uieda <leouieda@gmail.com>
Date:   Fri Mar 18 13:34:51 2022 +0000

    Move configuration from setup.py to setup.cfg (fatiando#296)

    Move away from setup.py following the recommendations of the Python
    packaging guides. Move the requirement listing to setup.cfg as well, and
    use a script to extract it for installing with conda on CI.
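
    A minimal sketch of the kind of metadata that moves into setup.cfg (the
    entries are illustrative, not the project's exact file):

        [metadata]
        name = harmonica
        license = BSD 3-Clause License

        [options]
        packages = find:
        python_requires = >=3.7
        install_requires =
            numpy
            scipy
            xarray
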
LL-Geo committed Jun 23, 2022
1 parent 5a837d2 commit 42239ad
Showing 85 changed files with 10,522 additions and 678 deletions.
2 changes: 1 addition & 1 deletion .coveragerc
@@ -2,5 +2,5 @@
branch = True
omit =
*/tests/*
*/_version.py
*/_version_generated.py
**/__init__.py
35 changes: 0 additions & 35 deletions .flake8

This file was deleted.

89 changes: 45 additions & 44 deletions .github/workflows/docs.yml
@@ -4,7 +4,7 @@
# token stolen if the Action is compromised. See the comments and links here:
# https://github.com/pypa/gh-action-pypi-publish/issues/27
#
name: docs
name: documentation

# Only build PRs, the main branch, and releases. Pushes to branches will only
# be built when a PR is opened. This avoids duplicated buids in PRs comming
@@ -32,7 +32,7 @@ jobs:
build:
runs-on: ubuntu-latest
env:
REQUIREMENTS: requirements.txt env/requirements-docs.txt
REQUIREMENTS: env/requirements-build.txt env/requirements-docs.txt
PYTHON: 3.9

steps:
@@ -61,45 +61,44 @@ jobs:
- name: Fetch git tags
run: git fetch origin 'refs/tags/*:refs/tags/*'

- name: Setup caching for conda packages
uses: actions/cache@v2
with:
path: ~/conda_pkgs_dir
key: conda-${{ runner.os }}-${{ env.PYTHON }}-${{ hashFiles('requirements*.txt') }}

- name: Setup Miniconda
uses: conda-incubator/setup-miniconda@v2.0.1
uses: conda-incubator/setup-miniconda@v2
with:
python-version: ${{ env.PYTHON }}
miniconda-version: "latest"
auto-update-conda: true
channels: conda-forge
show-channel-urls: true
activate-environment: testing
miniforge-variant: Mambaforge
use-mamba: true
channels: conda-forge,defaults
# Needed for caching
use-only-tar-bz2: true

- name: Install requirements
- name: Collect requirements - run-time
run: python tools/export_requirements.py > requirements-full.txt

- name: Collect requirements - other
run: |
requirements_file=requirements-full.txt
if [ ! -z "$REQUIREMENTS" ]; then
echo "Capturing dependencies from $REQUIREMENTS"
for requirement in $REQUIREMENTS
do
cat $requirement >> $requirements_file
done
fi
if [ -f $requirements_file ]; then
echo "Collected dependencies:"
cat $requirements_file
echo ""
conda install --quiet --file $requirements_file python=$PYTHON
else
echo "No requirements defined."
fi
echo "Capturing dependencies from:"
for requirement in $REQUIREMENTS
do
echo " $requirement"
cat $requirement >> requirements-full.txt
done
- name: List requirements
run: |
echo "Collected dependencies:"
cat requirements-full.txt
- name: Setup caching for conda packages
uses: actions/cache@v2
with:
path: ~/conda_pkgs_dir
key: conda-${{ runner.os }}-${{ env.PYTHON }}-${{ hashFiles('requirements-full.txt') }}

- name: Install requirements
run: mamba install --quiet --file requirements-full.txt python=$PYTHON

- name: List installed packages
run: conda list
run: mamba list

- name: Build source and wheel distributions
run: |
@@ -109,23 +108,25 @@ jobs:
ls -lh dist/
- name: Install the package
run: pip install --no-deps dist/*.whl

- name: Copy test data to cache
run: |
echo "Copy data to " $HARMONICA_DATA_DIR/main
set -x -e
mkdir -p $HARMONICA_DATA_DIR/main
cp -r data/* $HARMONICA_DATA_DIR/main
env:
# Define directory where sample data will be copied
HARMONICA_DATA_DIR: ${{ runner.temp }}/cache/harmonica
run: python -m pip install --no-deps dist/*.whl

- name: Build the documentation
run: make -C doc clean all
run: |
# Install xvfb and run some commands to allow pyvista to run on
# a headless system.
sudo apt-get install xvfb
export DISPLAY=:99.0
export PYVISTA_OFF_SCREEN=true
export PYVISTA_USE_IPYVTK=true
Xvfb :99 -screen 0 1024x768x24 > /dev/null 2>&1 &
sleep 3
# Build the docs
make -C doc clean all
# Store the docs as a build artifact so we can deploy it later
- name: Upload HTML documentation as an artifact
# Only if not a pull request
if: success() && github.event_name != 'pull_request'
uses: actions/upload-artifact@v2
with:
name: docs-${{ github.sha }}
46 changes: 39 additions & 7 deletions .github/workflows/pypi.yml
@@ -6,8 +6,10 @@
#
name: pypi

# Only run for pushes to the main branch and releases.
# Runs on these events but only publish on pushes to main (to test pypi) and
# releases
on:
pull_request:
push:
branches:
- main
@@ -21,12 +23,11 @@ defaults:
shell: bash

jobs:

#############################################################################
# Publish built wheels and source archives to PyPI and test PyPI
publish:
# Build and check source and wheel distributions
build:
runs-on: ubuntu-latest
# Only publish from the origin repository, not forks
if: github.repository_owner == 'fatiando'

steps:
# Checks-out your repository under $GITHUB_WORKSPACE
@@ -63,7 +64,7 @@ jobs:
# Change setuptools-scm local_scheme to "no-local-version" so the
# local part of the version isn't included, making the version string
# compatible with Test PyPI.
sed --in-place "s/node-and-date/no-local-version/g" setup.py
sed --in-place "s/node-and-date/no-local-version/g" pyproject.toml
- name: Build source and wheel distributions
run: |
@@ -75,8 +76,39 @@
- name: Check the archives
run: twine check dist/*

# Store the archives as a build artifact so we can deploy them later
- name: Upload archives as artifacts
# Only if not a pull request
if: success() && github.event_name != 'pull_request'
uses: actions/upload-artifact@v2
with:
name: pypi-${{ github.sha }}
path: dist

#############################################################################
# Publish built wheels and source archives to PyPI and test PyPI
publish:
runs-on: ubuntu-latest
# Only publish from the origin repository, not forks
if: github.repository_owner == 'fatiando' && github.event_name != 'pull_request'

steps:
- name: Checkout
uses: actions/checkout@v2
with:
# The GitHub token is preserved by default but this job doesn't need
# to be able to push to GitHub.
persist-credentials: false

- name: Download built source and wheel packages
uses: actions/download-artifact@v2
with:
name: pypi-${{ github.sha }}
path: dist

- name: Publish to Test PyPI
if: success()
# Only publish to TestPyPI when a PR is merged (pushed to main)
if: success() && github.event_name == 'push'
uses: pypa/gh-action-pypi-publish@bce3b74dbf8cc32833ffba9d15f83425c1a736e0
with:
user: __token__
76 changes: 39 additions & 37 deletions .github/workflows/test.yml
@@ -21,32 +21,34 @@ on:
# Use bash by default in all jobs
defaults:
run:
# The -l {0} is necessary for conda environments to be activated
# But this breaks on MacOS if using actions/setup-python:
# https://github.com/actions/setup-python/issues/132
shell: bash

jobs:
#############################################################################
# Run tests and upload to codecov
test:
name: ${{ matrix.os }} py${{ matrix.python }}
name: ${{ matrix.os }} python=${{ matrix.python }} dependencies=${{ matrix.dependencies }}
runs-on: ${{ matrix.os }}-latest
strategy:
# Otherwise, the workflow would stop if a single job fails. We want to
# run all of them to catch failures in different combinations.
fail-fast: false
matrix:
os: [ubuntu, macos, windows]
python: ["3.6", "3.9"]
# If "optional", will install non-required dependencies in the build
# environment. Otherwise, only required dependencies are installed.
dependencies: [""]
os:
- ubuntu
- macos
- windows
python:
- "3.7"
- "3.9"
dependencies:
- latest
env:
REQUIREMENTS: requirements.txt env/requirements-tests.txt
REQUIREMENTS: env/requirements-tests.txt
# Used to tag codecov submissions
OS: ${{ matrix.os }}
PYTHON: ${{ matrix.python }}
DEPENDENCIES: ${{ matrix.dependencies }}

steps:
# Cancel any previous run of the test job
@@ -79,6 +81,23 @@ jobs:
with:
python-version: ${{ matrix.python }}

- name: Collect requirements - run-time
run: python tools/export_requirements.py > requirements-full.txt

- name: Collect requirements - other
run: |
echo "Capturing dependencies from:"
for requirement in $REQUIREMENTS
do
echo " $requirement"
cat $requirement >> requirements-full.txt
done
- name: List requirements
run: |
echo "Collected dependencies:"
cat requirements-full.txt
- name: Get the pip cache folder
id: pip-cache
run: |
@@ -88,34 +107,14 @@ jobs:
uses: actions/cache@v2
with:
path: ${{ steps.pip-cache.outputs.dir }}
key: ${{ runner.os }}-pip-${{ hashFiles('**/requirements*.txt') }}
restore-keys: |
${{ runner.os }}-pip-${{ hashFiles('**/requirements*.txt') }}
key: ${{ runner.os }}-pip-${{ hashFiles('requirements-full.txt') }}

- name: Install requirements
run: |
requirements_file=requirements-full.txt
if [ ! -z "$REQUIREMENTS" ]; then
echo "Capturing dependencies from $REQUIREMENTS"
for requirement in $REQUIREMENTS
do
cat $requirement >> $requirements_file
done
fi
if [ -f $requirements_file ]; then
echo "Collected dependencies:"
cat $requirements_file
echo ""
# Install wheel before anything else so pip can use wheels for
# other packages.
python -m pip install -r env/requirements-build.txt
python -m pip install -r $requirements_file
else
echo "No requirements defined."
fi
- name: List installed packages
run: python -m pip freeze
# Install the build requirements before anything else so pip can use
# wheels for other packages.
python -m pip install --requirement env/requirements-build.txt
python -m pip install --requirement requirements-full.txt
- name: Build source and wheel distributions
run: |
@@ -125,7 +124,10 @@
ls -lh dist/
- name: Install the package
run: pip install --no-deps dist/*.whl
run: python -m pip install --no-deps dist/*.whl

- name: List installed packages
run: python -m pip freeze

- name: Copy test data to cache
run: |
@@ -159,7 +161,7 @@
with:
token: ${{ secrets.CODECOV_TOKEN }}
file: ./coverage.xml
env_vars: OS,PYTHON
env_vars: OS,PYTHON,DEPENDENCIES
# Don't mark the job as failed if the upload fails for some reason.
# It does sometimes but shouldn't be the reason for running
# everything again unless something else is broken.
3 changes: 1 addition & 2 deletions .gitignore
@@ -11,11 +11,10 @@ dist/
doc/_build
doc/api/generated
doc/gallery
doc/tutorials
doc/sample_data
.ipynb_checkpoints
*.egg-info
MANIFEST
dask-worker-space
.coverage.*
harmonica/_version.py
harmonica/_version_generated.py
3 changes: 0 additions & 3 deletions .isort.cfg

This file was deleted.
