Merge branch 'master' into dependabot/pip/requirements/pyvista-0.41.1
PProfizi authored Aug 16, 2023
2 parents 2c5818d + 19ae597 commit 0f80c62
Showing 208 changed files with 3,618 additions and 1,235 deletions.
1 change: 1 addition & 0 deletions .github/workflows/ci.yml
@@ -120,6 +120,7 @@ jobs:
cname: ${{ env.DOCUMENTATION_CNAME }}
token: ${{ secrets.GITHUB_TOKEN }}
doc-artifact-name: HTML-doc-ansys-dpf-core.zip
decompress-artifact: true

examples:
if: startsWith(github.head_ref, 'master') || github.event.action == 'ready_for_review' || !github.event.pull_request.draft
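The new `decompress-artifact: true` input tells the deploy action to unzip the downloaded documentation artifact before publishing it. The same operation can be sketched in plain Python (file and directory names here are illustrative, not taken from the workflow):

```python
import zipfile
from pathlib import Path

def decompress_artifact(zip_path: str, dest: str) -> list[str]:
    """Extract a downloaded artifact zip into dest; return the member names."""
    dest_dir = Path(dest)
    dest_dir.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(dest_dir)
        return zf.namelist()
```

Without this step, the action would deploy the zip file itself rather than the HTML tree inside it.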
14 changes: 0 additions & 14 deletions .github/workflows/ci_release.yml
@@ -184,11 +184,6 @@ jobs:
needs: [style, tests, docs, examples, retro_232, retro_231, retro_222, retro_221, gate, docker_tests]
runs-on: ubuntu-latest
steps:
- name: "Set up Python"
uses: actions/setup-python@v4.6.0
with:
python-version: 3.9

- name: "Download artifacts"
uses: actions/download-artifact@v3

@@ -204,12 +199,3 @@ jobs:
./**/*.zip
draft: true
generate_release_notes: true

# - name: "Upload to Test PyPi" # Change TOKEN
# run: |
# pip install twine
# twine upload --repository testpypi --skip-existing ./**/*.whl
# twine upload --repository testpypi --skip-existing ./**/*.tar.gz
# env:
# TWINE_USERNAME: __token__
# TWINE_PASSWORD: ${{ secrets.TEST_PYPI_API_TOKEN }}
2 changes: 1 addition & 1 deletion .github/workflows/docs.yml
@@ -180,5 +180,5 @@ jobs:
uses: actions/upload-artifact@v3
with:
name: HTML-doc-${{env.PACKAGE_NAME}}.zip
path: docs/build/html
path: HTML-doc-${{env.PACKAGE_NAME}}.zip
if: always()
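With this path change, the workflow uploads an already-zipped file (`HTML-doc-${{env.PACKAGE_NAME}}.zip`) rather than the raw `docs/build/html` directory, so the artifact name and its content match. Building such an archive can be sketched as follows (paths are illustrative):

```python
import zipfile
from pathlib import Path

def zip_directory(src: str, zip_path: str) -> int:
    """Archive every file under src into zip_path; return the file count."""
    count = 0
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in sorted(Path(src).rglob("*")):
            if path.is_file():
                # store paths relative to src so the zip root stays clean
                zf.write(path, path.relative_to(src))
                count += 1
    return count
```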
17 changes: 11 additions & 6 deletions .github/workflows/releaser.yml
@@ -39,16 +39,21 @@ jobs:
file: HTML-doc-ansys-dpf-core.zip
token: ${{ secrets.GITHUB_TOKEN }}

# - name: "Unzip HTML Documentation"
# shell: bash
# run: |
# unzip HTML-doc-ansys-dpf-core.zip -d documentation-html
# chmod -R 777 documentation-html
# if: always()
- name: "List downloaded assets"
shell: bash
run: |
ls
- name: "Upload artifact"
uses: actions/upload-artifact@v3
with:
name: HTML-doc-ansys-dpf-core.zip
path: HTML-doc-ansys-dpf-core.zip

- name: "Deploy the stable documentation"
uses: ansys/actions/doc-deploy-stable@v4
with:
cname: ${{ env.DOCUMENTATION_CNAME }}
token: ${{ secrets.GITHUB_TOKEN }}
doc-artifact-name: HTML-doc-ansys-dpf-core.zip
decompress-artifact: true
34 changes: 26 additions & 8 deletions README.md
@@ -10,7 +10,7 @@
[![cov](https://codecov.io/gh/ansys/pydpf-core/branch/master/graph/badge.svg)](https://codecov.io/gh/ansys/pydpf-core)
[![codacy](https://app.codacy.com/project/badge/Grade/61b6a519aea64715ad1726b3955fcf98)](https://www.codacy.com/gh/ansys/pydpf-core/dashboard?utm_source=github.com&utm_medium=referral&utm_content=ansys/pydpf-core&utm_campaign=Badge_Grade)

The Data Processing Framework (DPF) provides numerical simulation
Ansys Data Processing Framework (DPF) provides numerical simulation
users and engineers with a toolbox for accessing and transforming simulation
data. With DPF, you can perform complex preprocessing or postprocessing of
large amounts of simulation data within a simulation workflow.
@@ -27,8 +27,8 @@ The latest version of DPF supports Ansys solver result files for:
- Fluent (`.cas/dat.h5`, `.flprj`)
- CFX (`.cad/dat.cff`, `.flprj`)

See the `PyDPF-Core main page <https://dpf.docs.pyansys.com/version/stable/index.html>`_
for more information on compatibility.
For more information on compatibility, see the [main page](https://dpf.docs.pyansys.com/version/stable/index.html)
of the PyDPF-Core documentation.

Using the many DPF operators that are available, you can manipulate and
transform this data. You can also chain operators together to create simple
@@ -47,12 +47,30 @@ The ``ansys.dpf.core`` package provides a Python interface to DPF, enabling
rapid postprocessing of a variety of Ansys file formats and physics solutions
without ever leaving the Python environment.

## Documentation
## Documentation and issues

Visit the [DPF-Core Documentation](https://dpfdocs.pyansys.com) for
comprehensive information on this library. See the
[Examples](https://dpfdocs.pyansys.com/version/stable/examples/index.html)
for how-to information.
Documentation for the latest stable release of PyDPF-Core is hosted at
[DPF-Core documentation](https://dpf.docs.pyansys.com/version/stable/).

In the upper right corner of the documentation's title bar, there is an option for switching from
viewing the documentation for the latest stable release to viewing the documentation for the
development version or previously released versions.

You can also [view](https://cheatsheets.docs.pyansys.com/pydpf-core_cheat_sheet.png) or
[download](https://cheatsheets.docs.pyansys.com/pydpf-core_cheat_sheet.pdf) the
PyDPF-Core cheat sheet. This one-page reference provides syntax rules and commands
for using PyDPF-Core.

On the [PyDPF-Core Issues](https://github.com/ansys/pydpf-core/issues) page,
you can create issues to report bugs and request new features. On the
[PyDPF-Core Discussions](https://github.com/ansys/pydpf-core/discussions) page or the [Discussions](https://discuss.ansys.com/)
page on the Ansys Developer portal, you can post questions, share ideas, and get community feedback.

To reach the project support team, email [pyansys.core@ansys.com](mailto:pyansys.core@ansys.com).

## Installation

28 changes: 14 additions & 14 deletions docs/source/_static/dpf_operators.html

Large diffs are not rendered by default.

4 changes: 4 additions & 0 deletions docs/source/conf.py
@@ -86,6 +86,10 @@
"sphinx_gallery.gen_gallery",
]

typehints_defaults = "comma"
typehints_use_signature = True
simplify_optional_unions = False

# Intersphinx mapping
intersphinx_mapping = {
"pyvista": ("https://docs.pyvista.org/", None),
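The three settings added to `conf.py` belong to the `sphinx_autodoc_typehints` extension. Assuming that extension is enabled in this build, they control how type hints render in the generated API docs; a commented sketch of the same fragment:

```python
# conf.py fragment (options read by sphinx-autodoc-typehints)
typehints_defaults = "comma"      # append defaults after the type, e.g. "int, default: 0"
typehints_use_signature = True    # keep type hints visible in the signature line too
simplify_optional_unions = False  # do not collapse multi-type Unions into Optional[...]
```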
22 changes: 22 additions & 0 deletions docs/source/index.rst
@@ -153,6 +153,28 @@ It is independent of the Ansys installer.

- `C++ solver reader plugin <https://astonishing-hyacinth-e64.notion.site/How-to-write-a-new-solver-reader-as-a-DPF-s-plugin-bd2d2a3cf51f47ef9e70df45d64f89cb>`_

Documentation and issues
------------------------
Documentation for the latest stable release of PyDPF-Core is hosted at `PyDPF-Core documentation
<https://dpf.docs.pyansys.com/version/stable/>`_.

In the upper right corner of the documentation's title bar, there is an option for switching from
viewing the documentation for the latest stable release to viewing the documentation for the
development version or previously released versions.

You can also `view <https://cheatsheets.docs.pyansys.com/pydpf-core_cheat_sheet.png>`_ or
`download <https://cheatsheets.docs.pyansys.com/pydpf-core_cheat_sheet.pdf>`_ the
PyDPF-Core cheat sheet. This one-page reference provides syntax rules and commands
for using PyDPF-Core.

On the `PyDPF-Core Issues <https://github.com/ansys/pydpf-core/issues>`_ page,
you can create issues to report bugs and request new features. On the `PyDPF-Core Discussions
<https://github.com/ansys/pydpf-core/discussions>`_ page or the `Discussions <https://discuss.ansys.com/>`_
page on the Ansys Developer portal, you can post questions, share ideas, and get community feedback.

To reach the project support team, email `pyansys.core@ansys.com <mailto:pyansys.core@ansys.com>`_.


- :ref:`user_guide_custom_operators`


2 changes: 1 addition & 1 deletion src/ansys/dpf/core/any.py
@@ -19,7 +19,7 @@ class Any:
Parameters
----------
any : ctypes.c_void_p, ansys.grpc.dpf.any_pb2.Any message, optional # noqa: E501
any_dpf : ctypes.c_void_p, ansys.grpc.dpf.any_pb2.Any message, optional
server : DPFServer, optional
Server with channel connected to the remote or local instance.
The default is ``None``, in which case an attempt is made to use the
1 change: 1 addition & 0 deletions src/ansys/dpf/core/operators/filter/__init__.py
@@ -7,6 +7,7 @@
from .field_low_pass_fc import field_low_pass_fc
from .field_signed_high_pass import field_signed_high_pass
from .field_signed_high_pass_fc import field_signed_high_pass_fc
from .filtering_max_over_time import filtering_max_over_time
from .scoping_band_pass import scoping_band_pass
from .scoping_high_pass import scoping_high_pass
from .scoping_low_pass import scoping_low_pass
55 changes: 54 additions & 1 deletion src/ansys/dpf/core/operators/filter/abc_weightings.py
@@ -22,6 +22,11 @@ class abc_weightings(Operator):
computed, 1 the b-weighting is
computed, and 2 the c-weighting is
computed.
shape_by_tf_scoping : bool
If this pin is set to true, each field of the
input fields container is defined by
time freq scoping and not by ids.
Default is false.
Examples
@@ -36,19 +41,27 @@
>>> op.inputs.fields_container.connect(my_fields_container)
>>> my_weighting_type = int()
>>> op.inputs.weighting_type.connect(my_weighting_type)
>>> my_shape_by_tf_scoping = bool()
>>> op.inputs.shape_by_tf_scoping.connect(my_shape_by_tf_scoping)
>>> # Instantiate operator and connect inputs in one line
>>> op = dpf.operators.filter.abc_weightings(
... fields_container=my_fields_container,
... weighting_type=my_weighting_type,
... shape_by_tf_scoping=my_shape_by_tf_scoping,
... )
>>> # Get output data
>>> result_weightings = op.outputs.weightings()
"""

def __init__(
self, fields_container=None, weighting_type=None, config=None, server=None
self,
fields_container=None,
weighting_type=None,
shape_by_tf_scoping=None,
config=None,
server=None,
):
super().__init__(name="abc_weightings", config=config, server=server)
self._inputs = InputsAbcWeightings(self)
@@ -57,6 +70,8 @@ def __init__(
self.inputs.fields_container.connect(fields_container)
if weighting_type is not None:
self.inputs.weighting_type.connect(weighting_type)
if shape_by_tf_scoping is not None:
self.inputs.shape_by_tf_scoping.connect(shape_by_tf_scoping)

@staticmethod
def _spec():
@@ -81,6 +96,15 @@ def _spec():
computed and 2 the c-weightings is
computed.""",
),
2: PinSpecification(
name="shape_by_tf_scoping",
type_names=["bool"],
optional=False,
document="""If this pin is set to true, each field of the
input fields container is defined by
time freq scoping and not by ids.
Default is false.""",
),
},
map_output_pin_spec={
0: PinSpecification(
@@ -142,6 +166,8 @@ class InputsAbcWeightings(_Inputs):
>>> op.inputs.fields_container.connect(my_fields_container)
>>> my_weighting_type = int()
>>> op.inputs.weighting_type.connect(my_weighting_type)
>>> my_shape_by_tf_scoping = bool()
>>> op.inputs.shape_by_tf_scoping.connect(my_shape_by_tf_scoping)
"""

def __init__(self, op: Operator):
@@ -150,6 +176,10 @@ def __init__(self, op: Operator):
self._inputs.append(self._fields_container)
self._weighting_type = Input(abc_weightings._spec().input_pin(1), 1, op, -1)
self._inputs.append(self._weighting_type)
self._shape_by_tf_scoping = Input(
abc_weightings._spec().input_pin(2), 2, op, -1
)
self._inputs.append(self._shape_by_tf_scoping)

@property
def fields_container(self):
@@ -194,6 +224,29 @@ def weighting_type(self):
"""
return self._weighting_type

@property
def shape_by_tf_scoping(self):
"""Allows to connect shape_by_tf_scoping input to the operator.
If this pin is set to true, each field of the
input fields container is defined by
time freq scoping and not by ids.
Default is false.
Parameters
----------
my_shape_by_tf_scoping : bool
Examples
--------
>>> from ansys.dpf import core as dpf
>>> op = dpf.operators.filter.abc_weightings()
>>> op.inputs.shape_by_tf_scoping.connect(my_shape_by_tf_scoping)
>>> # or
>>> op.inputs.shape_by_tf_scoping(my_shape_by_tf_scoping)
"""
return self._shape_by_tf_scoping


class OutputsAbcWeightings(_Outputs):
"""Intermediate class used to get outputs from
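The generated operator code above follows one consistent pattern: each optional constructor argument maps to a numbered input pin, and `connect` is called only for arguments that were actually supplied. A stripped-down, hypothetical sketch of that pattern (a toy model, not the real `ansys.dpf.core` API):

```python
class MiniOperator:
    """Toy model of the pin-connection pattern used by generated operators."""

    # pin number -> input name, mirroring what _spec() declares
    PIN_NAMES = {0: "fields_container", 1: "weighting_type", 2: "shape_by_tf_scoping"}

    def __init__(self, **inputs):
        self.pins = {}
        for pin, name in self.PIN_NAMES.items():
            value = inputs.get(name)
            if value is not None:  # connect only the pins that were supplied
                self.connect(pin, value)

    def connect(self, pin: int, value):
        self.pins[pin] = value

op = MiniOperator(fields_container=["field_1"], shape_by_tf_scoping=True)
# weighting_type (pin 1) was not passed, so it stays unconnected
```

This is why adding the new `shape_by_tf_scoping` input only requires a new pin specification, a constructor argument, and an `Input` property: the connection logic itself is uniform.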
