Overview
This PR addresses a couple of small testing-related issues.
Firstly, the tests recently introduced in #761 compare the results of the LAUE functions as ordered lists when, in fact, the order is neither guaranteed nor important. Here we change the comparison to first sort both arrays and then compare them, so the order is ignored.
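An order-insensitive comparison of this kind can be sketched as follows (illustrative only; the helper name and test values are placeholders, not the library's actual API):

```python
import numpy as np

def assert_equal_ignoring_order(actual, expected):
    """Compare two result arrays while ignoring element order."""
    # Sorting both arrays first makes the comparison order-insensitive.
    np.testing.assert_allclose(np.sort(actual), np.sort(expected))

# Hypothetical usage: the results may arrive in any order.
assert_equal_ignoring_order([3.0, 1.0, 2.0], [1.0, 2.0, 3.0])
```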
Secondly, the pytest-cov plugin doesn't count code that runs in a separate process, such as when using a ProcessPoolExecutor.
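As an illustrative sketch (not code from this library), a function submitted to a ProcessPoolExecutor executes in a worker process, so the default coverage tracer in the main process never sees its lines run:

```python
from concurrent.futures import ProcessPoolExecutor

def square(x):
    # Runs in a worker process; pytest-cov's default tracer
    # does not follow execution into that process.
    return x * x

if __name__ == "__main__":
    with ProcessPoolExecutor(max_workers=2) as pool:
        results = list(pool.map(square, [1, 2, 3]))
    print(results)  # [1, 4, 9]
```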
To address the multiprocessing coverage, we can configure a `.coveragerc` file like the following; `coverage` will then automatically start a new trace for each process created. Then we need to change the coverage report generation script like the following:
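The exact files from this PR are not reproduced here, but a typical coverage.py multiprocessing setup looks roughly like this (a sketch based on coverage.py's documented `concurrency` and `parallel` options):

```ini
# .coveragerc (sketch)
[run]
# Start a tracer in each subprocess spawned via multiprocessing.
concurrency = multiprocessing
# Write one data file per process so results can be merged later.
parallel = true
```

With `parallel = true`, each process writes its own `.coverage.*` data file, so the report script must merge them before reporting:

```shell
coverage combine   # merge the per-process data files into one
coverage report    # then generate the report as before
```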
After including these changes, here is a function that is exclusively called in a separate process:

All told, this brings the codecov number up from the undercounted ~47% to the more correctly counted ~48%.
Note: pytest-cov also doesn't count `@numba.jit` functions. However, they represent <1% of the code in the library, and disabling jit for coverage tests (the commonly accepted answer) results in 5x runtime for the test cases. So we will continue to jit the functions and not worry about that.
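For reference, the commonly accepted approach mentioned above uses Numba's documented environment variable to disable JIT compilation during coverage runs (rejected here because of the ~5x runtime cost):

```shell
# Run the test suite with Numba JIT disabled so coverage can trace
# the Python bodies of @numba.jit functions.
NUMBA_DISABLE_JIT=1 pytest --cov
```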