
Question: testing workflow for pint #990

Closed
5igno opened this issue Jan 15, 2020 · 12 comments

Contributor

5igno commented Jan 15, 2020

Hi,

I am a happy user of pint and cannot thank you enough for making it available as OSS.
I like the way it makes unit conversion and dimensional analysis effortless, all in a small package.
Brilliant :)
To be more specific, I am a physicist working in optics and electrical engineering. Like many users in this field, I have a growing need for logarithmic units such as dB, decade, octave, etc.

Since support for these units is missing, I am trying to implement logarithmic unit conversion myself (see the commits on my fork of pint), but I don't understand well enough how to test pint as required by your guidelines.
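For context, the conversions I have in mind follow the standard decibel definitions. A plain-Python sketch (the helper names are hypothetical illustrations, not pint API):

```python
import math

# Standard decibel definitions for power ratios: dB = 10 * log10(P / P0).
# These helpers are illustrative only; they are not part of pint.

def power_ratio_to_db(ratio):
    """Convert a dimensionless power ratio to decibels."""
    return 10.0 * math.log10(ratio)

def db_to_power_ratio(db):
    """Convert decibels back to a dimensionless power ratio."""
    return 10.0 ** (db / 10.0)
```

For example, a power ratio of 100 corresponds to 20 dB, and the two helpers are inverses of each other.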

Could you comment on the process you use to test pint?

@5igno 5igno changed the title Setting up a forum / chat to discuss issues Testing Environment / Setting up a forum to discuss issues Jan 15, 2020
Contributor Author

5igno commented Jan 17, 2020

For testing the modified library, should I follow the methods in this guide:
pytest/Good Integration Practices?

Contributor Author

5igno commented Jan 17, 2020

I am finding this answer on pip install -e useful ... but is this the workflow you use to develop pint?

Contributor Author

5igno commented Jan 20, 2020

@hgrecco, would you agree that the testing guidelines from numpy also apply to pint? Sorry for the beginner's question, but I have no idea what I am doing ... some reference would be gold.

Contributor

dalito commented Jan 20, 2020

Hi, I am not sure where your troubles come from. The steps are:

  • Create a virtual environment (VE)
  • Activate this VE
  • Install your pint fork (run pip install -e . in the root of your local pint fork)
  • Install optional dependencies such as numpy into your VE (pip install numpy)
  • Install pytest into your VE (pip install pytest)
  • Execute pytest at the root of your source folder. It will discover and run all tests.

If you want to test different Python or numpy versions or combinations locally, I suggest using tox (or nox). pint has no config file for tox or nox at the moment. To see what is tested routinely, look at .travis.yml.
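Since pint ships no tox configuration, a minimal hypothetical tox.ini sketch might look like this (the environment list and dependencies are illustrative, not pint's actual CI matrix):

```ini
; hypothetical tox.ini -- pint does not ship one
[tox]
envlist = py36, py37, py38

[testenv]
deps =
    pytest
    numpy
commands = pytest
```

Running `tox` would then build each listed environment, install the package plus the listed deps, and run pytest in every one.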

Owner

hgrecco commented Jan 22, 2020

I think @dalito provides a great summary. Just one more thing: the numpy guide is great, but it is intended for a much larger project. Therefore, some things, like testing only parts of the suite, might be overkill.

Contributor Author

5igno commented Jan 22, 2020

Thank you for your input and your patience: I am new to contributing to open-source projects and wanted to understand the proper way to proceed, rather than submit half-finished work.

As for the point I raised about discourse, it seems you are fine with using GitHub issues for Q&A, labelled with question. Is that a fair assessment?

Contributor

dalito commented Jan 22, 2020

@5igno, asking such questions here is fine from my point of view. Given the low volume, I think using GitHub issues is sufficient for pint at the moment.

Owner

hgrecco commented Jan 23, 2020

I agree with @dalito on this.

Contributor Author

5igno commented Jan 23, 2020

OK, I will modify the question accordingly.

@5igno 5igno changed the title Testing Environment / Setting up a forum to discuss issues Question: testing workflow for pint Jan 23, 2020
Owner

hgrecco commented Jan 23, 2020

And as I said, feel free to ask any question on the tracker. I am proud and happy to say that the community is very welcoming and helpful.

Contributor Author

5igno commented Feb 4, 2020

Alright... so I am back to trying to submit a simple PR. I followed @dalito's workflow but, when running pytest, I am told there is one failure and many warnings, even without having modified the code in the development branch.

=== 1 failed, 1036 passed, 27 skipped, 5 xfailed, 26 warnings in 23.94s ===

Should I ignore the pytest warnings?

Collaborator

jules-ch commented Feb 4, 2020

About the warnings: you can ignore them in your case; those are deprecation warnings that we need to address.
Pint runs a CI (Continuous Integration) pipeline on Travis which runs all those tests remotely. So don't be afraid to submit a PR, and we can tell you what's wrong with it if some tests are failing.
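If the warning noise bothers you locally, pytest's built-in warning filters can silence specific categories. For example, a pytest.ini fragment (illustrative; not part of pint's repository):

```ini
[pytest]
filterwarnings =
    ignore::DeprecationWarning
```

The same effect can be had per-run with `pytest -W ignore::DeprecationWarning`, without touching any config file.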

bors bot added a commit that referenced this issue Jun 17, 2020
1116: Harmonize most doctests with Pint's current behavior r=hgrecco a=clarkgwillison

- [ ] Closes # (no single issue)
- [x] Executed ``black -t py36 . && isort -rc . && flake8`` with no errors
- [x] The change is fully covered by automated unit tests
- [x] Documented in docs/ as appropriate
- [x] Added an entry to the CHANGES file

This PR partially addresses #947, #972, and #990 

After merging, the number of failing doctests in the Sphinx documentation should go from 92 (as mentioned in #947) down to 3:
```
Doctest summary
===============
  335 tests
    3 failures in tests
    0 failures in setup code
    0 failures in cleanup code
build finished with problems.
make: *** [doctest] Error 1
```
This will put us well within reach of enabling doctests in Travis to prevent documentation regressions in the future.

Most tests were fixed in this PR by deferring to the current behavior of Pint; however, `Quantity.__repr__()` was modified to round floating-point magnitudes to 9 digits to avoid several test failures that were being caused by floating-point ambiguities.

Issue #1115 was opened to track the 3 tests that I could not easily resolve. Once that issue is resolved, we can enable doctests in Travis without breaking CI.

Co-authored-by: Clark Willison <clarkgwillison@gmail.com>
jules-ch closed this as completed Mar 1, 2022