
[Feature request] - Continuous integration testing with Pytest #46

Closed
9 of 11 tasks
shimwell opened this issue Apr 12, 2024 · 4 comments
Labels
bug Something isn't working

Comments

@shimwell
Collaborator

shimwell commented Apr 12, 2024

There is a nice opportunity to add some CI to this repo and test the code as it gets updated.

A non-exhaustive list of things that could be tested:

  • installation routes, to check the software can always be installed
  • conversion of CAD to MCNP, Serpent, PHITS and OpenMC
  • public CI-based simulation is not possible for MCNP and Serpent due to licensing, but we could still add the pytest tests and make local testing easier for devs
  • comparison of generated CSG text files against previously made CSG text files that we know work in simulation (regression test)
  • conversion of CAD to OpenMC
  • simulation of the OpenMC files, checking the result against a manually made geometry that matches the CAD or DAGMC geometry (perhaps similar to the model benchmark zoo)
  • testing different Python versions
  • testing on different operating systems
  • testing different methods of installing and connecting with FreeCAD
  • unit tests for the functionality of the code
  • regression tests to make sure we still get the same outputs
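The regression-test idea above could be sketched with pytest roughly as follows. Note that `convert_to_csg` and the file contents are hypothetical placeholders, not the repo's actual API; the real test would call the converter and compare against a stored known-good reference file:

```python
# Hedged sketch of a CSG text regression test. "convert_to_csg" and the
# reference content below are hypothetical stand-ins for the real API.
from pathlib import Path


def normalise(text: str) -> list[str]:
    """Drop blank lines and MCNP-style 'c' comment lines, so purely
    cosmetic changes to the writer do not fail the regression test."""
    lines = []
    for line in text.splitlines():
        stripped = line.strip()
        if stripped and not stripped.lower().startswith("c "):
            lines.append(stripped)
    return lines


def test_csg_matches_known_good_reference(tmp_path):
    # In the real test this would be something like:
    #   out_file = convert_to_csg("tests/cad/cuboid.step", tmp_path / "out.mcnp")
    out_file = tmp_path / "out.mcnp"
    out_file.write_text("c generated by converter\n1 0 -1 imp:n=1\n")

    # Reference output previously checked to work in simulation.
    reference = "1 0 -1 imp:n=1\n"

    assert normalise(out_file.read_text()) == normalise(reference)
```

Comparing normalised lines rather than raw bytes keeps the test robust to comment and whitespace changes while still catching real geometry regressions.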

I'm keen to work through the above todo list if allowed.

Just raising this issue to check whether others are keen to see this testing added.

I've made a start over here if anyone is interested

@shimwell shimwell added the bug Something isn't working label Apr 12, 2024
@dodu94
Member

dodu94 commented Apr 23, 2024

For this kind of software, performance is very important. Performance benchmark tests could be very valuable (especially if a big refactoring takes place) to make sure that translation time does not increase dramatically unless it is intended to do so.

@shimwell
Collaborator Author

Perhaps this package would be useful for the performance testing:
https://github.com/ionelmc/pytest-benchmark

@AlvaroCubi AlvaroCubi changed the title [Feature request] - Continious intergration testing with Pytest [Feature request] - Continuous integration testing with Pytest Apr 24, 2024
@alberto743
Member

In my humble opinion, software performance should not be treated as a priority.
As Donald Knuth said, "premature optimization is the root of all evil".

More specifically, what we instead need is object persistence (e.g. via pickle) to let the user interactively save the state of partially converted models.
At the same time, the correctness of the conversions should be verified against specific unit and integral test cases, making the reliability of the results the top priority.
Continuous integration pipelines should then trigger a battery of fast test cases covering the features of the software.

@shimwell shimwell mentioned this issue May 23, 2024
3 tasks
@alexvalentine94
Collaborator

Closing as mostly completed.
