
Create an integration testing suite #1324

Merged — 8 commits merged on Sep 18, 2023
Conversation

Saransh-cpp
Contributor

@Saransh-cpp Saransh-cpp commented May 22, 2023

Xref: #1244

This PR adds a minimal testing suite to DeepXDE. There are a few options available for testing libraries in Python and the most widely used is pytest. pytest will run the files in the tests/ directory (starting with test_) and validate that every assert statement passes.
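As a rough illustration of the convention described above, a file such as `tests/test_losses.py` might look like the following; the file name and the stand-in loss are hypothetical examples, not DeepXDE's actual API:

```python
# Hypothetical tests/test_losses.py. pytest collects files matching
# test_*.py and treats each passing `assert` inside a test_* function
# as a passing check.


def mean_squared_error(y_true, y_pred):
    # Stand-in loss used purely for illustration.
    return sum((a - b) ** 2 for a, b in zip(y_true, y_pred)) / len(y_true)


def test_mean_squared_error():
    # Only the last element differs, by 1, so the MSE is 1/3.
    assert abs(mean_squared_error([1.0, 2.0, 3.0], [1.0, 2.0, 4.0]) - 1 / 3) < 1e-12
```

Running `pytest` from the repository root then discovers and executes every such test automatically.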

The tests check that the code works as intended, including the returned values, datatypes, and whether warnings and errors are raised correctly. This way, if someone modifies one of the loss functions and inadvertently breaks it, the tests will fail, indicating that something has gone wrong with that particular loss function.

I have currently only implemented tests for the loss function for every backend. Some of the functions are only supported on particular backends, so they are skipped while other backends are tested. I've tried my best to include edge cases and make the tests reliable, but please let me know if I can add more tests for these functions.
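A minimal sketch of how backend-dependent tests can be skipped, assuming pytest is used as described; `backend_available` is a hypothetical helper, and DeepXDE's real backend detection may work differently:

```python
import importlib.util


def backend_available(module_name):
    """Return True if the backend's Python package can be imported."""
    return importlib.util.find_spec(module_name) is not None


# With pytest installed, a backend-specific test can then be skipped like:
#
#   @pytest.mark.skipif(not backend_available("torch"),
#                       reason="pytorch backend not installed")
#   def test_torch_loss():
#       ...
```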

These tests now also execute in the CI, which means they will run on every PR. DeepXDE is already imported within the test module; hence, there is no need to explicitly test the import in the CI.

I have also added some minimal configuration for pytest in pyproject.toml and added it as a dev dependency, so that developers can install pytest with deepxde using pip install "deepxde[dev]". More dev dependencies will be added as we move further (including pytest-cov and pre-commit).
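A hedged sketch of the kind of `pyproject.toml` configuration this describes; the exact contents of DeepXDE's file may differ:

```toml
# Illustrative fragment, not DeepXDE's actual pyproject.toml.
[project.optional-dependencies]
dev = ["pytest"]

[tool.pytest.ini_options]
testpaths = ["tests"]
```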

Finally, looking at #1313 (comment), all the tests for a particular backend run only with that backend's Tensor or array type.


Once this is merged, the tests can be populated for other DeepXDE modules. I will add support for coverage after this so that we can visualise which DeepXDE modules are not tested (all of them right now, but that will change, and coverage will catch it).

@Saransh-cpp Saransh-cpp marked this pull request as draft May 22, 2023 11:49
@Saransh-cpp Saransh-cpp marked this pull request as ready for review July 27, 2023 03:30
@Saransh-cpp
Contributor Author

Sorry for sitting on this for too long, this should be ready for a review!

@lululxvi
Owner

Thank you for this PR. It definitely makes important contributions to the testing. However, my concern is that DeepXDE now has so much code and so many functionalities that it is really difficult to cover everything with tests. It is just too much work.

My current thought is that we don't need to test detailed functions such as losses, and instead, we only test those example codes. If the examples work OK, then it should be fine. Any suggestions?

@Saransh-cpp
Contributor Author

Oh yes, that sounds much better. Thanks for the review and the suggestion! I'll add tests for the notebooks and revamp this PR.

@Saransh-cpp Saransh-cpp marked this pull request as draft August 15, 2023 22:21
@Saransh-cpp Saransh-cpp changed the title Create a basic testing suite and add tests for dde.losses Create an integration testing suite Aug 31, 2023
@Saransh-cpp Saransh-cpp marked this pull request as ready for review September 9, 2023 16:06
Contributor Author

@Saransh-cpp Saransh-cpp left a comment

This should be ready for review now.

This PR now updates and fixes the already existing Makefile in examples/ to run all the example files using make run_all_examples. I've added the same command in the CI and have parallelized the builds on different backends to save a ton of CI time. I also had to make some minor fixes in the examples to get everything working, but I have ensured that major example changes are not present in this PR.
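As a rough sketch of such a Makefile target — the variable names and directory layout below are assumptions, not the actual `examples/Makefile`:

```make
# Illustrative sketch of a run-all-examples target.
EXAMPLES := $(wildcard pinn_forward/*.py)

run_all_examples:
	@for f in $(EXAMPLES); do \
		echo "Running $$f"; \
		python "$$f" || exit 1; \
	done
```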

examples/sample_to_test.py (resolved)
Contributor Author

I am not sure whether I should add these datasets or simply exclude these examples from testing.

@lululxvi
Owner

Yes, this PR is huge. Could you first make a PR that changes epochs to iterations in model.train()? So I can merge it immediately.

@Saransh-cpp
Contributor Author

Oh yes, I can split this PR.

@Saransh-cpp Saransh-cpp marked this pull request as ready for review September 15, 2023 21:22
@lululxvi
Owner

  • The current build tests many examples, which takes a long time. Also, more examples will be added in the future.
  • On the other hand, when we add more examples or revise existing examples, we may have to change sample_to_test.py, which is OK but requires more work.

So, I would suggest still only testing import. If we need to test the examples, we can run the Makefile manually. What do you think?

@Saransh-cpp
Contributor Author

Yes, the extra time required to run all the examples is definitely not good, and I have been worried about that since the start of my work on this (which is why I decided to run all jobs in parallel, but there are only a limited number of machines available from GitHub, making this not "completely" parallel).

Testing locally sounds great. I can periodically check if the Makefile is working locally and fix it if it goes stale.

I will revert the changes done to the workflow file and test the import on all backends sequentially (in parallel across OSes and Python versions).
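The import-only fallback can be sketched with a small stdlib helper; `can_import` is an illustrative name, not DeepXDE code:

```python
import importlib


def can_import(package):
    """Try to import `package`; return True on success, False otherwise."""
    try:
        importlib.import_module(package)
        return True
    except ImportError:
        return False


# In CI this check might be invoked once per backend, for example:
#   DDE_BACKEND=pytorch python -c "import deepxde"
```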

@lululxvi lululxvi merged commit 0b518c6 into lululxvi:master Sep 18, 2023
14 checks passed
@Saransh-cpp Saransh-cpp deleted the test-suite branch September 19, 2023 02:57