Reimplement and enhance the Perl-based (unit.pl) unit test control script in Python #2717
Comments
@natalieb-noaa during our weekly MET development meeting today, we reviewed the in-progress issues. I recall seeing an email from you about this one saying that you're making good progress. I'm very glad to hear it! Perhaps we should meet to review in the near future? But please consider adding a comment to this issue to document your progress. Thanks!
This is ~80% done: the build_test function is done, the repl_env function is done, and the main (unit) function is mostly done. I've done preliminary testing on what's done so far and I'm getting the expected results. I'll still need to add some error checks and logging.
The script can now be run from the command line.
Added logging.
@natalieb-noaa thanks for noting your progress with comments on this issue. I'm wondering if you have an ETA for a Pull Request to the develop branch with these changes? The beta4 development release is currently scheduled for mid-April, and it'd be nice to check this initial development effort off the list.
@JohnHalleyGotway I should be able to submit a PR for this mid-next week, assuming I don't uncover any major issues when I do some more testing compared to the perl script. |
@natalieb-noaa how close are you to a PR for this work? Do you expect it by the beta4 release on April 10th? |
All the pieces of the perl script have been ported over to the python module now, including all the checks on the output files, but I still need to debug some issues when running those checks. @JohnHalleyGotway I'm close, but I'm getting some errors when checking the outputs and still need to debug. Unfortunately, I don't expect I'll be able to address that in time for the 4/10 release.
@natalieb-noaa thanks for the update. I just reassigned this issue from the beta4 development cycle to the beta5 one. Please feel free to join one of the upcoming MET development meetings if you'd like to talk through the debugging... and definitely once the PR is submitted so that all the developers can hear about the changes. |
Fixed the issue described above and made some related improvements, but I'm encountering some issues running commands for other tests that I still need to work through (related to using subprocess.run() to run the commands as they are formatted in the XML files).
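For context, command strings like those in the test XML files often contain shell constructs, so one common approach is to hand them to the shell rather than splitting them into argument lists. A minimal sketch of that approach, not necessarily how the final Python module handles it:

```python
import subprocess

def run_test_command(cmd, env=None):
    """Run one unit test command string and return the completed process."""
    # Commands taken from the XML files may contain shell constructs such as
    # pipes, redirection, and environment variable references, so pass them
    # to the shell as a single string instead of an argument list.
    return subprocess.run(cmd, shell=True, env=env,
                          capture_output=True, text=True)
```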
Fixed the issue with running commands. I ran several tests and got the same results as the perl script. I just need to clean up the code a bit.
The python module is complete and able to replicate all functionality of the perl script, as outlined in (1) in the issue description. I tested by running ~15 different tests using both the perl and python scripts and got the same results/output (though with slight differences in formatting). I also tested each of the run options on 1 test xml and got the expected results. Ready to submit a PR. @JohnHalleyGotway, who should review the PR? (And is there anything else I should know regarding the PR/merge process? I've taken a look at the GitHub workflow instructions in the METplus contributor's guide.)
Great! Glad to hear about your progress. I'd say asking @georgemccabe and myself (@JohnHalleyGotway) to review the PR would be good. You'll submit the PR to merge into the develop branch. |
Thanks, @JohnHalleyGotway. In regard to this having the potential to change things upon merge: this PR just adds the python version of the unit testing script, but the testing process has not yet been updated to run it in place of the perl script, so the merge shouldn't change anything.
Describe the Enhancement
The existing testing infrastructure for MET uses Perl to run the unit tests defined in each XML file. This task is to reimplement the unit.pl script in Python and also enhance it, as described below.
1. Reimplement the unit.pl script in Python. Recommend using the lxml Python package for parsing tests from the existing XML files since that's being used elsewhere within METplus.
2. Enhance this functionality to track and store the runtime for each test executed. Write it somewhere so that it can be easily compared from one run to the next (a rough sketch of items 1 and 2 follows this list).
3. Investigate whether memory usage can reasonably be tracked and compared in a similar way. This is desired, not required.
4. Update associated scripts in the MET repository to fully replace calls to unit.pl with the Python implementation.
5. Remove the existing unit.pl from the MET repo. Also inspect the perl directory and remove other scripts that are not actually being used. Ideally, the entire perl directory would be removed for the MET-12.0.0 release.
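A rough sketch of how items 1 and 2 might look, assuming each test appears as a <test> element with a name attribute and an <exec> child; these element and attribute names are illustrative only, not the actual unit test XML schema:

```python
import json
import subprocess
import time

from lxml import etree  # recommended package for parsing the unit test XML files

def run_tests(xml_file, timing_file="test_runtimes.json"):
    """Parse tests from a unit test XML file, run each one, and record runtimes."""
    tree = etree.parse(xml_file)
    runtimes = {}
    # The element and attribute names below are placeholders; the real unit
    # test XML schema in the MET repository may differ.
    for test in tree.getroot().iter("test"):
        name = test.get("name")
        cmd = test.findtext("exec")
        start = time.perf_counter()
        result = subprocess.run(cmd, shell=True)
        runtimes[name] = {
            "seconds": round(time.perf_counter() - start, 3),
            "returncode": result.returncode,
        }
    # Write the runtimes to a JSON file so they can easily be compared
    # from one run to the next (item 2).
    with open(timing_file, "w") as f:
        json.dump(runtimes, f, indent=2)
    return runtimes
```

For item 3, memory usage of child processes could be sampled on Unix-like systems via resource.getrusage(resource.RUSAGE_CHILDREN).ru_maxrss, though whether that is reliable enough to compare across runs is part of the investigation.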
Time Estimate
2 weeks? But please update as needed.
Sub-Issues
Consider breaking the enhancement down into sub-issues.
None needed at this time.
Relevant Deadlines
List relevant project deadlines here or state NONE.
Funding Source
Define the source of funding and account keys here or state NONE.
Define the Metadata
Assignee
Labels
Milestone and Projects
Define Related Issue(s)
Consider the impact to the other METplus components.
No direct impacts to other repositories at this time.
Enhancement Checklist
See the METplus Workflow for details.
Branch name: feature_2717_convert_unit.pl_to_unit.py
Pull request: feature <Issue Number> <Description>
Select: Reviewer(s) and Development issue
Select: Milestone as the next official version
Select: MET-X.Y.Z Development project for development toward the next official release