Skip test scenarios at runtime #397
Conversation
Signed-off-by: sschulz92 <bastie92_spam@gmx.de>
I'm not deeply familiar with the implementation, but I think what you've done should work. Are you saying that it functionally skips the scenario as you'd expect, but it doesn't show up as skipped in the overall report statistics? Or that it shows up as skipped in the scenario counts, but the individual step which issued the programmatic skip isn't marked as such?
Other than unit testing the expected results when the exception is raised, the "proper" way would have been for us to add such a scenario to one of the Gauge functional tests (a Java Gauge project used to test Gauge and various plugins together) and enable it for certain languages using tags. I'm also not deeply familiar with these, but I believe the Java code at https://github.com/getgauge/gauge-tests/blob/master/src/test/java/com/thoughtworks/gauge/test/common/PythonProject.java would need to be enhanced to tell the tests how to generate a step that throws your exception. Sadly this wasn't done for dotnet/csharp, so there is no pattern to follow :-/ For the basic unit tests, we should be able to add a test step impl that throws your exception, after the existing tests in gauge-python/tests/test_processor.py (lines 223 to 238 in 1f70365).
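A minimal, self-contained sketch of what such a unit test could assert. The `SkipScenarioException` name comes from this PR, but the `StepResult` class and `execute_step` handler below are hypothetical stand-ins for gauge-python's actual processor code, shown only to illustrate the control flow being tested:

```python
# Hypothetical sketch: SkipScenarioException is named in this PR; the
# StepResult/execute_step stand-ins below are NOT gauge-python's real code.

class SkipScenarioException(Exception):
    """Raised by a step implementation to skip the rest of the scenario."""


class StepResult:
    def __init__(self):
        self.passed = False
        self.skipped = False
        self.error = None


def execute_step(step_impl):
    """Run a step and translate a skip exception into a skipped result."""
    result = StepResult()
    try:
        step_impl()
        result.passed = True
    except SkipScenarioException as e:
        # Mark the step itself as skipped, so reports can distinguish it
        # from a step that genuinely passed.
        result.skipped = True
        result.error = str(e)
    except Exception as e:  # a real failure, not a skip
        result.error = str(e)
    return result


def skipping_step():
    raise SkipScenarioException("precondition not met")


result = execute_step(skipping_step)
print(result.skipped, result.passed)  # → True False
```

A unit test along these lines would register a step implementation that raises the exception and then assert on the resulting status, which is roughly what the existing tests in test_processor.py do for passing and failing steps.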
@sschulz92 Thank you for contributing to gauge-python. Your pull request has been labeled as a release candidate 🎉🎉. Merging this PR will trigger a release. Please bump up the version as part of this PR. Instructions for bumping the version can be found in CONTRIBUTING.md. If the CONTRIBUTING.md file does not exist or does not include instructions about bumping the version, please look at previous commits in the git history to see what changes need to be made.
Signed-off-by: Chad Wilson <chadw@thoughtworks.com>
(force-pushed from ba5f983 to a3919f0)
Hello @PiotrNestor, would you care to weigh in on whether you feel this is the right way to approach extending your feature to Python? If you can comment on whether the step that threw the exception is expected to be displayed/highlighted in the report (possibly not in the current implementation, which operates at scenario level?), that'd also be useful.
I can see that the statistics are absolutely correct, so the scenarios are counted as "skipped", though the step with the SkipScenarioException is shown as passed. The following steps are, correctly, shown as not executed.
Ahh right. I think that might be expected, as this feature rode on top of an earlier Gauge capability to skip a scenario declaratively, and I don't see any changes made to the html-report plugin to specifically accommodate this. There is some existing handling there, so if you'd like to suggest a change for this in the report, I think that's reasonable? I think we'd just need to change
Signed-off-by: sschulz92 <bastie92_spam@gmx.de>
Given this works, I'm going to err on the side of merging this and proceed. It's easy to follow up on the gauge-wide functional tests or the html-report after merging anyway, so let's do this 👍
With this PR I would like to introduce the feature to skip scenarios at runtime, related to this issue: #390
I have two questions, though:
1. How should this kind of processing be tested? I was unable to find a good way to verify that the exception is raised and handled correctly.
2. How can I set the step result to skipped=true? At the moment the feature works fine, but the step with the SkipScenarioException is shown as "passed" in the HTML report.
PS: Now with signed commit .. 😏