
Skip test scenarios at runtime #397

Merged
merged 4 commits into getgauge:master on Oct 7, 2024

Conversation

sschulz92
Contributor

With this PR I would like to introduce a feature to skip scenarios at runtime, related to issue #390.

I have two questions, though:

1. How should this kind of processing be tested? I was unable to find a good way to verify that the exception is raised and handled correctly.
2. How can I set the step result to skipped=true? At the moment the feature works fine, but the step that raises the SkipScenarioException is shown as "passed" in the HTML report.
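For illustration, here is a rough sketch of the intended usage; the exception's import path is an assumption for this sketch, not necessarily the final public API, and the precondition check is a hypothetical stand-in:

from getgauge.python import step
# Assumed import path for the exception this PR introduces; illustrative only.
from getgauge.exceptions import SkipScenarioException


def feature_flag_disabled():
    # Hypothetical precondition standing in for a real runtime check.
    return True


@step("Verify the beta workflow")
def verify_beta_workflow():
    if feature_flag_disabled():
        # Raising the exception skips the current scenario at runtime;
        # the remaining steps in the scenario are not executed.
        raise SkipScenarioException("Beta workflow disabled, skipping scenario")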

PS: Now with signed commit .. 😏

Signed-off-by: sschulz92 <bastie92_spam@gmx.de>
@chadlwilson chadlwilson linked an issue Oct 1, 2024 that may be closed by this pull request
@sschulz92 sschulz92 changed the title from "Skip test scenarions at runtime" to "Skip test scenarios at runtime" on Oct 1, 2024
@chadlwilson
Contributor

I'm not deeply familiar with the implementation but I think what you've done should work.

Are you saying that functionally it skips the scenario as you'd expect, but it doesn't show up as skipped in the overall report statistics? Or that it shows up as skipped in the scenario counts, but the individual step which issued the programmatic skip isn't marked as such?

@chadlwilson
Contributor

chadlwilson commented Oct 2, 2024

> How should this kind of processing be tested? I was unable to find a good way to verify that the exception is raised and handled correctly.

Other than unit testing the expected results when the exception is raised, the "proper" way would be for us to add such a scenario into one of the Gauge functional tests (a Java Gauge project is used to test Gauge and various plugins together), and enable it for certain languages using tags:

e.g. similar to
https://github.com/getgauge/gauge-tests/blob/master/specs/advanced_topics/continue_on_failure/continue_on_failure.spec

I'm also not deeply familiar with these, but I believe the Java code at https://github.com/getgauge/gauge-tests/blob/master/src/test/java/com/thoughtworks/gauge/test/common/PythonProject.java would need to be enhanced to tell the tests how to generate a step that throws your exception, etc. Sadly this wasn't done for dotnet/csharp, so there's no pattern to follow :-/

For the basic unit tests, we should be able to add a test step impl that throws your exception, right after this existing test:

def test_Processor_failed_execute_step_request_with_continue_on_failure(self):
    registry.add_step('Step 4', failing_impl, '')
    registry.continue_on_failure(failing_impl, [IndexError])
    request = ExecuteStepRequest()
    request.parsedStepText = 'Step 4'
    response = processor.process_execute_step_request(request)
    self.assertIsInstance(response, ExecutionStatusResponse)
    self.assertTrue(response.executionResult.failed)
    self.assertEqual(ProtoExecutionResult.ASSERTION,
                     response.executionResult.errorType)
    self.assertNotEqual('', response.executionResult.errorMessage)
    self.assertNotEqual('', response.executionResult.stackTrace)
    self.assertTrue(response.executionResult.recoverableError)
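A sketch of what that might look like, where skipping_impl is a hypothetical step implementation that raises the new exception, and the asserted flag name is a guess that should be checked against the ExecutionResult proto:

def test_Processor_execute_step_request_with_skip_scenario(self):
    # skipping_impl is hypothetical: a step implementation that raises the
    # new SkipScenarioException; registry/processor usage mirrors the test above.
    registry.add_step('Step 5', skipping_impl, '')
    request = ExecuteStepRequest()
    request.parsedStepText = 'Step 5'
    response = processor.process_execute_step_request(request)
    self.assertIsInstance(response, ExecutionStatusResponse)
    # A programmatic skip should not be reported as a failure.
    self.assertFalse(response.executionResult.failed)
    # Field name is an assumption based on the SkipScenario flag discussed
    # below; verify the exact name against the proto definition.
    self.assertTrue(response.executionResult.skipScenario)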

@gaugebot

gaugebot bot commented Oct 2, 2024

@sschulz92 Thank you for contributing to gauge-python. Your pull request has been labeled as a release candidate 🎉🎉.

Merging this PR will trigger a release.

Please bump up the version as part of this PR.

Instructions to bump the version can be found at CONTRIBUTING.md.

If the CONTRIBUTING.md file does not exist or does not include instructions about bumping the version, please look at previous commits in the git history to see what changes need to be made.

@chadlwilson
Contributor

chadlwilson commented Oct 2, 2024

Signed-off-by: Chad Wilson <chadw@thoughtworks.com>
@chadlwilson chadlwilson requested a review from sriv October 2, 2024 03:54
@chadlwilson
Contributor

Hello @PiotrNestor - would you care to weigh in on whether you feel this is the right way to approach extending your feature to Python?

If you can comment on whether the step that threw the exception is expected to be displayed/highlighted in the report (possibly not in the current impl, which is at scenario level?), that'd also be useful.

@sschulz92
Contributor Author

> I'm not deeply familiar with the implementation but I think what you've done should work.
>
> Are you saying that functionally it skips the scenario as you'd expect, but it doesn't show up as skipped in the overall report statistics? Or that it shows up as skipped in the scenario counts, but the individual step which issued the programmatic skip isn't marked as such?

The statistics are absolutely correct and the scenarios are counted as "skipped", but the step that raises the SkipScenarioException is shown as passed. The following steps are, correctly, shown as not executed.

@chadlwilson
Contributor

> > I'm not deeply familiar with the implementation but I think what you've done should work.
> >
> > Are you saying that functionally it skips the scenario as you'd expect, but it doesn't show up as skipped in the overall report statistics? Or that it shows up as skipped in the scenario counts, but the individual step which issued the programmatic skip isn't marked as such?
>
> The statistics are absolutely correct and the scenarios are counted as "skipped", but the step that raises the SkipScenarioException is shown as passed. The following steps are, correctly, shown as not executed.

Ahh right. I think that might be expected, as this feature rode on top of Gauge's earlier capability to skip a scenario declaratively, and I don't see any changes made to the html-report plugin to specifically accommodate this. There is an existing skipped flag in the StepExecutionResult (higher level); however, the newer SkipScenario flag in the ExecutionResult that comes from plugins does not seem to be used anywhere in the html-report.

If you'd like to suggest a change for this in the report, I think that's reasonable? I think we'd just need to change
https://github.com/getgauge/html-report/blob/19ee5a6ba70c454958bcdffa9b62c9675d20dabf/generator/transform.go#L619-L630 and consider whether we need to distinguish visually between a step that programmatically skips vs. other reasons a step could be skipped.

Signed-off-by: sschulz92 <bastie92_spam@gmx.de>
@chadlwilson
Contributor

Given this works, I'm going to err on the side of merging this and proceed. It's easy to follow up on the gauge-wide functional tests or the html-report after merging anyway, so let's do this 👍

@chadlwilson chadlwilson merged commit 8ed4e4a into getgauge:master Oct 7, 2024
15 checks passed
@chadlwilson
Contributor

https://github.com/getgauge/gauge-python/releases/tag/v0.4.7

Development

Successfully merging this pull request may close these issues.

Add support to skip scenarios on runtime
2 participants