
test: Add test for iterator exhaustion with next #2273

Merged 1 commit into main from test/add-coverage-for-use-of-next-failing-on-iterator on Aug 10, 2023

Conversation

@matthewfeickert (Member) commented Aug 10, 2023

Description

Checklist Before Requesting Reviewer

  • Tests are passing
  • "WIP" removed from the title of the pull request
  • Selected an Assignee for the PR to be responsible for the log summary

Before Merging

For the PR Assignees:

  • Summarize commit messages into a comprehensive review of the PR
* Add test to tests/contrib/test_viz.py to cover the case where the
  wrong labels are somehow returned in brazil.plot_results, causing the
  legend label ordering logic to fail.
* Amends PR #2264

@matthewfeickert added labels tests, pytest, contrib (Targeting pyhf.contrib and not the core of pyhf) on Aug 10, 2023
@matthewfeickert self-assigned this on Aug 10, 2023
Comment on lines +197 to +203
get_legend_handles_labels = mocker.patch(
    "matplotlib.axes._axes.Axes.get_legend_handles_labels",
    return_value=(None, ["fail"]),
)

with pytest.raises(StopIteration):
    brazil.plot_results(data["testmus"], data["results"], test_size=0.05, ax=ax)
matthewfeickert (Member Author)

This really just triggers the error to cover the iterator exhaustion case for next, and so is somewhat of a bad test, but otherwise we'd need to add a custom exception here and I'm not sure that is worth it.
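In isolation, the iterator exhaustion behavior of next that this test exercises looks like the following (a minimal sketch, independent of pyhf):

```python
labels = ["fail"]  # a labels list that does not contain the expected entry

# next() with no default raises StopIteration when the generator is
# exhausted without producing a match -- the failure mode under test.
try:
    next(label for label in labels if label == "expected")
    raised = False
except StopIteration:
    raised = True

# Passing a default as the second argument to next() sidesteps the exception.
fallback = next((label for label in labels if label == "expected"), "missing")
```

Here `raised` ends up `True` and `fallback` is `"missing"`, which is why `pytest.raises(StopIteration)` is the right assertion for the mocked wrong-labels case.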

@codecov bot commented Aug 10, 2023

Codecov Report

Patch coverage is unchanged and project coverage changed by +0.02% 🎉

Comparison is base (6f8ad2e) 98.27% compared to head (4151939) 98.30%.

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #2273      +/-   ##
==========================================
+ Coverage   98.27%   98.30%   +0.02%     
==========================================
  Files          69       69              
  Lines        4534     4534              
  Branches      801      801              
==========================================
+ Hits         4456     4457       +1     
  Misses         45       45              
+ Partials       33       32       -1     
Flag             Coverage Δ
contrib          97.88% <ø> (+0.02%) ⬆️
doctest          61.07% <ø> (ø)
unittests-3.10   96.31% <ø> (ø)
unittests-3.11   96.31% <ø> (ø)
unittests-3.8    96.33% <ø> (ø)
unittests-3.9    96.36% <ø> (ø)

Flags with carried forward coverage won't be shown.

see 1 file with indirect coverage changes


@kratsg kratsg merged commit 852a224 into main Aug 10, 2023
19 checks passed
@kratsg kratsg deleted the test/add-coverage-for-use-of-next-failing-on-iterator branch August 10, 2023 17:09
@kratsg (Contributor) commented Aug 10, 2023

I'm ok with the test looking like this, since it doesn't necessarily mean an impact on actual user-facing code.
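For context, the custom-exception alternative mentioned in the review thread could look something like the sketch below. This is purely illustrative; `InvalidLabelsError` and `find_label_index` are hypothetical names, not part of pyhf:

```python
class InvalidLabelsError(Exception):
    """Hypothetical error for when an expected legend label is missing."""


def find_label_index(labels, wanted):
    # Wrap next()'s StopIteration in a domain-specific exception so callers
    # see a clearer error than a bare iterator exhaustion.
    try:
        return next(idx for idx, label in enumerate(labels) if label == wanted)
    except StopIteration:
        raise InvalidLabelsError(
            f"label {wanted!r} not found in {labels!r}"
        ) from None
```

With this approach the test would assert `pytest.raises(InvalidLabelsError)` instead of `StopIteration`, at the cost of maintaining an extra exception class.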

matthewfeickert added a commit that referenced this pull request Aug 16, 2023
Labels
contrib (Targeting pyhf.contrib and not the core of pyhf), tests, pytest
Projects
Status: Done
2 participants