Replace requests_mock with responses #2165
Conversation
Force-pushed from f65224c to 4777aca
Codecov Report

@@           Coverage Diff           @@
##              dev    #2165   +/-  ##
=======================================
  Coverage   71.99%   71.99%
=======================================
  Files          78       78
  Lines        8385     8385
=======================================
  Hits         6037     6037
  Misses       2348     2348
Force-pushed from 418399c to aa770e0
Responses actually throws an error if all the mocks aren't used, which let me find and clean up a few tests that weren't actually using their mocks.
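A minimal sketch of that strict behaviour, assuming the mock is set up with assert_all_requests_are_fired=True (it is opt-in when constructing RequestsMock directly); the URLs and test name are placeholders, not taken from this PR:

    import requests
    import responses

    def test_unused_mock_is_flagged():
        with responses.RequestsMock(assert_all_requests_are_fired=True) as rsps:
            rsps.add(responses.GET, "https://api.example.com/used", json={"ok": True})
            rsps.add(responses.GET, "https://api.example.com/never-called", json={})
            requests.get("https://api.example.com/used")
        # Exiting the block raises an error because the second mock was never
        # hit, which is how unused (and therefore unnecessary) mocks surface.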
Looks good! Thank you! I left some comments out of curiosity 😄
Also, to add to the discussion: today we were talking with @fabianegli about adding pytest fixtures to modularize the set_up() for different tests, which might be related to this mocking. I think the two can complement each other, but maybe Fabian can elaborate on his idea.
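A rough sketch of what such a fixture could look like, reusing the ModuleCreate call from this diff; the fixture name and the pipeline_dir fixture it depends on are hypothetical, not an agreed design:

    import pytest
    import nf_core.modules

    @pytest.fixture
    def trimgalore_module(pipeline_dir):
        # Hypothetical fixture standing in for a repeated set_up(): it creates
        # the module once and hands the pipeline directory to any test that
        # requests it.
        module_create = nf_core.modules.ModuleCreate(
            pipeline_dir, "trimgalore", "@author", "process_single", True, True, conda_name="trim-galore"
        )
        module_create.create()
        return pipeline_dir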
with requests_mock.Mocker() as mock:
    subworkflow_create = nf_core.subworkflows.SubworkflowCreate(root_dir, "test_subworkflow", "@author", True)
    subworkflow_create.create()
    # TODO Add a mock here
Was there an error?
There wasn't an error, but responses throws an error if a mock doesn't get used (meaning it isn't needed), and when I added the mock calls, neither of them was used. So I just removed the mock there for now.
module_create = nf_core.modules.ModuleCreate(
    self.pipeline_dir, "trimgalore", "@author", "process_single", True, True, conda_name="trim-galore"
)
module_create.create()
with requests_cache.disabled():
Which cache is this affecting? Wondering if it ignores the cloning of the modules repository and then we have to clone it again for every test, but it doesn't seem so when I try to run the test :)
No, it's not the cloning of the repository; it's any requests made inside that code, for example when it hits Anaconda here. The cache was masking the mock, which is what I was stuck on months ago: responses complained that the "mock wasn't used" because the request was being served from the cache.
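A minimal sketch of the interaction being described, assuming requests-cache has been installed globally (e.g. via requests_cache.install_cache()); the Anaconda URL and test name are illustrative placeholders:

    import requests
    import requests_cache
    import responses

    @responses.activate
    def test_anaconda_query_is_mocked():
        responses.add(
            responses.GET,
            "https://api.anaconda.org/package/bioconda/trim-galore",
            json={"name": "trim-galore"},
        )
        with requests_cache.disabled():
            # With the cache bypassed, the request actually reaches the mock
            # instead of a cached entry, so strict "all mocks used" checking
            # no longer misfires.
            r = requests.get("https://api.anaconda.org/package/bioconda/trim-galore")
        assert r.json()["name"] == "trim-galore"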
I see, that's interesting!
Looking forward to it! I think responses has some cool features, just wanted to keep this PR 1-to-1 with the current implementation!
I'll wait to open the PR until this one is merged. It will save some time otherwise spent on merge conflict resolution. Sorry for the wait, @emiller88
PR checklist

- CHANGELOG.md is updated
- docs is updated

Just some yak shaving. responses has some nice features that might improve testing in the future, such as recording responses to files.
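A sketch of that recording feature as documented in recent responses releases; the _recorder module was still marked beta at the time, so the exact API may have changed, and the URL and file path below are placeholders:

    import requests
    from responses import _recorder

    # Records the real HTTP traffic made inside the test to a TOML file,
    # which can later be registered back onto a RequestsMock and replayed
    # offline (beta API at the time of writing).
    @_recorder.record(file_path="tests/data/recorded_responses.toml")
    def test_record_real_traffic():
        requests.get("https://api.example.com/endpoint")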