module, tests: set up VCR.py for plugin example tests (#1853)

Conversation
Force-pushed from cfb201f to 648dead.
I'm not convinced by this, even if I can see the appeal and reasoning.
My main concern is that this PR adds an untested feature in a hotfix branch. All it adds is a dev dependency and a change to how Travis runs the tests, with nothing else. Besides, I don't know if the point is to generate cassette files and publish them in git or not; that's unclear to me. So it leaves a lot to be defined, and, again, I'm not thrilled by that for a hotfix branch.
I don't understand why we can't have an `--offline` tag for Travis. That'd cover the same problem with fewer changes. I say that because we know why tests fail on Travis:
- network error (random, nothing to do with the code)
- search engine having a hiccup (random, nothing to do with the code)
- search engine changing behavior for good (can't be foreseen)
If someone is bothered by online tests locally, fine: use `--offline`.
If we don't want Travis to ever fail on them, fine: use `--offline` for it.
By default, we still run all the tests, so it's not like we can't see the third issue (which is the real one we need to be aware of).
What's your end goal here with VCR.py? How do you want to use it exactly? Why can't we just nuke these goddamn search engine tests nobody cares about?
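For what it's worth, the `--offline` idea is simple to wire up with pytest hooks. A minimal sketch, assuming network-touching tests carry an `online` marker; the option name and marker are assumptions, not Sopel's actual test configuration:

```python
# conftest.py sketch for a hypothetical --offline flag.
# Assumes tests that hit the network are marked with ``online``.

def pytest_addoption(parser):
    parser.addoption(
        "--offline", action="store_true", default=False,
        help="skip tests that need network access",
    )

def pytest_collection_modifyitems(config, items):
    if not config.getoption("--offline"):
        return  # default: run everything, online tests included
    import pytest  # imported lazily; only needed when skipping
    skip_online = pytest.mark.skip(reason="--offline given")
    for item in items:
        if "online" in item.keywords:
            item.add_marker(skip_online)
```

Travis would then invoke `pytest --offline` (or not), with no new dependencies.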
Quick note about dependencies:
```
vcrpy==2.1.1; python_version == '2.7'
vcrpy<1.12.0; python_version == '3.3'
vcrpy<2.1.0; python_version == '3.4'
vcrpy<3.0.0; python_version >= '3.5'
```
This is bonkers. That's a big pile of dependency hell for us to handle in the future. To me, it's yet another pile of stuff that can break on Travis, and I'm not sure I like that.
I didn't mean for this to become such a wall of text… Guess I'm trying to cover as many bases as possible in one reply. x)
It's tested locally; I just didn't publish any cassette files with this PR. I debated whether or not to include a single set of example cassette files, and chose not to. Maybe it was the wrong move, because without them it's hard to see what this does? You can still push for delaying this until 7.1, though. Not doing a change like this in a hotfix branch is a totally valid argument! I'm just eager to put this into play.
As intended: To record known-good responses from problematic (eventually, probably all) remote HTTP services/APIs and essentially separate our unit testing from our integration testing. That is, given the expected response, will the plugin produce the expected output? That's the most important thing.
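That unit/integration split can be illustrated with a toy, stdlib-only example: the "cassette" string below stands in for a recorded search-engine response, and the parser is a hypothetical stand-in for a plugin's parsing code, not Sopel's real API:

```python
# Toy illustration of testing parsing against a recorded response
# instead of a live fetch. Names and markup are made up for the sketch.
RECORDED_RESPONSE = '<span class="st">Example result snippet</span>'

def parse_first_snippet(html):
    """Hypothetical stand-in for a search plugin's result parser."""
    start = html.find('class="st">')
    if start == -1:
        return None  # markup changed or no results: nothing to extract
    start += len('class="st">')
    end = html.find("</span>", start)
    return html[start:end]
```

Given the recorded response, the test asserts the parser's output without ever touching the network; only a markup change in the cassette itself could make it fail.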
If Travis skips tests, they're not included in coverage reports. But beyond that, using VCR solves the transient errors you described in bullets 1 & 2 by always replaying the exact response we expect for a given test. Bullet 3 (a test failing because the SERP changed) never actually happened, that I can recall, and if it does happen, someone will notice and report it.

You agreed on IRC that "we should mock the result pages so we're testing the 'parsing' code only, and set up separate alerts outside the test suite for if the markup changes." I just haven't started on that second piece yet, but the VCR cassettes could help with it by giving us a saved baseline to compare against the live response. We really just need a separate test that fetches the remote resource and compares the live response with the saved one, but skips throwing an error if the mismatch is (for example) a "No results" page when we know there will always be at least one result, or a network/HTTP error that isn't the client's fault.
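That drift check might look something like this sketch; the function name and tolerance rules are illustrative assumptions, with the live fetch injected so it can be replaced or stubbed:

```python
# Hypothetical "markup drift" check: compare a live response against
# the saved cassette baseline, but tolerate transient problems that
# aren't the client's fault.
import urllib.error

def check_markup_drift(fetch_live, saved_body):
    """Return 'ok', 'drift', or 'transient' for a saved baseline."""
    try:
        live = fetch_live()
    except (urllib.error.URLError, OSError):
        return "transient"  # network hiccup: not a real failure
    if not live.strip():
        return "transient"  # e.g. an empty/"No results" fluke
    return "ok" if live == saved_body else "drift"
```

Only a 'drift' result would page anyone; 'transient' outcomes get logged and retried rather than failing the build.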
The point of pinning versions for each Python release is so things don't unexpectedly break on Travis. Everything's pinned to a known-good version for all of our supported Python releases, and we can delete most of this mess (and a bunch more across both requirements files) in 8.0 when we ourselves drop EOL Python versions. If you'd be more comfortable with exact versions to even further reduce the possibility of unexpected updates breaking builds, I can use
Never thought you of all people would ever suggest removing tests. 😹
Pushing back to 7.1; updating target branch and milestone. Once 7.0.2 is released and merged up to
Rebased. Also musing that perhaps the new arg for
* Add new `vcr` kwarg to `module.example` decorator
  * Considered making this part of `online`, but decided to keep it separate as it would be a pretty big behavior change
* Add PyPI requirements for implementing VCR
  * This involved some trial-and-error, letting Travis flag up incompatibilities on old Python releases I can't install locally
* Configure default cassette directory for VCR via new conftest fixture
Known merge conflict with current master in the `search` plugin (applied a fix here that's already applied there). Can merge manually to resolve.
Back on this after having to hotfix a687ab7 today.
That's very interesting, failing to match requests on Python 3.3. Not sure I'll dig into it just now, but that's weird.
Force-pushed from 4432266 to 24ff5f7.
Py3.3 issue fixed with a second conftest fixture.
Description

* Add new `vcr` kwarg to `module.example` decorator
  * Considered making this part of `online`, but decided to keep it separate as it would be a pretty big behavior change

Checklist

* `make qa` (runs `make quality` and `make test`)

Notes
I'd like to ship this with 7.0.2 and make an exception to our usual policy about not adding new features in maintenance releases. The main reason: It will let me keep more of my hair if we need to release more 7.0.x versions. 😹 Kicking Travis builds is really getting tiresome. We already merge maintenance branches back to `master` after every maintenance release anyway, so this would make its way into 7.1.0-dev right quick (within the first week of May).

Keeping in mind that 1) nobody really uses the example-tests outside of Sopel's core plugins and 2) this doesn't change the existing behavior anyway, even for the few projects that might be using them, the benefits of adding this sooner outweigh any potential downsides (IMO, obviously).
It also doesn't add any install dependencies; the only new reqs are in `dev-requirements`.