Error on xarray warnings in tests? #7164
Comments
There is still the problem that we support several outdated backends that will raise warnings, like pydap or the outdated Version handling. This will also happen in the min-deps builds.
Yes, good point, and I should have added above: it's still possible to ignore warnings explicitly at the test site.
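(For reference, one way to do that with pytest is the `filterwarnings` marker, which scopes the suppression to a single test. A minimal sketch, where the warning message and test are hypothetical placeholders for real third-party noise:)

```python
import warnings

import pytest


# Hypothetical example: silence one known third-party warning for this
# test only, instead of ignoring it suite-wide.
@pytest.mark.filterwarnings("ignore:old backend in use:DeprecationWarning")
def test_with_noisy_backend():
    # Stands in for a warning emitted by an outdated backend.
    warnings.warn("old backend in use", DeprecationWarning)
```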
I am a bit skeptical: we test a large range of versions and the kinds of warnings we get can be quite different. I fear that we would get too many false positives and end up wrapping everything with warning filters.
Yes, very fair. Though to confirm: is this a concern for the "Warnings from the min-versions test", rather than for warnings that xarray issues? Or for both?
Yes, I mean the external ones. I agree that it is a good idea to disallow internal warnings (sometimes PRs introduce them without realizing...). And of course I also agree that we should act when the deprecation warnings are thrown, not when the stuff is actually removed.
We now disallow internal warnings, with a list of exceptions, so closing as completed.
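(For context, the usual mechanism for "disallow with a list of exceptions" is pytest's `filterwarnings` ini option: escalate every warning to an error, then carve out explicit exceptions. A minimal sketch assuming a pyproject.toml setup; the ignore entries are illustrative placeholders, not xarray's actual list:)

```toml
# pyproject.toml -- illustrative, not xarray's actual configuration
[tool.pytest.ini_options]
filterwarnings = [
    "error",  # any warning not matched below fails the test
    # explicit exceptions for third-party noise we cannot fix ourselves:
    "ignore::DeprecationWarning:pydap",
    "ignore:distutils Version classes are deprecated:DeprecationWarning",
]
```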
What is your issue?
We've done a superb job of cutting the number of warnings in #3266.
On another project I've been spending time with recently, we raise an error on any warnings in the test suite. It's easy mode there, since the dependencies are locked (it's not python...), but I wonder whether we can go some of the way toward that here:
Would it be worth failing on:
- Warnings that xarray issues, i.e. any warning raised by our own code would cause main to fail. When we deprecate something, we'd update calling code with it.
- Warnings from the min-versions test, i.e. a warning raised by a minimum-version dependency would cause main to fail.
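(Under such an error-on-warnings policy, tests that deliberately exercise deprecated code paths would assert the warning explicitly rather than let it escalate to a failure. A minimal sketch with a hypothetical deprecated function, not an actual xarray API:)

```python
import warnings

import pytest


def old_api():
    # Hypothetical stand-in for a deprecated function.
    warnings.warn("old_api is deprecated; use new_api", FutureWarning)
    return 42


def test_old_api_warns():
    # pytest.warns both asserts that the deprecation fires and
    # prevents it from escalating under a warnings-as-errors filter.
    with pytest.warns(FutureWarning, match="old_api is deprecated"):
        assert old_api() == 42
```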