Raise an error on Airflow Deprecation warnings in tests #38504
Conversation
Not related to this PR, just for follow-up: I found a place where tests rely on masking "test" in logs. It has flaky behaviour and depends on which xdist worker the test lands on:

FAILED tests/providers/google/cloud/hooks/test_dataflow.py::TestDataflow::test_dataflow_wait_for_done_logging - AssertionError: assert 'Running command: test cmd' in ['Running command: *** cmd', 'Start waiting for Apache Beam process to complete.', 'Waiting for Apache Beam process to complete.', '***-stdout', 'Process exited with return code: 1']

= 1 failed, 10732 passed, 8670 skipped, 2 xfailed, 5331 warnings in 403.14s (0:06:43) =

Easiest fix -> turn it into a DB test.
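A minimal sketch of that suggested fix, assuming the module follows the common Airflow convention of marking whole test modules as DB tests (the `db_test` marker name is taken from the Airflow test suite; applying it at module level here is illustrative):

```python
import pytest

# Illustrative: at the top of tests/providers/google/cloud/hooks/test_dataflow.py,
# marking the whole module as a DB test moves it into the DB-test group instead
# of the plain xdist workers, where the log-masking behaviour was flaky.
pytestmark = pytest.mark.db_test
```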
One thing that confuses me: why are the special tests skipped? 🤔
We are currently skipping them for regular runs - in one of the recent optimizations I changed them to be run only: a) in canary builds. I figured that we will notice it relatively quickly when one of those special tests is broken (like I did with the Pydantic test a few moments ago), and that it will usually require a maintainer stepping in anyway. We have quite a number of those special tests and they add more than 2x the number of tests for regular PRs (most of them run only for Python 3.8, so just 4 sets of tests) - so it's a really nice trade-off to only run special tests for those "special" PRs and rely on the "catch-all" in canary runs.
So the way to trigger the special tests is to add "upgrade to newer dependencies". Let me do it now.
BTW, this one - #38604 - will turn such "skipped" special tests into a single line in the output, rather than showing all the skipped sub-workflows.
Just realized: maybe we should add a special if for just "full tests needed" - that would also be a better criterion / way to trigger the special tests.
Added it to #38604 - once we merge it,
* POC: Raise an error on Airflow Deprecation warnings in tests
* Add fixture for checking deprecations
* Ignore providers tests per test
* Revert changes in test_pod.py
* Add categories for tests, and evaluate most of the tests
* Exclude TestSetDagRunNote::test_should_respond_200_with_anonymous_user
* Check other groups
* Ignore FAB / Connexion 2 specific errors
* Remove non-compliant DbApiHook deprecations
* Disable builtin warning summary and beautify output
* Add information about prohibited warnings
* Add to ignore additional tests from tests/models/test_taskinstance.py::TestTaskInstance

(cherry picked from commit 50a4c95)
Experiment to show how many tests do not handle deprecation warnings.

After spending some time I've collected all the tests which raise one of the selected warnings during test runs:

- airflow.exceptions.AirflowProviderDeprecationWarning
- airflow.exceptions.RemovedInAirflow3Warning
- airflow.utils.context.AirflowContextDeprecationWarning
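For context, pytest already lets a single test opt in to failing on one of these warnings via its built-in `filterwarnings` marker - a minimal sketch (the test name is hypothetical; this PR applies the rule suite-wide instead):

```python
import pytest

# Escalate one of the selected deprecation warnings to an error for this
# test only; the "error::<category>" filter string is standard pytest syntax.
@pytest.mark.filterwarnings("error::airflow.exceptions.RemovedInAirflow3Warning")
def test_no_removed_in_airflow3_warnings():  # hypothetical test name
    ...
```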
With some hacks and tricks I've enabled validation that these warnings are not raised (e.g. not captured by pytest.warns). Unfortunately this required slightly changing the current warning collection to use only warnings.catch_warnings and the default warnings.filterwarnings / warnings.simplefilter.
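A minimal sketch of what such a validation fixture could look like, assuming the three warning classes listed above (an illustration of the approach, not the exact fixture added in this PR):

```python
import warnings

import pytest

from airflow.exceptions import (
    AirflowProviderDeprecationWarning,
    RemovedInAirflow3Warning,
)
from airflow.utils.context import AirflowContextDeprecationWarning

PROHIBITED_WARNINGS = (
    AirflowProviderDeprecationWarning,
    RemovedInAirflow3Warning,
    AirflowContextDeprecationWarning,
)


@pytest.fixture(autouse=True)
def check_deprecations():
    # Escalate the selected deprecation warnings to errors for the duration
    # of each test; all other warnings keep their configured behaviour.
    with warnings.catch_warnings():
        for category in PROHIBITED_WARNINGS:
            warnings.simplefilter("error", category=category)
        yield
```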
Some side effects might happen: changing one part can affect other parts that are not exercised by the tests, temporarily breaking main as a result. The solution for this one is pretty simple: make the change as a follow-up commit / revert the commit / add the test to the exclusion list tests/deprecations_ignore.yml (last resort).
Simple snippet to collect the list of tests from the pytest error summary (the snippet itself is not included in this excerpt; a sketch of the idea follows):
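A minimal sketch of such a snippet, assuming the pytest output is piped in and failing test ids appear in the short summary as "FAILED <test_id> - <message>" (the original snippet is not shown here):

```python
import re
import sys

# Extract test ids from pytest's short summary lines, e.g.
#   FAILED tests/models/test_taskinstance.py::TestTaskInstance::test_xcom - ...
SUMMARY_LINE = re.compile(r"^(?:FAILED|ERROR)\s+(\S+?)(?:\s+-\s.*)?$")


def collect_test_ids(lines):
    for line in lines:
        match = SUMMARY_LINE.match(line.strip())
        if match:
            yield match.group(1)


if __name__ == "__main__":
    # Usage: pytest ... | python collect_failed_tests.py
    for test_id in collect_test_ids(sys.stdin):
        print(test_id)
```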
- Read the Pull Request Guidelines for more information.
- In case of fundamental code changes, an Airflow Improvement Proposal (AIP) is needed.
- In case of a new dependency, check compliance with the ASF 3rd Party License Policy.
- In case of backwards incompatible changes please leave a note in a newsfragment file, named {pr_number}.significant.rst or {issue_number}.significant.rst, in newsfragments.