
Test failure with pytest 7: test_skip_with_failure_and_non_subskip[pytest-normal] #181

Closed · musicinmybrain opened this issue Dec 26, 2024 · 4 comments · Fixed by #182

@musicinmybrain (Contributor)

While the plugin advertises support for pytest 7.4 and later,

install_requires =
attrs>=19.2.0
pytest>=7.4

in practice, test_skip_with_failure_and_non_subskip[pytest-normal] fails with pytest 7.

You can reproduce this by temporarily modifying tox.ini:

diff --git a/tox.ini b/tox.ini
index 43c01ac..3906a11 100644
--- a/tox.ini
+++ b/tox.ini
@@ -4,6 +4,7 @@ envlist = py39,py310,py311,py312,py313
 [testenv]
 deps =
     pytest-xdist>=3.3.0
+    pytest<8
 
 commands =
     pytest {posargs:tests}

and then running tox:

FAILED tests/test_subtests.py::TestSubTest::test_skip_with_failure_and_non_subskip[pytest-normal] - Failed: remains unmatched: 'test_skip_with_failure_and_non_subskip.py::T::test_foo \\[custom message\\] \\(i=4\\) SUBFAIL .*'
============================================================================== 1 failed, 31 passed, 7 xfailed, 1 warning in 2.19s ==============================================================================
@nicoddemus (Member)

Thanks @musicinmybrain

But failing the test suite does not mean the plugin will fail to work with pytest 7.4, or does it?

@musicinmybrain (Contributor, Author)

> Thanks @musicinmybrain
>
> But failing the test suite does not mean the plugin will fail to work with pytest 7.4, or does it?

I came across this in the process of helping to update the python-pytest-subtests package in Fedora and backport it to EPEL10, which has pytest 7. The test failed and I had to skip it, so I reported it.

I’m not sure whether the test failure reflects a significant observable difference for users. I’m not deeply familiar with the plugin or what to expect from it, so it’s hard for me to know what I’m looking at.

The detailed output looks like:

$ tox -e py313
.pkg: _optional_hooks> python /usr/lib/python3.13/site-packages/pyproject_api/_backend.py True setuptools.build_meta
.pkg: get_requires_for_build_sdist> python /usr/lib/python3.13/site-packages/pyproject_api/_backend.py True setuptools.build_meta
.pkg: get_requires_for_build_wheel> python /usr/lib/python3.13/site-packages/pyproject_api/_backend.py True setuptools.build_meta
.pkg: prepare_metadata_for_build_wheel> python /usr/lib/python3.13/site-packages/pyproject_api/_backend.py True setuptools.build_meta
.pkg: build_sdist> python /usr/lib/python3.13/site-packages/pyproject_api/_backend.py True setuptools.build_meta
py313: install_package> python -I -m pip install --force-reinstall --no-deps /home/ben/src/forks/pytest-subtests/.tox/.tmp/package/11/pytest_subtests-0.14.2.dev5+g5a9dc3f.d20241226.tar.gz
py313: commands[0]> pytest tests
============================================================================================= test session starts ==============================================================================================
platform linux -- Python 3.13.0, pytest-7.4.4, pluggy-1.5.0
cachedir: .tox/py313/.pytest_cache
rootdir: /home/ben/src/forks/pytest-subtests
configfile: pytest.ini
plugins: xdist-3.6.1, subtests-0.14.2.dev5+g5a9dc3f.d20241226
collected 39 items                                                                                                                                                                                             

tests/test_subtests.py .................xxxxx...x.Fx..........                                                                                                                                           [100%]

=================================================================================================== FAILURES ===================================================================================================
______________________________________________________________________ TestSubTest.test_skip_with_failure_and_non_subskip[pytest-normal] _______________________________________________________________________

self = <test_subtests.TestSubTest object at 0x7f9e3731edb0>, pytester = <Pytester PosixPath('/tmp/pytest-of-ben/pytest-55/test_skip_with_failure_and_non_subskip1')>
monkeypatch = <_pytest.monkeypatch.MonkeyPatch object at 0x7f9e34f56ac0>, runner = 'pytest-normal'

    @pytest.mark.parametrize("runner", ["unittest", "pytest-normal", "pytest-xdist"])
    def test_skip_with_failure_and_non_subskip(
        self,
        pytester: pytest.Pytester,
        monkeypatch: pytest.MonkeyPatch,
        runner: Literal["unittest", "pytest-normal", "pytest-xdist"],
    ) -> None:
        monkeypatch.setenv("COLUMNS", "200")
        p = pytester.makepyfile(
            """
            import pytest
            from unittest import expectedFailure, TestCase, main

            class T(TestCase):
                def test_foo(self):
                    for i in range(10):
                        with self.subTest("custom message", i=i):
                            if i < 4:
                                self.skipTest(f"skip subtest i={i}")
                            assert i < 4
                    self.skipTest(f"skip the test")

            if __name__ == '__main__':
                main()
        """
        )
        if runner == "unittest":
            result = pytester.runpython(p)
            if sys.version_info < (3, 11):
                result.stderr.re_match_lines(
                    [
                        r"FAIL: test_foo \(__main__\.T\) \[custom message\] \(i=4\).*",
                        r"FAIL: test_foo \(__main__\.T\) \[custom message\] \(i=9\).*",
                        r"Ran 1 test in .*",
                        r"FAILED \(failures=6, skipped=5\)",
                    ]
                )
            else:
                result.stderr.re_match_lines(
                    [
                        r"FAIL: test_foo \(__main__\.T\.test_foo\) \[custom message\] \(i=4\).*",
                        r"FAIL: test_foo \(__main__\.T\.test_foo\) \[custom message\] \(i=9\).*",
                        r"Ran 1 test in .*",
                        r"FAILED \(failures=6, skipped=5\)",
                    ]
                )
        elif runner == "pytest-normal":   
            result = pytester.runpytest(p, "-v", "-rsf")
            # The `(i=0)` is not correct but it's given by pytest `TerminalReporter` without `--no-fold-skipped`
            result.stdout.re_match_lines( 
                [
                    r"test_skip_with_failure_and_non_subskip.py::T::test_foo \[custom message\] \(i=4\) SUBFAIL .*",
                    r"test_skip_with_failure_and_non_subskip.py::T::test_foo SKIPPED \(skip the test\)",
                    r"\[custom message\] \(i=0\) SUBSKIP \[1\] test_skip_with_failure_and_non_subskip.py:5: skip subtest i=3",
                    r"\[custom message\] \(i=0\) SUBSKIP \[1\] test_skip_with_failure_and_non_subskip.py:5: skip the test",
                    r"\[custom message\] \(i=4\) SUBFAIL test_skip_with_failure_and_non_subskip.py::T::test_foo",
                    r".* 6 failed, 5 skipped in .*",
                ]
            )
            # check with `--no-fold-skipped` (which gives the correct information)
            if sys.version_info >= (3, 10):
                result = pytester.runpytest(p, "-v", "--no-fold-skipped", "-rsf")
>               result.stdout.re_match_lines(
                    [
                        r"test_skip_with_failure_and_non_subskip.py::T::test_foo \[custom message\] \(i=4\) SUBFAIL .*",
                        r"test_skip_with_failure_and_non_subskip.py::T::test_foo SKIPPED \(skip the test\).*",
                        r"\[custom message\] \(i=3\) SUBSKIP test_skip_with_failure_and_non_subskip.py::T::test_foo - Skipped: skip subtest i=3",
                        r"SKIPPED test_skip_with_failure_and_non_subskip.py::T::test_foo - Skipped: skip the test",
                        r"\[custom message\] \(i=4\) SUBFAIL test_skip_with_failure_and_non_subskip.py::T::test_foo",
                        r".* 6 failed, 5 skipped in .*",
                    ]
                )
E               Failed: remains unmatched: 'test_skip_with_failure_and_non_subskip.py::T::test_foo \\[custom message\\] \\(i=4\\) SUBFAIL .*'

/home/ben/src/forks/pytest-subtests/tests/test_subtests.py:497: Failed
--------------------------------------------------------------------------------------------- Captured stdout call ---------------------------------------------------------------------------------------------
========================================================================================= test session starts ==========================================================================================
platform linux -- Python 3.13.0, pytest-7.4.4, pluggy-1.5.0 -- /home/ben/src/forks/pytest-subtests/.tox/py313/bin/python
cachedir: .pytest_cache
rootdir: /tmp/pytest-of-ben/pytest-55/test_skip_with_failure_and_non_subskip1
plugins: xdist-3.6.1, subtests-0.14.2.dev5+g5a9dc3f.d20241226
collecting ... collected 1 item

test_skip_with_failure_and_non_subskip.py::T::test_foo [custom message] (i=0) SUBSKIP (skip subtest i=0)                                                                                         [100%]
test_skip_with_failure_and_non_subskip.py::T::test_foo [custom message] (i=1) SUBSKIP (skip subtest i=1)                                                                                         [100%]
test_skip_with_failure_and_non_subskip.py::T::test_foo [custom message] (i=2) SUBSKIP (skip subtest i=2)                                                                                         [100%]
test_skip_with_failure_and_non_subskip.py::T::test_foo [custom message] (i=3) SUBSKIP (skip subtest i=3)                                                                                         [100%]
test_skip_with_failure_and_non_subskip.py::T::test_foo [custom message] (i=4) SUBFAIL                                                                                                            [100%]
test_skip_with_failure_and_non_subskip.py::T::test_foo [custom message] (i=5) SUBFAIL                                                                                                            [100%]
test_skip_with_failure_and_non_subskip.py::T::test_foo [custom message] (i=6) SUBFAIL                                                                                                            [100%]
test_skip_with_failure_and_non_subskip.py::T::test_foo [custom message] (i=7) SUBFAIL                                                                                                            [100%]
test_skip_with_failure_and_non_subskip.py::T::test_foo [custom message] (i=8) SUBFAIL                                                                                                            [100%]
test_skip_with_failure_and_non_subskip.py::T::test_foo [custom message] (i=9) SUBFAIL                                                                                                            [100%]
test_skip_with_failure_and_non_subskip.py::T::test_foo SKIPPED (skip the test)                                                                                                                   [100%]

=============================================================================================== FAILURES ===============================================================================================
__________________________________________________________________________________ T.test_foo [custom message] (i=4) ___________________________________________________________________________________

self = <test_skip_with_failure_and_non_subskip.T testMethod=test_foo>

    def test_foo(self):
        for i in range(10):
            with self.subTest("custom message", i=i):
                if i < 4:
                    self.skipTest(f"skip subtest i={i}")
>               assert i < 4
E               AssertionError: assert 4 < 4

test_skip_with_failure_and_non_subskip.py:10: AssertionError
__________________________________________________________________________________ T.test_foo [custom message] (i=5) ___________________________________________________________________________________

self = <test_skip_with_failure_and_non_subskip.T testMethod=test_foo>

    def test_foo(self):
        for i in range(10):
            with self.subTest("custom message", i=i):
                if i < 4:
                    self.skipTest(f"skip subtest i={i}")
>               assert i < 4
E               AssertionError: assert 5 < 4

test_skip_with_failure_and_non_subskip.py:10: AssertionError
__________________________________________________________________________________ T.test_foo [custom message] (i=6) ___________________________________________________________________________________

self = <test_skip_with_failure_and_non_subskip.T testMethod=test_foo>

    def test_foo(self):
        for i in range(10):
            with self.subTest("custom message", i=i):
                if i < 4:
                    self.skipTest(f"skip subtest i={i}")
>               assert i < 4
E               AssertionError: assert 6 < 4

test_skip_with_failure_and_non_subskip.py:10: AssertionError
__________________________________________________________________________________ T.test_foo [custom message] (i=7) ___________________________________________________________________________________

self = <test_skip_with_failure_and_non_subskip.T testMethod=test_foo>

    def test_foo(self):
        for i in range(10):
            with self.subTest("custom message", i=i):
                if i < 4:
                    self.skipTest(f"skip subtest i={i}")
>               assert i < 4
E               AssertionError: assert 7 < 4

test_skip_with_failure_and_non_subskip.py:10: AssertionError
__________________________________________________________________________________ T.test_foo [custom message] (i=8) ___________________________________________________________________________________

self = <test_skip_with_failure_and_non_subskip.T testMethod=test_foo>

    def test_foo(self):
        for i in range(10):
            with self.subTest("custom message", i=i):
                if i < 4:
                    self.skipTest(f"skip subtest i={i}")
>               assert i < 4
E               AssertionError: assert 8 < 4

test_skip_with_failure_and_non_subskip.py:10: AssertionError
__________________________________________________________________________________ T.test_foo [custom message] (i=9) ___________________________________________________________________________________

self = <test_skip_with_failure_and_non_subskip.T testMethod=test_foo>

    def test_foo(self):
        for i in range(10):
            with self.subTest("custom message", i=i):
                if i < 4:
                    self.skipTest(f"skip subtest i={i}")
>               assert i < 4
E               AssertionError: assert 9 < 4

test_skip_with_failure_and_non_subskip.py:10: AssertionError
======================================================================================= short test summary info ========================================================================================
[custom message] (i=0) SUBSKIP [1] test_skip_with_failure_and_non_subskip.py:5: skip subtest i=0
[custom message] (i=0) SUBSKIP [1] test_skip_with_failure_and_non_subskip.py:5: skip subtest i=1
[custom message] (i=0) SUBSKIP [1] test_skip_with_failure_and_non_subskip.py:5: skip subtest i=2
[custom message] (i=0) SUBSKIP [1] test_skip_with_failure_and_non_subskip.py:5: skip subtest i=3
[custom message] (i=0) SUBSKIP [1] test_skip_with_failure_and_non_subskip.py:5: skip the test
[custom message] (i=4) SUBFAIL test_skip_with_failure_and_non_subskip.py::T::test_foo - AssertionError: assert 4 < 4
[custom message] (i=5) SUBFAIL test_skip_with_failure_and_non_subskip.py::T::test_foo - AssertionError: assert 5 < 4
[custom message] (i=6) SUBFAIL test_skip_with_failure_and_non_subskip.py::T::test_foo - AssertionError: assert 6 < 4
[custom message] (i=7) SUBFAIL test_skip_with_failure_and_non_subskip.py::T::test_foo - AssertionError: assert 7 < 4
[custom message] (i=8) SUBFAIL test_skip_with_failure_and_non_subskip.py::T::test_foo - AssertionError: assert 8 < 4
[custom message] (i=9) SUBFAIL test_skip_with_failure_and_non_subskip.py::T::test_foo - AssertionError: assert 9 < 4
===================================================================================== 6 failed, 5 skipped in 0.01s =====================================================================================
--------------------------------------------------------------------------------------------- Captured stderr call ---------------------------------------------------------------------------------------------
ERROR: usage: pytest [options] [file_or_dir] [file_or_dir] [...]
pytest: error: unrecognized arguments: --no-fold-skipped
  inifile: None
  rootdir: /tmp/pytest-of-ben/pytest-55/test_skip_with_failure_and_non_subskip1

=============================================================================================== warnings summary ===============================================================================================
tests/test_subtests.py::TestSubTest::test_skip_with_failure_and_non_subskip[pytest-normal]
  /home/ben/src/forks/pytest-subtests/.tox/py313/lib/python3.13/site-packages/_pytest/config/__init__.py:331: PluggyTeardownRaisedWarning: A plugin raised an exception during an old-style hookwrapper teardown.
  Plugin: helpconfig, Hook: pytest_cmdline_parse
  UsageError: usage: pytest [options] [file_or_dir] [file_or_dir] [...]
  pytest: error: unrecognized arguments: --no-fold-skipped
    inifile: None
    rootdir: /tmp/pytest-of-ben/pytest-55/test_skip_with_failure_and_non_subskip1
  For more information see https://pluggy.readthedocs.io/en/stable/api_reference.html#pluggy.PluggyTeardownRaisedWarning
    config = pluginmanager.hook.pytest_cmdline_parse(

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
=========================================================================================== short test summary info ============================================================================================
XFAIL tests/test_subtests.py::TestSubTest::test_skip[pytest-normal] - reason: Not producing the expected results (#5)
XFAIL tests/test_subtests.py::TestSubTest::test_skip[pytest-xdist] - reason: Not producing the expected results (#5)
XFAIL tests/test_subtests.py::TestSubTest::test_xfail[unittest] - Not producing the expected results (#5)
XFAIL tests/test_subtests.py::TestSubTest::test_xfail[pytest-normal] - Not producing the expected results (#5)
XFAIL tests/test_subtests.py::TestSubTest::test_xfail[pytest-xdist] - Not producing the expected results (#5)
XFAIL tests/test_subtests.py::TestSubTest::test_skip_with_failure[pytest-xdist] - reason: Not producing the expected results (#5)
XFAIL tests/test_subtests.py::TestSubTest::test_skip_with_failure_and_non_subskip[pytest-xdist] - reason: Not producing the expected results (#5)
FAILED tests/test_subtests.py::TestSubTest::test_skip_with_failure_and_non_subskip[pytest-normal] - Failed: remains unmatched: 'test_skip_with_failure_and_non_subskip.py::T::test_foo \\[custom message\\] \\(i=4\\) SUBFAIL .*'
============================================================================== 1 failed, 31 passed, 7 xfailed, 1 warning in 2.19s ==============================================================================
py313: exit 1 (2.34 seconds) /home/ben/src/forks/pytest-subtests> pytest tests pid=358429
  py313: FAIL code 1 (5.01=setup[2.67]+cmd[2.34] seconds)
  evaluation failed :( (5.06 seconds)

nicoddemus added commits to nicoddemus/pytest-subtests referencing this issue Dec 26, 2024
@nicoddemus (Member)

Ahh, I see. Thanks for the context!

That particular test indeed uses a flag (--no-fold-skipped) which is only available in pytest 8.3 and later. I updated the test and added pytest 7.4 to the build matrix to ensure future compatibility (#182).
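
For reference, the failing check is currently gated on sys.version_info (the Python version), while --no-fold-skipped depends on the pytest version. A guard along these lines would skip that part of the test on older pytest (a minimal sketch of one option, not necessarily what #182 does; pytest.version_tuple has been available since pytest 7.0):

    # Sketch: run the --no-fold-skipped assertions only when the installed
    # pytest actually supports the flag (it was added in pytest 8.3).
    # `pytester` and `p` are the fixture and test file from the test above.
    import pytest

    if pytest.version_tuple >= (8, 3):
        result = pytester.runpytest(p, "-v", "--no-fold-skipped", "-rsf")
        result.stdout.re_match_lines(
            [
                r".* 6 failed, 5 skipped in .*",
            ]
        )

On pytest 7 the guarded block simply never runs, so the suite no longer passes an argument the argument parser rejects.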

Thanks again!

@musicinmybrain (Contributor, Author)

Thank you! With #182, the plugin’s tests all pass for me, both in EPEL10 with pytest 7.4.3 and in the latest Fedora with pytest 8.3.4.
