support of newer pypy versions #92

Open
15r10nk opened this issue Nov 7, 2024 · 0 comments
15r10nk (Collaborator) commented Nov 7, 2024

executing was probably tested with pypy-3.5 and pypy-3.6 in the past (see tox.ini),
but there are problems with pypy-3.8/3.9/3.10 (I was not able to test 3.7).

❯ tox -e pypy310
.pkg: _optional_hooks> python /home/frank/.local/share/uv/tools/tox/lib/python3.12/site-packages/pyproject_api/_backend.py True setuptools.build_meta
.pkg: get_requires_for_build_sdist> python /home/frank/.local/share/uv/tools/tox/lib/python3.12/site-packages/pyproject_api/_backend.py True setuptools.build_meta
.pkg: get_requires_for_build_wheel> python /home/frank/.local/share/uv/tools/tox/lib/python3.12/site-packages/pyproject_api/_backend.py True setuptools.build_meta
.pkg: prepare_metadata_for_build_wheel> python /home/frank/.local/share/uv/tools/tox/lib/python3.12/site-packages/pyproject_api/_backend.py True setuptools.build_meta
.pkg: build_sdist> python /home/frank/.local/share/uv/tools/tox/lib/python3.12/site-packages/pyproject_api/_backend.py True setuptools.build_meta
pypy310: install_package> /home/frank/.local/share/uv/tools/tox/bin/uv pip install --reinstall --no-deps executing@/home/frank/projects/executing/.tox/.tmp/package/23/executing-2.1.1.dev3+g3f11fdc.tar.gz
pypy310: commands[0]> pytest tests
=============================================================================================== test session starts ===============================================================================================
platform linux -- Python 3.10.14[pypy-7.3.17-final], pytest-8.3.3, pluggy-1.5.0
cachedir: .tox/pypy310/.pytest_cache
rootdir: /home/frank/projects/executing
configfile: pyproject.toml
collected 208 items                                                                                                                                                                                               

tests/test_ipython.py ..                                                                                                                                                                                    [  0%]
tests/test_main.py ..................ss..................s...........x.F.x...x..................................................................x..x......................................x.F.........sssss [ 89%]
ssssssssss.                                                                                                                                                                                                 [ 94%]
tests/test_pytest.py ...........                                                                                                                                                                            [100%]

==================================================================================================== FAILURES =====================================================================================================
_____________________________________________________________ test_small_samples[46597f8f896f11c5d7f432236344cc7e5645c2a39836eb6abdd2437c0422f0f4.py] _____________________________________________________________

full_filename = '/home/frank/projects/executing/tests/small_samples/46597f8f896f11c5d7f432236344cc7e5645c2a39836eb6abdd2437c0422f0f4.py'
result_filename = '/home/frank/projects/executing/tests/sample_results/46597f8f896f11c5d7f432236344cc7e5645c2a39836eb6abdd2437c0422f0f4-pypy-3.10.json'

    @pytest.mark.parametrize(
        "full_filename,result_filename", list(sample_files("small_samples"))
    )
    @pytest.mark.skipif(sys.version_info<(3,),reason="no 2.7 support")
    def test_small_samples(full_filename, result_filename):
        skip_sentinel = [
            "load_deref",
            "4851dc1b626a95e97dbe0c53f96099d165b755dd1bd552c6ca771f7bca6d30f5",
            "508ccd0dcac13ecee6f0cea939b73ba5319c780ddbb6c496be96fe5614871d4a",
            "fc6eb521024986baa84af2634f638e40af090be4aa70ab3c22f3d022e8068228",
            "42a37b8a823eb2e510b967332661afd679c82c60b7177b992a47c16d81117c8a",
            "206e0609ff0589a0a32422ee902f09156af91746e27157c32c9595d12072f92a",
        ]
    
        skip_annotations = [
            "d98e27d8963331b58e4e6b84c7580dafde4d9e2980ad4277ce55e6b186113c1d",
            "9b3db37076d3c7c76bdfd9badcc70d8047584433e1eea89f45014453d58bbc43",
        ]
    
        if any(s in full_filename for s in skip_sentinel) and sys.version_info < (3, 11):
            pytest.xfail("SentinelNodeFinder does not find some of the nodes (maybe a bug)")
    
        if any(s in full_filename for s in skip_annotations) and sys.version_info < (3, 7):
            pytest.xfail("no `from __future__ import annotations`")
    
        if (
            (sys.version_info[:2] == (3, 7))
            and "ad8aa993e6ee4eb5ee764d55f2e3fd636a99b2ecb8c5aff2b35fbb78a074ea30"
            in full_filename
        ):
            pytest.xfail("(i async for i in arange) can not be analyzed in 3.7")
    
        if (
            (sys.version_info[:2] == (3, 5) or PYPY)
            and "1656dc52edd2385921104de7bb255ca369713f4b8c034ebeba5cf946058109bc"
            in full_filename
        ):
            pytest.skip("recursion takes to long in 3.5")
    
>       TestFiles().check_filename(full_filename, check_names=True)

tests/test_main.py:756: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/test_main.py:893: in check_filename
    result = list(self.check_code(code, nodes, decorators, check_names=check_names))
tests/test_main.py:1248: in check_code
    ex = Source.executing(frame)
executing/executing.py:273: in executing
    node_finder = NodeFinder(frame, stmts, tree, lasti, source)
executing/executing.py:624: in __init__
    self.result = only(matching)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

it = []

    def only(it):
        # type: (Iterable[T]) -> T
        if isinstance(it, Sized):
            if len(it) != 1:
>               raise NotOneValueFound('Expected one value, found %s' % len(it))
E               executing.executing.NotOneValueFound: Expected one value, found 0

executing/executing.py:110: NotOneValueFound
---------------------------------------------------------------------------------------------- Captured stdout call -----------------------------------------------------------------------------------------------
check /home/frank/projects/executing/tests/small_samples/46597f8f896f11c5d7f432236344cc7e5645c2a39836eb6abdd2437c0422f0f4.py
mapping failed
Expected one value, found 0
search bytecode Instruction(opname='LOAD_METHOD', opcode=160, arg=160, argval='command', argrepr='command', offset=892, starts_line=None, is_jump_target=False)
in file /home/frank/projects/executing/tests/small_samples/46597f8f896f11c5d7f432236344cc7e5645c2a39836eb6abdd2437c0422f0f4.py
_____________________________________________________________ test_small_samples[22bc344a43584c051d8962116e8fd149d72e7e68bcb54caf201ee6e78986b167.py] _____________________________________________________________

full_filename = '/home/frank/projects/executing/tests/small_samples/22bc344a43584c051d8962116e8fd149d72e7e68bcb54caf201ee6e78986b167.py'
result_filename = '/home/frank/projects/executing/tests/sample_results/22bc344a43584c051d8962116e8fd149d72e7e68bcb54caf201ee6e78986b167-pypy-3.10.json'

    @pytest.mark.parametrize(
        "full_filename,result_filename", list(sample_files("small_samples"))
    )
    @pytest.mark.skipif(sys.version_info<(3,),reason="no 2.7 support")
    def test_small_samples(full_filename, result_filename):
        skip_sentinel = [
            "load_deref",
            "4851dc1b626a95e97dbe0c53f96099d165b755dd1bd552c6ca771f7bca6d30f5",
            "508ccd0dcac13ecee6f0cea939b73ba5319c780ddbb6c496be96fe5614871d4a",
            "fc6eb521024986baa84af2634f638e40af090be4aa70ab3c22f3d022e8068228",
            "42a37b8a823eb2e510b967332661afd679c82c60b7177b992a47c16d81117c8a",
            "206e0609ff0589a0a32422ee902f09156af91746e27157c32c9595d12072f92a",
        ]
    
        skip_annotations = [
            "d98e27d8963331b58e4e6b84c7580dafde4d9e2980ad4277ce55e6b186113c1d",
            "9b3db37076d3c7c76bdfd9badcc70d8047584433e1eea89f45014453d58bbc43",
        ]
    
        if any(s in full_filename for s in skip_sentinel) and sys.version_info < (3, 11):
            pytest.xfail("SentinelNodeFinder does not find some of the nodes (maybe a bug)")
    
        if any(s in full_filename for s in skip_annotations) and sys.version_info < (3, 7):
            pytest.xfail("no `from __future__ import annotations`")
    
        if (
            (sys.version_info[:2] == (3, 7))
            and "ad8aa993e6ee4eb5ee764d55f2e3fd636a99b2ecb8c5aff2b35fbb78a074ea30"
            in full_filename
        ):
            pytest.xfail("(i async for i in arange) can not be analyzed in 3.7")
    
        if (
            (sys.version_info[:2] == (3, 5) or PYPY)
            and "1656dc52edd2385921104de7bb255ca369713f4b8c034ebeba5cf946058109bc"
            in full_filename
        ):
            pytest.skip("recursion takes to long in 3.5")
    
>       TestFiles().check_filename(full_filename, check_names=True)

tests/test_main.py:756: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/test_main.py:893: in check_filename
    result = list(self.check_code(code, nodes, decorators, check_names=check_names))
tests/test_main.py:1248: in check_code
    ex = Source.executing(frame)
executing/executing.py:273: in executing
    node_finder = NodeFinder(frame, stmts, tree, lasti, source)
executing/executing.py:624: in __init__
    self.result = only(matching)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

it = []

    def only(it):
        # type: (Iterable[T]) -> T
        if isinstance(it, Sized):
            if len(it) != 1:
>               raise NotOneValueFound('Expected one value, found %s' % len(it))
E               executing.executing.NotOneValueFound: Expected one value, found 0

executing/executing.py:110: NotOneValueFound
---------------------------------------------------------------------------------------------- Captured stdout call -----------------------------------------------------------------------------------------------
check /home/frank/projects/executing/tests/small_samples/22bc344a43584c051d8962116e8fd149d72e7e68bcb54caf201ee6e78986b167.py
mapping failed
Expected one value, found 0
search bytecode Instruction(opname='LOAD_METHOD', opcode=160, arg=1, argval='filter_by', argrepr='filter_by', offset=2, starts_line=None, is_jump_target=False)
in file /home/frank/projects/executing/tests/small_samples/22bc344a43584c051d8962116e8fd149d72e7e68bcb54caf201ee6e78986b167.py
============================================================================================= short test summary info =============================================================================================
FAILED tests/test_main.py::test_small_samples[46597f8f896f11c5d7f432236344cc7e5645c2a39836eb6abdd2437c0422f0f4.py] - executing.executing.NotOneValueFound: Expected one value, found 0
FAILED tests/test_main.py::test_small_samples[22bc344a43584c051d8962116e8fd149d72e7e68bcb54caf201ee6e78986b167.py] - executing.executing.NotOneValueFound: Expected one value, found 0
============================================================================== 2 failed, 182 passed, 18 skipped, 6 xfailed in 18.10s ==============================================================================
pypy310: exit 1 (18.62 seconds) /home/frank/projects/executing> pytest tests pid=47765
  pypy310: FAIL code 1 (22.66=setup[4.04]+cmd[18.62] seconds)
  evaluation failed :( (22.74 seconds)
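For context, the failing call path in the traceback is `Source.executing(frame)` → `NodeFinder.__init__` → `only(matching)`: executing tries to map the frame's current bytecode instruction back to exactly one AST node, and `only()` raises `NotOneValueFound` when zero (or more than one) candidates match. On pypy-3.10 the `LOAD_METHOD` instructions above find zero matches. A self-contained sketch of that helper's contract, simplified from the `only()` shown in the traceback:

```python
from collections.abc import Sized


class NotOneValueFound(Exception):
    """Raised when a lookup that must be unambiguous is not."""


def only(it):
    """Return the single element of `it`; raise NotOneValueFound otherwise."""
    if isinstance(it, Sized):
        if len(it) != 1:
            raise NotOneValueFound('Expected one value, found %s' % len(it))
        # A sized container with exactly one element
        return list(it)[0]
    # Simplified fallback for plain (non-Sized) iterables
    items = list(it)
    if len(items) != 1:
        raise NotOneValueFound('Expected one value, found %s' % len(items))
    return items[0]
```

The pypy failures hit the zero-length branch: no AST node matched the `LOAD_METHOD` instruction, which suggests the bytecode-to-node matching heuristics would need pypy-specific handling.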

15r10nk/inline-snapshot#132 is where I would need pypy support.

@alexmojaki do you know how difficult it would be to use executing with pypy? It looks like pypy support was dropped at some point. Do you know why?
