
pytest error when running tox #3722

Closed
manoadamro opened this issue Jul 27, 2018 · 6 comments
Labels
type: bug problem that needs to be addressed

Comments

@manoadamro

manoadamro commented Jul 27, 2018

I get this error when running pytest:

py.error.ENOENT: [No such file or directory]: open('/Users/me/Github/the-package/.pytest_cache/v/cache/nodeids', 'w')

This is my tox.ini:

[tox]
envlist = py36

[flake8]
max-line-length = 100

[testenv]
passenv = CUSTOM_PYPI_URL

setenv = ENVIRONMENT = DEVELOPMENT
         IGNORE_JWT_VALIDATION = True
         RABBITMQ_NOENCRYPT = false

whitelist_externals =
    bash
    python
    flake8
    coverage
    bandit

deps=-r{toxinidir}/test-requirements.txt

commands = flake8 the_package --doctests --exit-zero --count
           bandit -r the_package -lll
           coverage run --source the_package -m pytest
           coverage report

Using:

  • Python 3.6.4
  • OSX 10.12.6
  • pytest==3.6.3
  • tox==3.1.2
  • bandit==1.4.0
  • flake8==3.5.0
  • coverage==4.5.1

I haven't found any info online about this error. What makes it confusing is that, out of all our microservices with identical tox commands (except the package name), this is the only one that throws this error :(

This is the code that causes the error:

        try:
            f = path.open("w")  # This is the problem (somehow)
        except py.error.ENOTDIR:
            self.config.warn(
                code="I9", message="cache could not write path %s" % (path,)
            )
        else:
            with f:
                self.trace("cache-write %s: %r" % (key, value))
                json.dump(value, f, indent=2, sort_keys=True)
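
For context, py.error.ENOENT is py's thin wrapper around errno ENOENT, so the underlying failure is open() being called on a path whose parent directory does not exist. A minimal standalone sketch of the same failure mode, with a made-up path:

    import errno

    try:
        # Writing to a file whose parent directory is missing fails with ENOENT,
        # just like the .pytest_cache/v/cache/nodeids error above.
        open("/tmp/no-such-dir/nodeids", "w")
    except OSError as exc:
        assert exc.errno == errno.ENOENT  # "No such file or directory"
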
@pytestbot
Contributor

GitMate.io thinks possibly related issues are #3112 (Key errors while running pytest), #2777 (pytest is not running when i run it through tox -- Mac), #3326 (tox and pytest runs can behave differently when "-p something.conftest" is specified), #1591 (pytest-xdist fails when running same tests several times), and #3097 ("unrecognized arguments" or "option names already added" error when running pytest inside a Docker container using Jenkins).

@pytestbot added the type: bug label on Jul 27, 2018
@asottile
Member

Could you share the package name? Also let's see if we can't factor out tox, does .tox/py36/bin/python -m pytest reproduce this issue?

I'm also curious what tree .pytest_cache looks like (or any other recursive view of that directory).

@nicoddemus
Member

Regardless of the investigation and possible workaround, I think we should catch IOError and OSError in that section of code instead of only ENOTDIR; the cache is secondary, so it should not blow up the test suite IMO.
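
A minimal sketch of that broader handling, mirroring the cache-write snippet quoted earlier in the issue (a proposal sketch, not an actual patch):

        try:
            f = path.open("w")
        except (IOError, OSError) as exc:  # was: except py.error.ENOTDIR
            # The cache is only an optimization, so a failed write should warn
            # rather than blow up the test session.
            self.config.warn(
                code="I9", message="cache could not write path %s: %s" % (path, exc)
            )
        else:
            with f:
                self.trace("cache-write %s: %r" % (key, value))
                json.dump(value, f, indent=2, sort_keys=True)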

@manoadamro
Author

@asottile Running .tox/py36/bin/python -m pytest produces the same error.

I should probably explain this better.
We have a standard CircleCI config across all our services, and .pytest_cache is part of the .gitignore we use for all of them. CircleCI hasn't cared up until this point; it just runs the tests and moves on.
With this particular one, however, dhos_pdf_adapter_worker, CircleCI was throwing this error.
To recreate the issue locally, I removed all the excluded files (including .pytest_cache) by getting a fresh clone, ran tox, and got the same error. I have tried this with other repos too and just can't recreate it.

@manoadamro
Author

@asottile You know what? I owe you an apology. It turns out some dodgy mocking I had done screwed up the os module. Sorry for wasting your time.
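
For anyone landing here from a search: per the comment above, the culprit was dodgy mocking of the os module that was still in effect when pytest's cache plugin wrote .pytest_cache/v/cache/nodeids at the end of the session. A hypothetical illustration of that kind of over-broad patching; the patched target and names here are made up for the example, not taken from the actual repo:

    # conftest.py -- hypothetical example of the pitfall
    from unittest import mock

    # BAD: started at import time and never stopped, so the fake is still in place
    # when pytest itself touches the filesystem at the end of the session (for
    # example, when the cache plugin writes .pytest_cache/v/cache/nodeids).
    _patcher = mock.patch("os.makedirs", mock.Mock())
    _patcher.start()


    # GOOD: keep the patch scoped to the test so the real os functions are
    # restored before the cache plugin runs.
    def test_something():
        with mock.patch("os.makedirs", mock.Mock()):
            ...  # exercise the code under test
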

@asottile
Member

Ah yeah, that would do it! We have some rough plans for better isolation from mocked modules, but haven't done anything to solve it yet.

Thanks for the issue nonetheless!
