
[Enhancement] Allow configuration of directory where coverage loads and saves files #752

Closed
gtristan opened this issue Jan 8, 2019 · 4 comments
Labels
enhancement New feature or request

Comments

@gtristan

gtristan commented Jan 8, 2019

Is your feature request related to a problem? Please describe.

I would like to run my tests multiple times in parallel, against multiple python environments, and safely collect coverage reports for each test run separately.

The program under test already uses multiprocessing, which means coverage already creates many files for each pytest run; these should go in a separate directory.

Describe the solution you'd like

It would be nice to have a command line option, configuration setting, or environment variable to specify a directory where coverage should read and write its files.

Describe alternatives you've considered

Coverage could automatically create a temp directory at the beginning of the run, associate the current process tree under test with that directory internally, and clean it up at the end, either appending to or overwriting the user-specified target data file.

@gtristan gtristan added the enhancement New feature or request label Jan 8, 2019
@nedbat
Owner

nedbat commented Jan 8, 2019

Have you tried the --parallel option? It adds uniquifying information to the data file name so that tests run in parallel won't overwrite each other's files.

If you still need a separate directory, I believe you can set the data file (for example with the COVERAGE_FILE environment variable, or the [run] data_file setting in the rc file) with a directory included.
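A sketch of what that could look like in a .coveragerc (the /tmp/coverage-run path is an illustrative assumption; `parallel` and `data_file` are `[run]` options in coverage.py):

```ini
[run]
# Add uniquifying suffixes so parallel runs don't overwrite each other
parallel = True
# Put all data files under a dedicated directory
data_file = /tmp/coverage-run/.coverage
```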

@gtristan
Author

gtristan commented Jan 8, 2019

@nedbat Thanks for the quick reply !

I tried the parallel option, but the result was a single .coverage file. I suspect this is because the program under test already uses concurrency = multiprocessing; at the end of the test session the files seem to be collated into .coverage regardless.

If changing COVERAGE_FILE to make the file names unique is good enough, that would be fine for me. My concern is that if I run the suite multiple times in parallel, and multiple coverage instances are collating results of multiprocess tests, then uniquely named coverage files from one run will accidentally be collected by adjacent runs. I thought the easiest way to ensure this didn't happen was to have a separate "work directory" per run.
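For reference, the multiprocessing setup described above corresponds to a coverage configuration roughly like this (a sketch using coverage.py's `[run]` section options):

```ini
[run]
# Measure code running in multiprocessing child processes too
concurrency = multiprocessing
# Each process writes its own uniquely named data file
parallel = True
```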

@nedbat
Owner

nedbat commented Jan 8, 2019

Can you provide some details about how you are running your tests? --parallel should have done what you needed.

My point about COVERAGE_FILE was that you could do: COVERAGE_FILE=/tmp/foo1234/.coverage and use a different directory for each test run.
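A minimal sketch of that per-run-directory pattern, assuming a POSIX shell; the directory comes from mktemp rather than a fixed /tmp/foo1234 path, and the actual test invocation is left as a comment:

```shell
# Give each test run its own working directory so coverage data files
# from concurrent runs can never mix.
workdir="$(mktemp -d)"                     # unique directory per run
export COVERAGE_FILE="$workdir/.coverage"  # coverage reads/writes here
echo "coverage data for this run: $COVERAGE_FILE"
# coverage run --parallel-mode -m pytest   # actual test command goes here
# coverage combine && coverage report      # collates only this run's files
```

Because each run gets a fresh temporary directory, `coverage combine` in one run can never pick up data files written by an adjacent run.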

@gtristan
Author

gtristan commented Jan 8, 2019

I had tried this by enabling parallel in the .coveragerc here

I had not considered changing the CWD for the test; this appears to work perfectly, thanks!

EDIT:

To elaborate on the solution with tox: I set changedir = {envdir} and then specify commands = pytest ... {posargs} {toxinidir}. This by itself makes all coverage recording exclusive to a given run.
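The tox setup described above might look roughly like this (a sketch; the extra pytest options elided in the comment are the author's, and the section name is the standard tox `[testenv]`):

```ini
[testenv]
# Run from the environment's own directory so each env's
# coverage data files land in a separate place.
changedir = {envdir}
# (additional pytest options elided)
commands = pytest {posargs} {toxinidir}
```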
