Conversation

@HyukjinKwon
Member

This PR adds some guides for testing individual PySpark tests, and also some information about PySpark coverage.

[Screenshot: the new testing/coverage section of the docs, taken 2018-12-05]

See also apache/spark#23203 and SPARK-26252

@HyukjinKwon
Member Author

cc @srowen and @cloud-fan

```
$ python/run-tests --testnames pyspark.sql.dataframe
```

Lastly, there is another script called `run-tests-with-coverage` in the same location, which generates a coverage report for PySpark tests. It accepts the same arguments as `run-tests`.
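For example, since the coverage script accepts the same arguments as `run-tests`, the `--testnames` invocation shown above carries over directly (a sketch based on that statement; it assumes you are running from a Spark source checkout):

```bash
# Run the pyspark.sql.dataframe tests under coverage measurement;
# the flags mirror those of python/run-tests.
$ ./python/run-tests-with-coverage --testnames pyspark.sql.dataframe
```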
Member


This is a great improvement. Thanks!
Do we need to mention `pip install coverage` here, since this is an introductory doc? Or is that too obvious?

Member Author


It should be okay; the error message is quite clear (and shown in red):

```
$ ./python/run-tests-with-coverage --python-executables=pypy
...
Coverage is not installed in Python executable 'pypy' but 'COVERAGE_PROCESS_START' environment variable is set, exiting.
```

@asfgit closed this in b037ddf on Dec 6, 2018
@HyukjinKwon
Member Author

Merged to asf-site!
