Add option to only run short doctests #25270

Closed
saraedum opened this issue Apr 30, 2018 · 87 comments
Comments

@saraedum (Member)

The attached picture shows how many modules (horizontal) can be doctested in how many CPU minutes (vertical). For example, in 40 CPU minutes (i.e., 10 wall-clock minutes on my laptop) I can run all the doctests in 80% of all Sage modules; in 4 CPU minutes, I can doctest 50% of them.

This ticket implements a --short [SECONDS] switch that runs as many doctests as possible (from the top of each module) within SECONDS (default: 300).

This gives us a quick way to check that a change did not break things too badly. For example, when working on p-adics, I would frequently run sage -tp --short 30 src/sage/rings/padics. Once I am happy with my changes, I could do sage -tp --short src/sage/rings and a final sage -tp --short --all before leaving the rest to the patchbot.

Here is an example:

> sage -tp --short 5 src/sage/rings/padics/*.py
Doctesting 19 files using 4 threads.
sage -t src/sage/rings/padics/padic_base_leaves.py
    [14% of tests run, 1.31 s]
sage -t src/sage/rings/padics/factory.py
    [29% of tests run, 1.31 s]
sage -t src/sage/rings/padics/lattice_precision.py
    [43% of tests run, 1.00 s]
sage -t src/sage/rings/padics/padic_valuation.py
    [4% of tests run, 2.80 s]
sage -t src/sage/rings/padics/padic_base_generic.py
    [95% of tests run, 1.72 s]
sage -t src/sage/rings/padics/local_generic.py
    [1% of tests run, 0.73 s]
sage -t src/sage/rings/padics/padic_extension_generic.py
    [5% of tests run, 0.77 s]
sage -t src/sage/rings/padics/padic_lattice_element.py
    [2% of tests run, 0.78 s]
sage -t src/sage/rings/padics/generic_nodes.py
    [1% of tests run, 0.75 s]
sage -t src/sage/rings/padics/padic_generic.py
    [1% of tests run, 0.73 s]
sage -t src/sage/rings/padics/unramified_extension_generic.py
    [2% of tests run, 0.43 s]
sage -t src/sage/rings/padics/tutorial.py
    [4% of tests run, 0.70 s]
sage -t src/sage/rings/padics/tests.py
    [0% of tests run, 0.00 s]
sage -t src/sage/rings/padics/eisenstein_extension_generic.py
    [7% of tests run, 0.80 s]
sage -t src/sage/rings/padics/misc.py
    [6% of tests run, 0.78 s]
sage -t src/sage/rings/padics/precision_error.py
    [0 tests, 0.00 s]
sage -t src/sage/rings/padics/all.py
    [0 tests, 0.00 s]
sage -t src/sage/rings/padics/__init__.py
    [0 tests, 0.00 s]
sage -t src/sage/rings/padics/padic_extension_leaves.py
    [10% of tests run, 6.54 s]
----------------------------------------------------------------------
All tests passed!
----------------------------------------------------------------------
Total time for all tests: 6.6 seconds
    cpu time: 20.0 seconds
    cumulative wall time: 21.1 seconds
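The skipping behavior shown above can be sketched in a few lines. This is a hypothetical illustration of the strategy (run examples from the top until the budget is spent, never aborting an example mid-run), not Sage's actual forker code; `run_with_budget` and the callables standing in for doctest examples are invented for this sketch:

```python
import time

def run_with_budget(examples, budget):
    """Run examples in order until the time budget is spent.

    Hypothetical sketch: ``examples`` is a list of zero-argument
    callables standing in for doctest examples.  An example that has
    already started is never aborted; only the remaining examples
    are skipped.
    """
    deadline = time.monotonic() + budget
    ran = 0
    for example in examples:
        if time.monotonic() >= deadline:
            break  # budget exhausted: skip the rest of this file
        example()
        ran += 1
    return ran, len(examples) - ran

# With a generous budget all examples run; with a zero budget none do.
ran, skipped = run_with_budget([lambda: None] * 3, budget=1.0)
print(ran, skipped)  # → 3 0
```

Reporting `ran / (ran + skipped)` per file would then give exactly the "[14% of tests run]" lines in the sample output above.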

CC: @embray @nthiery @roed314

Component: doctest framework

Work Issues: is the patchbot happy?

Author: Julian Rüth

Branch: a4614ab

Reviewer: Erik Bray

Issue created by migration from https://trac.sagemath.org/ticket/25270

@saraedum (Member Author)

comment:1

Attachment: bar.png

@saraedum saraedum changed the title Add option to only run --short doctests Add option to only run short doctests Apr 30, 2018
@saraedum (Member Author)

saraedum commented May 1, 2018

comment:4

I have a first prototype now. If I set it to finish in 30s of wall time and run it on src/sage/rings/padics, it skips tests in only 4 files (currently reported as failures):

sage -t ../rings/padics/padic_base_leaves.py  # 125 doctests failed
sage -t ../rings/padics/padic_lattice_element.py  # 303 doctests failed
sage -t ../rings/padics/padic_extension_leaves.py  # 72 doctests failed
sage -t ../rings/padics/padic_generic_element.pyx  # 172 doctests failed
----------------------------------------------------------------------
Total time for all tests: 35.8 seconds

As an extreme example, if I let it test everything in src/ in two minutes, it manages to finish within four minutes (because it never aborts a doctest line once it has started); it runs 314607 docstring lines and skips 361692, i.e., roughly 50%.
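The comment above does not say how the global budget gets divided among files, but one plausible scheme (purely my illustration; `split_budget` and the timing figures are invented, not taken from the branch) is a proportional split based on each file's recorded full-run time, so cheap files always finish and expensive files get truncated:

```python
def split_budget(total_budget, file_timings):
    # Hypothetical sketch: each file receives a slice of the global
    # budget proportional to how long its full doctest run normally
    # takes, according to previously recorded timings.
    total = sum(file_timings.values())
    return {name: total_budget * t / total
            for name, t in file_timings.items()}

# 20 s of budget split over files that normally take 10 s and 30 s:
shares = split_budget(20.0, {"fast.py": 10.0, "slow.py": 30.0})
print(shares)  # → {'fast.py': 5.0, 'slow.py': 15.0}
```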

@saraedum (Member Author)

saraedum commented Jun 9, 2018

Branch: u/saraedum/25270

@saraedum (Member Author)

saraedum commented Jun 9, 2018

Author: Julian Rüth

@saraedum (Member Author)

saraedum commented Jun 9, 2018

Commit: d0f98ba

@saraedum (Member Author)

saraedum commented Jun 9, 2018

New commits:

d0f98ba Implement --short switch

@saraedum (Member Author)

saraedum commented Jun 9, 2018

Work Issues: needs doctests

@saraedum (Member Author)

saraedum commented Jun 9, 2018

comment:9

I still need to write some doctests for this, but if somebody already wants to comment, I would be very happy about feedback.

Somehow my counting is off by quite a bit; I am not entirely sure what the problem is.

@saraedum (Member Author)

Changed work issues from needs doctests to needs doctests, is the patchbot happy?

@saraedum (Member Author)

comment:12

I am quite unhappy with the spaghetti code I added here, but I don't really see a much better way. I have a feeling that we have already crammed a lot of features into Python's doctesting framework and that it was never meant to be extended like that. A proper plugin system would have been cool, but I guess it's too late for that ;)

@sagetrac-git (Mannequin)

sagetrac-git mannequin commented Jun 11, 2018

Changed commit from d0f98ba to 4c8639d

@sagetrac-git (Mannequin)

sagetrac-git mannequin commented Jun 11, 2018

Branch pushed to git repo; I updated commit sha1. New commits:

4c8639d Report relative number of tests run in --short runs

@embray (Contributor)

embray commented Jun 27, 2018

comment:14

It looks like this needs to be rebased. I can't wait to see this working, though, and if I have ideas on how to improve the implementation, I'll share them.

@sagetrac-git (Mannequin)

sagetrac-git mannequin commented Jul 3, 2018

Changed commit from 4c8639d to 915d8e6

@sagetrac-git (Mannequin)

sagetrac-git mannequin commented Jul 3, 2018

Branch pushed to git repo; I updated commit sha1. New commits:

915d8e6 Merge develop into 25270

@saraedum (Member Author)

saraedum commented Jul 3, 2018

comment:16

OK, I rebased this.

@embray (Contributor)

embray commented Jul 10, 2018

comment:17

I wish we also had a way to prioritize tests. E.g., we could mark a test as # important and ensure that all such tests are run first. That could be added later, though.

@embray (Contributor)

embray commented Jul 10, 2018

Reviewer: Erik Bray

@sagetrac-git (Mannequin)

sagetrac-git mannequin commented Aug 26, 2018

Changed commit from 3772312 to cb73e0c

@sagetrac-git (Mannequin)

sagetrac-git mannequin commented Aug 26, 2018

Branch pushed to git repo; I updated commit sha1. New commits:

cb73e0c Merge develop and 25270

@saraedum (Member Author)

saraedum commented Sep 2, 2018

comment:56

embray: I sometimes get

Error in atexit._run_exitfuncs:
Traceback (most recent call last):
  File "/home/sage/sage/local/lib/python2.7/atexit.py", line 24, in _run_exitfuncs
    func(*targs, **kargs)
  File "<doctest sage.cpython.atexit[2]>", line 2, in handler
NameError: global name 'print' is not defined
sage -t sage/src/sage/cpython/atexit.pyx
    NameError in doctesting framework
**********************************************************************
Tests run before doctest exception:
sage: import atexit ## line 54 ##
sage: from sage.cpython.atexit import restore_atexit ## line 55 ##
sage: def handler(*args, **kwargs):
    print((args, kwargs)) ## line 56 ##
sage: atexit.register(handler, 1, 2, c=3) ## line 58 ##
<function handler at 0x7f3ee18f41b8>

**********************************************************************
Traceback (most recent call last):
  File "/home/sage/sage/local/lib/python2.7/site-packages/sage/doctest/forker.py", line 2426, in __call__
    break
  File "sage/cpython/atexit.pyx", line 143, in sage.cpython.atexit.restore_atexit.__exit__ (build/cythonized/sage/cpython/atexit.c:1497)
    atexit._run_exitfuncs()
  File "/home/sage/sage/local/lib/python2.7/atexit.py", line 24, in _run_exitfuncs
    func(*targs, **kargs)
  File "<doctest sage.cpython.atexit[2]>", line 2, in handler
NameError: global name 'print' is not defined

I guess this happens when the tests in atexit are run only partially, so some cleanup that is part of the doctests did not happen.

Since you're listed as the author of that file, any clue what's going on here?

@embray (Contributor)

embray commented Sep 3, 2018

comment:57

Hmm, I think that, if nothing else, there should be a from __future__ import print_function in that module, which there currently is not. I'm not positive that that's the problem, though; I'd have to try to reproduce it first.

@embray (Contributor)

embray commented Sep 3, 2018

comment:58

Strange, though, since looking at sage.doctest.forker, all doctests are supposed to be compiled as though from __future__ import print_function were present. Maybe, somehow, just including the required compile flags isn't enough to ensure that the print() function is in the globals?
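This observation can be demonstrated in isolation: the print_function compile flag only changes how source is parsed, while the name print must still resolve in globals or builtins at call time. A minimal Python 3 sketch of that distinction (the empty __builtins__ dict is my stand-in for the torn-down namespace during interpreter shutdown; this is not Sage's forker code):

```python
import __future__

# Compile a print() call with the print_function flag, as the doctest
# framework does for every example.
flag = __future__.print_function.compiler_flag
code = compile("print('hi')", "<doctest>", "exec", flag)

# Parsing succeeded, but executing the code object still has to look up
# the name ``print``.  With builtins stripped away, that lookup fails
# with the same NameError as in the report above:
try:
    exec(code, {"__builtins__": {}})
    message = None
except NameError as exc:
    message = str(exc)

print(message)  # → name 'print' is not defined
```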

@embray (Contributor)

embray commented Sep 5, 2018

comment:59

For the atexit tests, does that happen if you test src/sage/cpython/atexit.pyx directly? Or does it only happen when running the full test suite with ./sage --short?

@embray embray modified the milestones: sage-8.3, sage-8.4 Sep 6, 2018
@sagetrac-git (Mannequin)

sagetrac-git mannequin commented Oct 8, 2018

Changed commit from cb73e0c to b5b7b51

@sagetrac-git (Mannequin)

sagetrac-git mannequin commented Oct 8, 2018

Branch pushed to git repo; I updated commit sha1. New commits:

b5b7b51 Do not use print function in atexit

@saraedum (Member Author)

saraedum commented Oct 8, 2018

comment:62

embray, I don't think we should put too much effort into fixing this rather obscure problem, as it is going to go away with Python 3 anyway. I added a simple workaround (untested). Would you be fine with something like this?


New commits:

b5b7b51 Do not use print function in atexit
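The idea behind the workaround (stop calling the print() function inside an atexit handler) can be sketched as follows. This is a hypothetical Python 3 illustration, not the contents of commit b5b7b51: the handler writes through sys.stdout rather than performing the name lookup of print that failed in the report above:

```python
import atexit
import sys

def handler(*args, **kwargs):
    # Hypothetical sketch of the workaround: write directly to stdout
    # instead of calling print(), whose name failed to resolve in the
    # partially torn-down doctest globals in the traceback above.
    sys.stdout.write(repr((args, kwargs)) + "\n")

# Mirror the doctest from atexit.pyx: register the handler, then
# unregister it again so nothing fires at interpreter shutdown here.
atexit.register(handler, 1, 2, c=3)
atexit.unregister(handler)
```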

@sagetrac-git (Mannequin)

sagetrac-git mannequin commented Oct 8, 2018

Branch pushed to git repo; I updated commit sha1. New commits:

a4614ab Merge remote-tracking branch 'origin/develop' into 25270

@sagetrac-git (Mannequin)

sagetrac-git mannequin commented Oct 8, 2018

Changed commit from b5b7b51 to a4614ab

@saraedum (Member Author)

saraedum commented Oct 8, 2018

Changed work issues from is the patchbot happy ⇒ positive review to is the patchbot happy?

@embray (Contributor)

embray commented Oct 8, 2018

comment:65

Yes, I think this is entirely reasonable. Relying on module-level globals that may already be set to None during interpreter shutdown can always be a problem. Here you were instead getting a NameError, which is a bit stranger to me, but it almost certainly involves some strange interplay with how from __future__ import print_function works.

Positive review from me pending patchbot results, but I think it will be fine...

@vbraun (Member)

vbraun commented Oct 20, 2018

Changed branch from u/saraedum/25270 to a4614ab

@embray (Contributor)

embray commented Oct 28, 2018

comment:67

I assume?

@embray (Contributor)

embray commented Oct 28, 2018

Changed commit from a4614ab to none

@embray embray modified the milestones: sage-8.4, sage-8.5 Oct 28, 2018