make docs reflect API usage not the fully qualified name #4539
Why not do both, so we don't invalidate old links?
Not exactly sure what you mean... make it appear as above and then have a shadowed link? We would have to do this for every object in the API; not sure if there's an automated way to do this...
Maybe we could try using the autodoc extension (if we're not already doing it)? I thought Sphinx actually compiled and executed modules to determine elements, so I'd imagine it could handle documenting both the top-level namespace and the subpackages. How would it know the difference? e.g.

```rst
.. automodule:: pandas
   :members:
   :exclude-members: np, datetime, util, algos, io, version
```
The API usage docs are taken from the ...
Right, so maybe you could have two copies of that file? (one for top-level ...) That said, it's not terrible to break some links.
@cpcloud On a related note: is there a way to make Sphinx exit non-zero if there are errors during the doc build? It'd be nice to add the doc build to Travis, but it looks like it doesn't actually exit non-zero if there are build errors...
@cpcloud Good call. Wish it had an option to keep building and then error out at the end though (then you could see all the build errors...)
Yeah, I think failing fast is good there though; then you don't have to wait for everything.
@cpcloud I tried playing around with it. Passing ...
Well, that's annoying... hmm, I might ask on SO if I can come up with a reasonable minimal example... sphinx-build requires quite a bit of infrastructure... OTOH maybe I can just say check out the pandas repo, although I anticipate many, many downvotes if I do that.
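For later readers: current `sphinx-build` has both behaviors discussed above built in. A minimal sketch of a strict doc build (assuming Sphinx 1.8+ for `--keep-going`; the directory names are placeholders):

```shell
# -W promotes warnings to errors, so sphinx-build exits with status 1 on any warning.
# --keep-going (Sphinx 1.8+, only meaningful with -W) still processes the whole
# build before failing, so all errors show up in a single run instead of
# stopping at the first one.
sphinx-build -W --keep-going -b html doc/source doc/build/html
```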
@cpcloud Here's a minimal example:

It should fail because nampy doesn't exist. Or with a doctest. We would want it to exit 1 because the example would fail.
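A sketch of the kind of page presumably meant here (the original example was lost in extraction, so the file contents below are assumed): an rst file whose embedded example imports a nonexistent module, so a strict build should exit 1:

```rst
Broken example
==============

.. ipython:: python

   import nampy  # module doesn't exist, so this raises ImportError

Or as a failing doctest::

   >>> 1 + 1
   3
```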
And all this time I thought it was called ...
@cpcloud You might want to double-check that it doesn't fail for you before asking on SO. I think I didn't see it, but I'm not sure. Don't have time to set up Sphinx to check this on my own computer right now though :P
I think erroring out here is probably just a matter of making the embedded IPython shell exit when it raises... going to try it.
@jtratner Figured this out. Turns out it's the ipython directive that needs changing. I've got a PR in the works. You have to collect the traceback output in the ...
@cpcloud One thing though: it needs to not cease building entirely if there are errors anywhere (so you can still build local docs if you don't have R installed, or something like that).
Pushing to "someday"; not important enough to warrant spending a lot of time on.
@cpcloud @jtratner What is wrong with the suggestion @cpcloud made near the beginning of this discussion to just change all ...? This is a rather easy change (I could do a PR) to solve this issue. And if we are worried about broken links, I can ensure that the old pages are still there (by keeping how it is now done in api.rst in a comment: the pages are still generated, but they will not appear in the html page of api.rst).
@jorisvandenbossche If you can avoid breaking old links, that sounds very good to me. Heck, even if it does break old links, it's probably better. Can you post an example of how it will look?
@jtratner It will look like e.g. the ...
Sounds good - I'm sold. |
DOC: make api docs reflect api usage (#4539)
Minor issue, but would be nice for consistency:
http://pandas.pydata.org/pandas-docs/stable/generated/pandas.io.parsers.read_table.html#pandas.io.parsers.read_table
shows it as `pandas.io.parsers.read_table`, but it should be used as `pandas.read_table`, and that's how it should appear. This could be a simple Sphinx reference label change.
FWIW this would probably invalidate a lot of links on SO
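The change proposed above amounts to documenting each function under the namespace users actually import it from. In Sphinx terms this is roughly the following, sketched with the `autosummary` layout typical of pandas' api.rst (exact options assumed):

```rst
.. currentmodule:: pandas

.. autosummary::
   :toctree: generated/

   read_table
```

With `currentmodule` set to `pandas`, the generated page and its title become `pandas.read_table` rather than `pandas.io.parsers.read_table`.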