-
Hi @Spectre5, I like the idea; larger pushes can get messy with updated documentation and can make reviewing PRs tricky! I'm not sure how much work would be involved in fully automating the regeneration of the examples whenever the code is updated? At this stage I'm also not sure how this would work with the readthedocs implementation, which seems to rely on everything being in the same repository (e.g. sphinx generates the docs from the python files, and the documentation is rebuilt on branch pushes and releases). Keen to investigate moving the documentation, or whether there is a way to move only the binary files? Robbie
-
I did make this fully "automatic" in the big PR I opened, and it is pretty slick once you get it set up right (mainly just getting Poetry installed, at least on Linux). That PR is set up so that only the documentation source would be in the repo and the build artifacts would not be. I did not explore exactly how it would work with readthedocs, although I suspect it could be made to work based on a short e-mail exchange I had with someone at readthedocs. Another option would be to keep it even more "in house" and use GitHub Pages instead, which is what I was leaning towards. That said, the timing of that whole PR was maybe not the best, as there is a desire to get the Shapely work merged first, which admittedly has a bigger impact on end users. It did take quite a long time to put that PR together, though, and I'm not sure I'll have the time to fix it all up again after the Shapely work is merged. Certain parts or ideas of it may still get used.
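For anyone curious what that looks like in practice, here is a rough sketch of the kind of CI job I mean. It is not lifted from the PR itself; the `docs` dependency group and the `docs/source` layout are just assumptions for illustration:

```yaml
# Hypothetical GitHub Actions job: build the Sphinx docs from source only,
# keeping the HTML output as a CI artifact instead of committing it.
name: docs

on:
  push:
    branches: [master]

jobs:
  docs:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - name: Install Poetry and the docs dependencies
        run: |
          pipx install poetry
          poetry install --with docs  # assumes a "docs" dependency group
      - name: Build the HTML documentation
        run: poetry run sphinx-build -b html docs/source docs/_build/html
      - name: Upload the build as a CI artifact (nothing gets committed)
        uses: actions/upload-artifact@v4
        with:
          name: html-docs
          path: docs/_build/html
```

The point is that the repository only ever contains the docs source; the built HTML lives in CI artifacts (or gets deployed somewhere) rather than in git history.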
-
I originally posted this in #54, but it is really better placed here for general discussion
It is generally best practice not to store the documentation build in the repository, only the docs source. This is especially true of binary data (e.g. images), which cannot be diffed by git, and even more so if the images/build are regenerated regularly, such as on every push to master or every library release. Otherwise you bloat the git repository with lots of binary data, which isn't really what git is good for. Other projects have various ways of dealing with this, usually pushing the documentation build artifacts directly to some 3rd party service.
I took a closer look around the pyvista repository last night. In their CI file they automatically wipe their dummy examples and docs repositories and then push the latest contents to them, with all examples re-run and the images updated for the docs. They use notebooks to accomplish this, but you could probably do it without them too. I like this idea as it keeps only source in your main repository. Further, their documentation website is actually hosted by GitHub Pages and is served directly from that pyvista-docs repository. For anyone unaware, you can use custom domain names to serve GitHub Pages, see here for some details.
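To make that concrete, here is a hedged sketch of what the deploy step could look like, using the peaceiris/actions-gh-pages action to push into a separate docs repository. The repository name and token secret below are placeholders, and this is not literally what pyvista's CI does:

```yaml
# Hypothetical deploy step (would follow the build step in a docs job):
# push the freshly built HTML to a separate docs repository that is
# served by GitHub Pages, instead of committing it to the main repo.
      - name: Push the built HTML to the external docs repository
        uses: peaceiris/actions-gh-pages@v3
        with:
          personal_token: ${{ secrets.DOCS_DEPLOY_TOKEN }}      # assumed secret name
          external_repository: <owner>/section-properties-docs  # hypothetical repo
          publish_branch: main
          publish_dir: docs/_build/html
          force_orphan: true  # keep only the latest build in the docs repo
```

With `force_orphan` the docs repository only ever holds the latest build, so the binary bloat stays out of everyone's git history.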
I know that section-properties currently uses `readthedocs`, but I'm sure we could do something similar where we push the documentation to a second "docs" repository and then feed that to `readthedocs`. I'm not familiar with pushing to `readthedocs` myself, but it might even be possible to push the built documentation directly to `readthedocs` from the CI pipeline and not need a separate repo for those build artifacts.
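If we stick with readthedocs building from the main repository, a minimal `.readthedocs.yaml` along these lines (the paths are assumptions about the project layout) would already mean no build output or generated images ever need to be committed:

```yaml
# Hypothetical .readthedocs.yaml: readthedocs builds straight from the
# source-only repository, so no generated HTML or images are committed.
version: 2

build:
  os: ubuntu-22.04
  tools:
    python: "3.11"

sphinx:
  configuration: docs/source/conf.py  # assumed location of the Sphinx conf.py

python:
  install:
    - requirements: docs/requirements.txt  # assumed docs-only requirements file
```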