
Documentation for multiple tags #1

Open
f-f opened this issue Sep 1, 2019 · 3 comments
Comments


f-f commented Sep 1, 2019

Right now we generate documentation for a single tag in CI.

In doing this we can just use the Travis integration with GitHub Pages to "replace" the current content.

Instead we should use an "additive" approach, where for every tag we add a folder with the tag name. I suspect that for this we won't be able to use the Travis integration; we'd have to manually commit and push to the gh-pages branch.
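The additive approach could be sketched roughly like this. All names here are illustrative assumptions: the generator is assumed to write into `generated-docs/`, the gh-pages checkout is assumed to live in `site/`, and Travis is assumed to expose the tag as `$TRAVIS_TAG`:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Stand-in for the generator output; in CI this directory would already exist.
mkdir -p generated-docs
echo '<html></html>' > generated-docs/index.html

TAG="${TRAVIS_TAG:-v0.0.0}"   # the tag being built (Travis sets TRAVIS_TAG)
SITE=site                     # assumed local checkout of the gh-pages branch

# "Additive" layout: one folder per tag, older folders left untouched.
mkdir -p "$SITE/$TAG"
cp -R generated-docs/. "$SITE/$TAG/"

# The manual push in CI would then be something like:
#   cd "$SITE" && git add "$TAG" \
#     && git commit -m "docs for $TAG" && git push origin gh-pages
```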


klntsky commented Sep 1, 2019

We could git clone the older static files from the gh-pages branch, put the docs for a new tag next to them, and push again using the Travis integration.

Optionally, we could delete the directory corresponding to the oldest tag to keep the size of the repo within bounds.
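The pruning idea might look something like this sketch (directory names are illustrative; `sort -V` and negative `head -n` counts assume GNU coreutils):

```shell
set -euo pipefail

# Stand-in for an existing gh-pages checkout with one directory per tag.
mkdir -p site/v0.1.0 site/v0.2.0 site/v0.3.0 \
         site/v0.4.0 site/v0.5.0 site/v0.6.0

KEEP=5   # how many of the newest tags to retain (illustrative)

# List tag directories oldest-first in version order and delete
# everything except the newest $KEEP of them.
ls -d site/v*/ | sort -V | head -n -"$KEEP" | xargs -r rm -rf
```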


f-f commented Sep 1, 2019

@klntsky

We could git clone the older static files from the gh-pages branch, put the docs for a new tag next to them, and push again using the Travis integration.

Though I think in CI we already have the repo cloned? If this holds, it'd just be a matter of checking out the gh-pages branch at its latest commit and shuffling files around.
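Reusing the existing clone could be done with a second working tree. Below is a sketch under assumptions: a throwaway `demo` repo stands in for the one CI has already cloned, and `v1.0.0` stands in for the tag being built:

```shell
set -euo pipefail

# Stand-in for the repo that CI has already cloned, with a gh-pages branch.
git init -q demo
git -C demo -c user.email=ci@example.com -c user.name=ci \
    commit -q --allow-empty -m init
git -C demo branch gh-pages
mkdir -p demo/generated-docs
echo '<html></html>' > demo/generated-docs/index.html

# Check out gh-pages into a second working tree alongside the main one,
# then shuffle the freshly generated docs into a per-tag folder.
TAG=v1.0.0   # stand-in for the tag being built
git -C demo worktree add -q pages gh-pages
mkdir -p "demo/pages/$TAG"
cp -R demo/generated-docs/. "demo/pages/$TAG/"
```

Committing and pushing from `demo/pages` would then update gh-pages without ever disturbing the main checkout.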

Optionally, we could delete the directory corresponding to the oldest tag to keep the size of the repo within bounds.

As far as I know this would require manipulating git history, since even if you delete the content it stays recorded there and affects repo size.

I would say we should not try to optimize for size for now, because we can migrate to something else (e.g. S3 or just whatever cheap VPS) if this gets out of hand.
GitHub has a soft limit of 1 GB on repo sizes (and a 100 GB hard limit), so we have plenty of time to figure out whether we have to migrate. I'm discouraging optimization because the cost of a VPS is totally dwarfed by the cost of the engineering time it would take to optimize this stuff.


klntsky commented Sep 1, 2019

As far as I know this would require manipulating git history, since even if you delete the content it stays recorded there and affects repo size.

We have keep_history: true in .travis.yml for now, but this could be changed.
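For reference, the relevant fragment of the Travis GitHub Pages deploy config looks roughly like this (a sketch: `local_dir` and the token variable name are assumptions about our setup, not copied from the repo):

```yaml
deploy:
  provider: pages
  skip_cleanup: true
  keep_history: true        # set to false to force-push, discarding old history
  local_dir: generated-docs # assumed output directory
  github_token: $GITHUB_TOKEN # assumed secret name
  on:
    tags: true
```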
