Caching is a frequently discussed issue. It would save a lot of resources if project dependencies were not rebuilt on every run, but the pipeline could instead resume from a certain point.
Suggested solution
There is already a feature request in #758 that might aim at something similar, but it sounds focused on build results, not on prepare steps.
My proposal is the following: every pipeline stage gets an optional "refresh: " parameter that defines how often it needs to be rebuilt. For example, when dependencies change frequently, you can specify "refresh: 1d"; if your prepare step rarely changes, you can rebuild it every 14d.
If a step installs system requirements (e.g. git, nodejs, your compiler, ...), you could specify a rebuild every 30d, while the next step, which pulls your project-specific dependencies (npm, Go packages, etc.), is refreshed every 12h.
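As a sketch, a pipeline using the proposed parameter could look like this (the `refresh:` key does not exist today; step names, images, and intervals are made up for illustration):

```yaml
pipeline:
  prepare-system:
    image: debian:bookworm
    commands:
      - apt-get update && apt-get install -y git nodejs
    refresh: 30d   # proposed: reuse this step's container state for up to 30 days

  fetch-deps:
    image: node:20
    commands:
      - npm ci
    refresh: 12h   # dependencies change more often, so refresh twice a day

  build:
    image: node:20
    commands:
      - npm run build
    # no refresh: this step always runs
```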
The container state created at this point would then be cached for the specified time, perhaps by simply tagging the Docker image, just like Docker builds are cached as long as the context and the Dockerfile don't change.
The big question is how the clone step would fit in. Probably by running the prepare steps first and cloning the project afterwards, or by cloning into a mount at the beginning and keeping it even across different containers.
I find myself having to create special-purpose Docker containers with base dependencies for CI to run faster.
I've always wished I could keep the instructions in the CI YAML file but have the stages cached.
Another wish is a way to cache package downloads/compilations (e.g. for Cargo), since those take a long time on my CI and outweigh the build time of my project (also wasting energy/time/...). I tried setting up MinIO and sccache, but it was pretty clunky and didn't work that well. I wish there was a nicely integrated built-in method.