target directory can grow without bound #10589
Comments
https://crates.io/crates/cargo-sweep is a third-party crate that tries to provide more control over cleaning the target directory. See also #7150. One issue in particular that it points to is #5026, which I suspect this issue would be considered a duplicate of.
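For anyone landing here from a search: a typical cargo-sweep invocation looks roughly like the sketch below. The flags shown are my understanding of the tool's interface; check `cargo sweep --help` for what your installed version actually supports.

```shell
# Install the third-party tool (not part of cargo itself)
cargo install cargo-sweep

# Delete build artifacts in ./target not used in the last 30 days
cargo sweep --time 30

# Or: keep only artifacts produced by currently installed toolchains
cargo sweep --installed
```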
Yea, I'm going to close this as a duplicate of those issues. We've been looking at using a database to track cache directory contents and to be able to age out unused files.
Problem
Empirically, my `target` directory for each project seems to grow continuously. If things are being deleted, this is not happening for me, or not reliably, or not enough.
Steps
Unfortunately I can't provide a sensible repro. It is easy to repro that the `target/` directory can contain old build objects, but that is expected. I don't have a reliable repro for `target/` becoming obviously overly large, as this seems to happen gradually.

My observations are along these lines:
`--locked`. I'm not 100% sure, but I suspect this "giant target directory" problem can occur without me making changes to dependency versions, and, I think more confidently, without me making many such changes.

Most recently I found that the `target` directory for the primary tree on my laptop of my personal project otter, which takes 6G for a clean build, had got to 114G. That would seem to imply that cargo had cached around 20 old versions of at least some build artifacts.

Instead of providing a repro recipe, I could very easily wait for this to happen again and then provide some kind of summary or inventory of what is in the `target` directory, if someone would give me the right runes to type to extract relevant metadata.

A wrinkle about my workflow that may be unusual is that I do most of my builds "out of tree", ie using
`--manifest-path` to point to source code elsewhere. (The source code is not writeable by the build user.) I have found that many Rust ecosystem packages' build and test scripts do not work properly out-of-tree, and then I resort to linkfarming; but this ever-growing `target` seems to happen either way.

I also notice that, despite all this caching, changes to the Rust compiler version, or to flags set in the config, cause a complete rebuild of everything. So if I need to work on a project which wants to test with multiple rustc versions, I end up doing a lot of rebuilding, or having to use two working trees. But I think this latter is a separate issue.
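As a stopgap for the "inventory" idea above, ordinary filesystem tools can already summarize a bloated target directory. These are generic GNU coreutils/findutils commands, not cargo functionality; the `demo-target` directory below is a throwaway stand-in for a real `target/`, created purely so the commands have something to run against.

```shell
# Illustrative stand-in for a real target/ directory
mkdir -p demo-target/debug/deps
head -c 4096 /dev/zero > demo-target/debug/deps/libfoo-0123abcd.rlib
head -c 1024 /dev/zero > demo-target/debug/deps/libfoo-89efcdab.rlib

# Disk usage per top-level subdirectory
du -sh demo-target/*

# Largest files first: size in bytes, then path (GNU find)
find demo-target -type f -printf '%s %p\n' | sort -rn | head -20
```

On a real 114G `target`, the `find` listing would make it easy to spot multiple stale hashed copies (the `-<hash>` suffix) of the same artifact.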
Possible Solution(s)
Put a limit on the number of old versions of things to be kept. (Keep the newest, or use a biased random cache eviction algorithm.)
Allow the user to provide a configuration option limiting the maximum size of a `target` directory, and when that is reached, do some kind of more aggressive cleanup.
Notes
This has been happening to me for some time, possibly "forever". I tend to run multiple Rust versions.
In the most recent case where this happened, I think I was predominantly (if not entirely) using the Rust version (and therefore the cargo version) I quote below.
Thanks for your attention.
Version