Reuse LTO products for incremental builds when deps are unchanged #71850
Comments
To be clear: you're describing a scenario where your current crate is changed, but the dependencies are unchanged, right?
@pnkfelix Yep: changing some of the crate's own code, with the deps and the LTO setting unchanged, still takes a while to compile incrementally.
For fat LTO, caching is not possible: fat LTO basically works by merging the bitcode for all codegen units into a single LLVM module and then optimizing that as a whole. For ThinLTO, AFAIK we already reuse all optimized codegen units which LLVM tells us can be reused:
rust/compiler/rustc_codegen_llvm/src/back/lto.rs Lines 549 to 558 in 33a2c24
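To make that reuse decision concrete, here is a minimal, self-contained sketch of the idea (a paraphrase only, not the code in lto.rs; the function, type, and path names below are made up for illustration): a codegen unit is copied from the incremental cache only when LLVM's ThinLTO import data reports its inputs as unchanged, otherwise it is re-optimized.

```rust
// Illustrative sketch only, not the actual rustc implementation: planning
// which codegen units can skip ThinLTO re-optimization. A unit is reused
// only if LLVM's import data marks it as unaffected ("green") and a cached
// post-LTO artifact exists for it; everything else is re-optimized.
use std::collections::{HashMap, HashSet};
use std::path::PathBuf;

#[derive(Debug)]
enum Job {
    /// Copy the cached post-LTO object for this codegen unit.
    CopyFromCache(String),
    /// Run ThinLTO optimization for this codegen unit again.
    Optimize(String),
}

fn plan_thinlto_jobs(
    codegen_units: &[String],
    green_units: &HashSet<String>,    // units LLVM reports as unaffected
    cache: &HashMap<String, PathBuf>, // previously produced artifacts
) -> Vec<Job> {
    codegen_units
        .iter()
        .map(|name| {
            if green_units.contains(name) && cache.contains_key(name) {
                Job::CopyFromCache(name.clone())
            } else {
                Job::Optimize(name.clone())
            }
        })
        .collect()
}

fn main() {
    let units = vec!["dep_a-cgu.0".to_string(), "my_crate-cgu.0".to_string()];
    let green: HashSet<String> = ["dep_a-cgu.0".to_string()].into_iter().collect();
    let cache: HashMap<String, PathBuf> =
        [("dep_a-cgu.0".to_string(), PathBuf::from("incr-cache/dep_a-cgu.0.o"))]
            .into_iter()
            .collect();
    // dep_a's unit is reused; the changed crate's unit is re-optimized.
    for job in plan_thinlto_jobs(&units, &green, &cache) {
        println!("{job:?}");
    }
}
```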
Thanks for your clear and well-documented answer!
I see how caching is not possible. Then how about changing how fat LTO is done? Instead of merging and optimizing everything in one step, shouldn't we be doing this over various subsets of the crate tree? I'm thinking of walking the dependency tree from the bottom up, so that parts of the tree that weren't touched can be identified and reused.
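For what it's worth, here is a rough, hypothetical sketch of that bottom-up idea, just to pin down what a "reusable subset" would mean: a subtree's merged-and-optimized artifact could only be reused if neither its root crate nor any of its transitive deps changed. All names and types here are invented for illustration; this is not how rustc's fat LTO works today.

```rust
// Hypothetical sketch of the "merge and optimize subtrees bottom up" idea.
// A subtree's merged+optimized artifact could be reused only if no crate in
// that subtree changed.
use std::collections::HashMap;

struct Crate {
    name: &'static str,
    deps: Vec<&'static str>,
    changed: bool,
}

/// Returns, per crate, whether the merged artifact for the subtree rooted at
/// that crate could be reused (i.e. neither it nor any transitive dep changed).
fn reusable_subtrees(crates: &[Crate]) -> HashMap<&'static str, bool> {
    let by_name: HashMap<&'static str, &Crate> =
        crates.iter().map(|c| (c.name, c)).collect();
    let mut reusable = HashMap::new();

    fn visit(
        name: &'static str,
        by_name: &HashMap<&'static str, &Crate>,
        reusable: &mut HashMap<&'static str, bool>,
    ) -> bool {
        if let Some(&cached) = reusable.get(name) {
            return cached;
        }
        let krate = by_name[name];
        // A subtree is reusable only if all dep subtrees are reusable
        // and the crate itself did not change.
        let deps_ok = krate.deps.iter().all(|&d| visit(d, by_name, reusable));
        let ok = deps_ok && !krate.changed;
        reusable.insert(name, ok);
        ok
    }

    for c in crates {
        visit(c.name, &by_name, &mut reusable);
    }
    reusable
}

fn main() {
    // my_crate depends on dep_a and dep_b; only my_crate changed.
    let crates = vec![
        Crate { name: "dep_a", deps: vec![], changed: false },
        Crate { name: "dep_b", deps: vec![], changed: false },
        Crate { name: "my_crate", deps: vec!["dep_a", "dep_b"], changed: true },
    ];
    for (name, ok) in reusable_subtrees(&crates) {
        println!("{name}: reuse merged subtree artifact = {ok}");
    }
}
```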
What is the benefit of that over ThinLTO?
You can still have optimizations that are affected by parts of the call graph that did change. That is the point of LTO.
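As a concrete (hypothetical) illustration of that point, consider a dependency function that LTO inlines and constant-folds based on how the leaf crate calls it; the two crates are written as modules here only to keep the example self-contained:

```rust
// Hypothetical two-crate example, written as modules so it compiles standalone.
// With LTO the optimizer sees across the crate boundary: it can inline
// `dep::scale` into `app::run` and fold the constant factor, eliminating the
// `factor == 0` branch. If only `app` changes (say, FACTOR becomes 0), the
// optimized code derived from `dep`'s source changes too, even though `dep`
// itself is untouched. That is why `dep`'s LTO products cannot simply be reused.
mod dep {
    #[inline]
    pub fn scale(x: u64, factor: u64) -> u64 {
        // A branch that LTO can remove once `factor` is known at the call site.
        if factor == 0 { 0 } else { x * factor }
    }
}

mod app {
    // Changing this constant changes how `dep::scale` gets optimized under LTO.
    pub const FACTOR: u64 = 2;

    pub fn run(input: u64) -> u64 {
        super::dep::scale(input, FACTOR)
    }
}

fn main() {
    println!("{}", app::run(21));
}
```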
Incremental builds with LTO (`thin` or `fat`) always take the same amount of time (~10s on my project) when dependencies are unchanged. It seems to take as long as when adding a dependency. Is it not possible to cache (at least part of) the LTO computations?
Note I'm using `cross`. Note also this bug I've encountered WRT LTO: cross-rs/cross#416
#71248 is the most recent issue I could find that seems related.