Cache accessible symbol chains and serialized type parameter name generation #43973
Conversation
src/compiler/checker.ts
Outdated
(context.typeParameterNamesByText || (context.typeParameterNamesByText = new Set())).add(result.escapedText as string);
// avoiding iterations of the above loop turns out to be worth it when `i` starts to get large, so we cache the max
// `i` we've used thus far, to save work later
(context.typeParameterNamesByTextNextNameCount ||= new Map()).set(rawtext, i);
This is the "cache serialized type parameter name generation" part - this additional tiny bit of caching brought the runtime of the example down from 7s to 5.5s on my machine.
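The idea in the diff can be shown as a standalone sketch. This is a hypothetical, simplified version (the real code lives on the checker's `context` object and uses `typeParameterNamesByText`/`typeParameterNamesByTextNextNameCount`): rather than re-scanning candidate names from the start on every request, we remember the highest suffix index used for each base name, so the next search resumes from there.

```typescript
// Hypothetical sketch of the cached fresh-name search; names are assumptions.
const usedNames = new Set<string>();
const nextNameCount = new Map<string, number>();

function getFreshName(rawText: string): string {
    // Start from the cached max `i` for this base name instead of 1.
    let i = nextNameCount.get(rawText) ?? 1;
    let result = rawText;
    while (usedNames.has(result)) {
        i++;
        result = `${rawText}_${i}`;
    }
    usedNames.add(result);
    nextNameCount.set(rawText, i); // cache the max `i` used so far to save work later
    return result;
}
```

With many collisions on the same base name, each call does O(1) probing instead of re-walking all previously rejected suffixes.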
const firstRelevantLocation = forEachSymbolTableInScope(enclosingDeclaration, (_, __, node) => node);
const key = `${useOnlyExternalAliasing ? 0 : 1}|${firstRelevantLocation && getNodeId(firstRelevantLocation)}|${meaning}`;
if (cache.has(key)) {
    return cache.get(key);
This cache could probably be better - intelligently sharing results for common-ish `meaning`s and having a hierarchical cache both sound nice in theory, but are a pain to implement performantly (the last time I tried, I ruined perf in the general case). This relatively simple cache was sufficient to bring the example runtime down from 20s to 7s on my machine, though it is likely to only really affect degenerate cases, like the one in the linked issue.
const preProgram = ts.length(rootFiles) < 100 ? ts.createProgram(rootFiles || [], { ...compilerOptions, configFile: compilerOptions.configFile, traceResolution: false }, host) : undefined;
// pre-emit/post-emit error comparison requires declaration emit twice, which can be slow. If it's unlikely to flag any error consistency issues
// and if the test is running `skipLibCheck` - an indicator that we want the tests to run quickly - skip the before/after error comparison, too
const skipErrorComparison = ts.length(rootFiles) >= 100 || (!!compilerOptions.skipLibCheck && !!compilerOptions.declaration);
Declaration emit for this test still takes 5s on my machine. `getPreEmitDiagnostics` invokes declaration emit to get declaration emit errors (for historical reasons they've been considered "preEmit" errors), meaning declaration emit had to run twice, so 10s. Single-threaded, this was fine, but in high resource contention scenarios (i.e., `runtests-parallel`) this was long enough to stretch to 40s and time out. So I added an opt-out case to the preEmit/preEmit-after-emit error consistency check to keep this test's runtime low (so it could pass without a timeout when running in parallel, and probably on CI).
@@ -0,0 +1,34 @@
// @declaration: true
// @skipLibCheck: true
// @noTypesAndSymbols: true
And I've skipped the type/symbol baselines for this test, since they double the test runtime again (and thus also cause a timeout) by essentially invoking the same logic as declaration emit, for no real additional gain compared to the declaration emit we're already running.
@typescript-bot pack this
Heya @weswigham, I've started to run the tarball bundle task on this PR at 0f4e922. You can monitor the build here.
Heya @weswigham, I've started to run the perf test suite on this PR at 0f4e922. You can monitor the build here. Update: The results are in!
Hey @weswigham, I've packed this into an installable tgz. You can install it for testing by referencing it in your
and then running
@@ -3935,12 +3935,12 @@ namespace ts {
    return typeCopy;
}

function forEachSymbolTableInScope<T>(enclosingDeclaration: Node | undefined, callback: (symbolTable: SymbolTable) => T): T {
function forEachSymbolTableInScope<T>(enclosingDeclaration: Node | undefined, callback: (symbolTable: SymbolTable, ignoreQualification?: boolean, scopeNode?: Node) => T): T {
I don't see anywhere we are passing `ignoreQualification` as true or false?
Also, can we make these parameters non-optional but typed `| undefined` instead, to ensure every callback considers what needs to be passed?
Inside this function, it's not; however, the function we often pass as a callback uses it as a second parameter (and it's optional there) for its other callsites, so this optional parameter is here so the callback parameter is compatible with both that usage and the node-finding usage.
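The compatibility point here follows from how TypeScript checks function assignability: a callback declaring fewer parameters is assignable where one declaring more (optional) parameters is expected. A minimal sketch, with hypothetical names standing in for the checker's types:

```typescript
// Hypothetical stand-in for the callback type in the signature change above.
type Visitor<T> = (table: Map<string, number>, ignoreQualification?: boolean, scopeNode?: string) => T | undefined;

function forEachTable<T>(tables: Map<string, number>[], cb: Visitor<T>): T | undefined {
    for (const t of tables) {
        // The extra arguments are always supplied; callbacks may ignore them.
        const result = cb(t, /*ignoreQualification*/ undefined, /*scopeNode*/ "scope");
        if (result !== undefined) return result;
    }
    return undefined;
}

// A one-parameter callback still satisfies Visitor<T>:
const firstSize = forEachTable([new Map([["a", 1]])], table => table.size);
// A callback that uses the extra parameters (the node-finding usage) also works:
const firstScope = forEachTable([new Map()], (_, __, node) => node);
```

Both call styles type-check against the one `Visitor<T>` signature, which is why widening the declared callback type doesn't break existing one-parameter callers.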
@weswigham Here they are: Comparison Report - master..43973
(System, Hosts, Scenarios, and Developer Information tables omitted)
-3-5% check time in
@typescript-bot pack this
52499b1 added another layer of caching, this one within
Heya @weswigham, I've started to run the perf test suite on this PR at 909d519. You can monitor the build here. Update: The results are in!
Heya @weswigham, I've started to run the tarball bundle task on this PR at 909d519. You can monitor the build here.
Hey @weswigham, I've packed this into an installable tgz. You can install it for testing by referencing it in your
and then running
@weswigham Here they are: Comparison Report - master..43973
(System, Hosts, Scenarios, and Developer Information tables omitted)
@@ -4479,7 +4489,7 @@ namespace ts {
    enclosingDeclaration,
    flags: flags || NodeBuilderFlags.None,
    // If no full tracker is provided, fake up a dummy one with a basic limited-functionality moduleResolverHost
    tracker: tracker && tracker.trackSymbol ? tracker : { trackSymbol: noop, moduleResolverHost: flags! & NodeBuilderFlags.DoNotIncludeSymbolChain ? {
    tracker: tracker && tracker.trackSymbol ? tracker : { trackSymbol: () => false, moduleResolverHost: flags! & NodeBuilderFlags.DoNotIncludeSymbolChain ? {
There exists a `ts.returnFalse` 🙃
Eh, while I understand we use `noop` to reduce memory load, this fallback tracker (along with the other host objects I adjusted) is constructed infrequently enough that it doesn't really matter, I think.
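The trade-off being discussed can be sketched concretely. This is a hypothetical toy, not the checker's code: a shared module-level `returnFalse` function is allocated once and reused, while an inline `() => false` allocates a fresh closure every time the fallback tracker object is built — which only matters if the object is constructed often.

```typescript
// Shared singleton: one function object for the lifetime of the module.
const returnFalse = () => false;

function makeTrackerShared() {
    return { trackSymbol: returnFalse };
}

// Inline arrow: a new function object on every construction.
function makeTrackerInline() {
    return { trackSymbol: () => false };
}
```

Both behave identically at call sites; the difference is purely allocation frequency, hence the "constructed infrequently enough that it doesn't really matter" reply above.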
This can improve declaration emit performance significantly in some edge cases. Together, these caches bring the example in the linked issue down from 20s to 5.5s on my machine, which is a pretty marked improvement.
Fixes #42937