performance regression #26229
If necessary, we can do a quick revert.
Running the module benchmarks with:

```console
$ node benchmark/compare.js --old ./node-8375c706 --new ./node-d345b0d --set n=500 module > compare.csv
[00:04:16|% 100| 1/1 files | 60/60 runs | 4/4 configs]: Done
$ cat compare.csv | Rscript benchmark/compare.R
                                                                 confidence improvement accuracy (*)    (**)    (***)
module/module-loader.js useCache='false' fullPath='false' n=500                  2.30 %       ±7.01%  ±9.36% ±12.22%
module/module-loader.js useCache='false' fullPath='true' n=500           **     11.13 %       ±6.60%  ±8.79% ±11.47%
module/module-loader.js useCache='true' fullPath='false' n=500                  10.26 %      ±17.67% ±23.51% ±30.60%
module/module-loader.js useCache='true' fullPath='true' n=500           ***    -38.15 %      ±10.97% ±14.60% ±19.01%

Be aware that when doing many comparisons the risk of a false-positive
result increases. In this case there are 4 comparisons, you can thus
expect the following amount of false-positive results:
  0.20 false positives, when considering a 5% risk acceptance (*, **, ***),
  0.04 false positives, when considering a 1% risk acceptance (**, ***),
  0.00 false positives, when considering a 0.1% risk acceptance (***)
$
```
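As an aside, the expected false-positive counts that compare.R warns about are just the number of comparisons multiplied by the significance level. A quick sketch of that arithmetic (plain math, no assumptions about compare.R internals):

```javascript
// Expected false positives = (number of comparisons) × (significance level).
// With the 4 benchmark configurations compared above:
const comparisons = 4;
const levels = [
  { stars: '*', risk: 0.05 },
  { stars: '**', risk: 0.01 },
  { stars: '***', risk: 0.001 },
];
for (const { stars, risk } of levels) {
  // Matches the 0.20 / 0.04 / 0.00 figures printed by compare.R.
  console.log(`${(comparisons * risk).toFixed(2)} expected false positives at ${risk * 100}% (${stars})`);
}
```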
I'm seeing a regression (to a lesser extent) in my own non-Node-core use related to repeatedly clearing the require cache.
TL;DR: V8 uses a compilation cache to speed up parsing/compiling the same source string. This cache is keyed only on the source string and is transparent to the embedder. We cannot use this cache for the code path introduced by the change. Note that this compilation cache is not to be confused with the code cache that is stored on disk and needs to be managed by the embedder.

This particular test benefits a lot from the compilation cache because it compiles the same script over and over again. With a pre-regression build I get this:

```console
$ time out/Release/node test/pummel/test-crypto-timing-safe-equal-benchmarks.js

real	0m1.702s
user	0m1.985s
sys	0m0.401s
$ time out/Release/node --no-compilation-cache test/pummel/test-crypto-timing-safe-equal-benchmarks.js

real	0m4.488s
user	0m4.742s
sys	0m0.564s
```

I don't think this test represents real-world conditions, so how about we just fix the test to no longer time out?
If we decide that's the approach to take, we'll probably also want to adjust the benchmark. (See #26229 (comment).)
Making the test not time out but still be reliable may be rather challenging. See #8744. If we decide this is the way to go, perhaps we can split the test file into two tests: one that runs the timing-safe test and one that runs the not-timing-safe Buffer stuff as the smoke test.
Fix (hopefully) for the test: #26237
just to sum this up: it's only a regression for people who depended on V8 caching the compilation of source text when they repeatedly cleared the require cache? if that's correct, this regression seems absolutely fine to me. obviously if there's some way to fix it we should, but I don't think this is a serious problem.
Yup. That's why I don't think we need to fix anything in the implementation.
If this is a so-called nothingburger, perhaps we should fix the benchmark to not report this as a problem? I'm not sure if that means removing the one variant in the benchmark or adjusting the benchmark. See #26229 (comment).
I'm going to close this, but feel free to re-open or comment if anyone thinks that's premature. Thanks, everyone.
Resetting `require.cache` to `Object.create(null)` each time rather than deleting the specific key results in a 10x improvement in running time.

Fixes: nodejs#25984
Refs: nodejs#26229
Using `eval()` rather than `require()`'ing a fixture and deleting it from the cache results in a roughly 10x improvement in running time.

Fixes: nodejs#25984
Refs: nodejs#26229
PR-URL: nodejs#26237
Reviewed-By: Joyee Cheung <joyeec9h3@gmail.com>
Reviewed-By: Colin Ihrig <cjihrig@gmail.com>
Reviewed-By: Anna Henningsen <anna@addaleax.net>
Reviewed-By: Yang Guo <yangguo@chromium.org>
Reviewed-By: Ruben Bridgewater <ruben@bridgewater.de>
Reviewed-By: Teddy Katz <teddy.katz@gmail.com>
Reviewed-By: Richard Lau <riclau@uk.ibm.com>
@hashseed I realize this is an old comment, but is this something we can control from JS? "Embedder" sounds like a lower level than what user code can reach. Does it relate to #24069? I can of course open a new issue if that's better 👍
Unfortunately, it appears that one of the commits in #21573 (d345b0d) has resulted in a significant performance regression. `test/pummel/test-crypto-timing-safe-equal-benchmarks.js` now takes about twice as long to run and is timing out in the nightly CI. This may be an edge-case thing that we don't care about, but I don't actually know.

Refs: #26216
/ping @ryzokuken