[Perf] Windows/x64: Regressions in System.Text.RegularExpressions.Tests.Perf_Regex_Common 2/28/2024 9:23:39 AM #99318
Comments
Run Information
Regressions in System.Text.RegularExpressions.Tests.Perf_Regex_Common
Repro:
  General docs link: https://github.com/dotnet/performance/blob/main/docs/benchmarking-workflow-dotnet-runtime.md
  git clone https://github.com/dotnet/performance.git
  py .\performance\scripts\benchmarks_ci.py -f net8.0 --filter 'System.Text.RegularExpressions.Tests.Perf_Regex_Common*'
Payloads: System.Text.RegularExpressions.Tests.Perf_Regex_Common.Backtracking(Options: Compiled)
Links: ETL Files, Histogram, JIT Disasms, Docs (Profiling workflow for dotnet/runtime repository)

Run Information
Regressions in Benchstone.BenchF.Whetsto
Repro:
  General docs link: https://github.com/dotnet/performance/blob/main/docs/benchmarking-workflow-dotnet-runtime.md
  git clone https://github.com/dotnet/performance.git
  py .\performance\scripts\benchmarks_ci.py -f net8.0 --filter 'Benchstone.BenchF.Whetsto*'
Payloads: Benchstone.BenchF.Whetsto.Test
Links: ETL Files, Histogram, JIT Disasms, Docs (Profiling workflow for dotnet/runtime repository)
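The repro steps quoted in the two reports above can be collected into a small script. This is a sketch: the clone URL, driver path, and filter strings are taken from the issue; the dry-run wrapper is added here so the commands can be inspected before a real run (which needs git and the .NET SDK).

```shell
#!/bin/sh
# Sketch of the repro workflow from the reports above.
# DRY_RUN=1 only prints each command; set it to 0 to actually run them.
set -f   # disable globbing so the benchmark filters stay literal
DRY_RUN=1

run() {
    if [ "$DRY_RUN" = "1" ]; then
        echo "$@"   # dry run: print the command instead of executing it
    else
        "$@"
    fi
}

# Clone the benchmark driver repository.
run git clone https://github.com/dotnet/performance.git

# Filters quoted from the two regression reports. 'py' is the Windows
# Python launcher; substitute 'python3' on Linux/macOS.
for f in 'System.Text.RegularExpressions.Tests.Perf_Regex_Common*' \
         'Benchstone.BenchF.Whetsto*'; do
    run py ./performance/scripts/benchmarks_ci.py -f net8.0 --filter "$f"
done
```

With `DRY_RUN=1` the script only echoes the `git clone` and the two `benchmarks_ci.py` invocations, one per regressed suite.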
This is the range of commits, but nothing seems to jump out; we are seeing this regression across all of our configurations.
Perhaps there is more than one reason for these regressions. Given there is a LINQ regression, I think this change should be considered: e101ae2
Tagging subscribers to this area: @dotnet/area-system-linq

Issue Details

Run Information
Regressions in System.Linq.Tests.Perf_Enumerable
Repro:
  General docs link: https://github.com/dotnet/performance/blob/main/docs/benchmarking-workflow-dotnet-runtime.md
  git clone https://github.com/dotnet/performance.git
  py .\performance\scripts\benchmarks_ci.py -f net8.0 --filter 'System.Linq.Tests.Perf_Enumerable*'
Payloads: System.Linq.Tests.Perf_Enumerable.ElementAt(input: IList)
Links: ETL Files, Histogram, JIT Disasms, Docs (Profiling workflow for dotnet/runtime repository)
This could be related to the ElementAt test, and I can take a look at that one to see if I can repro. I don't think it could be related to the other two.
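For anyone following along, narrowing the repro to just the suspect benchmark is a matter of tightening the `--filter` argument. A sketch (assumes dotnet/performance is already cloned as in the repro steps above; the `echo` keeps this a dry run, since a real run needs the .NET SDK):

```shell
#!/bin/sh
# Narrower repro: target only the regressed LINQ ElementAt benchmark.
FILTER='System.Linq.Tests.Perf_Enumerable.ElementAt*'

# Build the command as a string; 'py' is the Windows Python launcher,
# so use 'python3' on non-Windows hosts. Drop the echo to run for real.
CMD="py ./performance/scripts/benchmarks_ci.py -f net8.0 --filter $FILTER"
echo "$CMD"
```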
Presumably fixed by #99437.
@eiriktsarpalis, what about the other tests?
I hadn't noticed that more regressions had been appended by the bot as a comment. Is that common?
All tests look to be back in normal ranges. |
Run Information
Regressions in System.Linq.Tests.Perf_Enumerable
Test Report
Repro:
  General docs link: https://github.com/dotnet/performance/blob/main/docs/benchmarking-workflow-dotnet-runtime.md
Payloads: Baseline, Compare
Benchmark: System.Linq.Tests.Perf_Enumerable.ElementAt(input: IList)
Links: ETL Files, Histogram, JIT Disasms, Docs (Profiling workflow for dotnet/runtime repository; Benchmarking workflow for dotnet/runtime repository)