Consider caching Utf8JsonWriter and PooledByteBufferWriter on a thread static in JsonSerializer.Serialize #69889
Comments
Tagging subscribers to this area: @dotnet/area-system-text-json, @gregsdennis
I'm running this performance test after this change and I'm not seeing a big difference 🤔. Command:
Could it be that the benchmark uses one of the async methods? No attempt to cache is being made there.
Why not? I missed that in the PR... (well, I know why it's not thread local for that case 😄).
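For readers following along, a minimal illustration of that parenthetical, not the actual System.Text.Json code and with hypothetical names: a `[ThreadStatic]` cache only pays off while execution stays on one thread, and an async serialize call can resume on a different thread-pool thread after any await.

```csharp
using System;
using System.IO;
using System.Text.Json;
using System.Threading.Tasks;

internal static class AsyncCacheSketch
{
    [ThreadStatic]
    private static Utf8JsonWriter? t_writer; // per-thread cached writer

    public static async Task SerializeAsync(Stream stream, object value)
    {
        // Take (or create) this thread's cached writer and point it at the current stream.
        Utf8JsonWriter writer = t_writer ?? new Utf8JsonWriter(stream);
        t_writer = null;
        writer.Reset(stream);

        JsonSerializer.Serialize(writer, value);
        await writer.FlushAsync(); // the continuation may resume on a different thread-pool thread

        // Storing the writer back now writes into *that* thread's slot, so the cache
        // no longer lines up with the thread that originally rented the writer.
        t_writer = writer;
    }
}
```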
I figured it might not be as important given the added overhead of state machines. The benefits in the sync case seem fairly modest and only register for trivially sized object graphs (i.e., those of depth < 2).
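For context, the payload in the TechEmpower JSON scenario is roughly a single-property object, i.e. exactly the kind of depth-1 graph mentioned above. The type below is a stand-in, not the benchmark's actual type:

```csharp
using System.Text.Json;

// Roughly the shape of the TechEmpower JSON payload: a single property, depth-1 graph.
public sealed class JsonMessage
{
    public string Message { get; set; } = "Hello, World!";
}

// Example: byte[] payload = JsonSerializer.SerializeToUtf8Bytes(new JsonMessage());
```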
That's pretty reasonable. I think in the benchmark that could be synchronous because the object is tiny and will fit in the buffer. It's more problematic in general for 2 reasons:
Description
Looking into the allocation profile for our JSON TechEmpower benchmark, the allocation overhead from the JsonSerializer infrastructure itself stands out.
Configuration
Run this crank command.
This will run the JsonHttps benchmark and it'll spit out an allocation profile that you can view in perfview (alloc tick events).
Regression?
No, it's not.
Data
Analysis
We have more infrastructure allocation overhead than we do serializing the actual object.
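A minimal sketch of the caching idea in the issue title, assuming a reusable per-thread writer and scratch buffer. `SerializeScratch` and the field names are hypothetical, `ArrayBufferWriter<byte>` stands in for the internal `PooledByteBufferWriter`, writer options (e.g. indentation) are ignored, and the real change may differ:

```csharp
using System;
using System.Buffers;
using System.Text.Json;

internal static class SerializeScratch
{
    [ThreadStatic]
    private static Utf8JsonWriter? t_writer;

    [ThreadStatic]
    private static ArrayBufferWriter<byte>? t_buffer;

    // Not reentrant: a recursive call on the same thread would need a fresh writer/buffer.
    public static byte[] SerializeToUtf8Bytes<T>(T value, JsonSerializerOptions? options = null)
    {
        ArrayBufferWriter<byte> buffer = t_buffer ??= new ArrayBufferWriter<byte>(initialCapacity: 16 * 1024);
        Utf8JsonWriter writer = t_writer ??= new Utf8JsonWriter(buffer);

        try
        {
            writer.Reset(buffer);                 // re-target the cached writer at the scratch buffer
            JsonSerializer.Serialize(writer, value, options);
            writer.Flush();
            return buffer.WrittenSpan.ToArray();  // only the final byte[] is allocated per call
        }
        finally
        {
            buffer.Clear();                       // empty the scratch buffer for the next call on this thread
        }
    }
}
```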