Implement a heuristic for using fast-path serialization in streaming JsonSerializer methods. #78646
Conversation
Tagging subscribers to this area: @dotnet/area-system-text-json, @gregsdennis
Issue Details: This PR refactors the root-level serialization methods so that source generated fast-path serialization delegates can be utilized in more places, including object values and in the streaming serialization methods. A size heuristic gates fast-path use in the streaming methods to prevent excessive buffering, since the fast-path serializer doesn't support streaming.
Minor questions, but LGTM overall.
We just merged dotnet/performance#2740, adding source gen benchmarks.
supportAsync: true);

using var bufferWriter = new PooledByteBufferWriter(Options.DefaultBufferSize);
using var writer = new Utf8JsonWriter(bufferWriter, Options.GetWriterOptions());
Has any thought been put into making the cache use a ConcurrentQueue instead, so it can be used in async paths?
Example: https://github.com/dotnet/aspnetcore/blob/main/src/Servers/Kestrel/Transport.Sockets/src/Internal/SocketSenderPool.cs
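For illustration, here is a minimal sketch of the kind of ConcurrentQueue-based pool being suggested, loosely modeled on the linked SocketSenderPool. `Utf8JsonWriterPool` and its size cap are hypothetical names for this sketch, not code from this PR:

```csharp
using System.Buffers;
using System.Collections.Concurrent;
using System.Text.Json;
using System.Threading;

// Hypothetical pool: a bounded ConcurrentQueue so writers can be rented and
// returned from async code paths, unlike a [ThreadStatic] cache that only
// helps synchronous callers. Not the implementation in this PR.
internal sealed class Utf8JsonWriterPool
{
    private const int MaxPooled = 128; // arbitrary cap chosen for the sketch
    private readonly ConcurrentQueue<Utf8JsonWriter> _writers = new();
    private int _count;

    public Utf8JsonWriter Rent(IBufferWriter<byte> output, JsonWriterOptions options)
    {
        if (_writers.TryDequeue(out Utf8JsonWriter? writer))
        {
            Interlocked.Decrement(ref _count);
            writer.Reset(output); // rebind the pooled writer to the new destination
            return writer;        // note: Reset keeps the writer's original options
        }

        return new Utf8JsonWriter(output, options); // pool miss: allocate
    }

    public void Return(Utf8JsonWriter writer)
    {
        if (Interlocked.Increment(ref _count) > MaxPooled)
        {
            Interlocked.Decrement(ref _count);
            writer.Dispose(); // pool is full, drop the instance
            return;
        }

        _writers.Enqueue(writer);
    }
}
```

A real pool would also need to account for writers created with different JsonWriterOptions, since Reset rebinds only the output, not the options.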
Given that the perf benefits are fairly modest when doing thread-local caching in the sync methods, my expectation is that any mechanism requiring more elaborate synchronization would likely offer diminishing returns.
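For contrast, a minimal sketch of the thread-local caching referred to here; `ThreadLocalWriterCache` is a made-up name for illustration, not the PR's actual code:

```csharp
using System.Buffers;
using System.Text.Json;

// Illustrative [ThreadStatic] cache: cheap (no synchronization), but only
// valid on synchronous paths, because an async method can resume on a
// different thread than the one that rented the writer.
internal static class ThreadLocalWriterCache
{
    [ThreadStatic]
    private static Utf8JsonWriter? t_cachedWriter;

    public static Utf8JsonWriter RentOrCreate(IBufferWriter<byte> output, JsonWriterOptions options)
    {
        Utf8JsonWriter? writer = t_cachedWriter;
        if (writer is null)
        {
            return new Utf8JsonWriter(output, options);
        }

        t_cachedWriter = null;  // take ownership so reentrant calls don't share it
        writer.Reset(output);
        return writer;
    }

    public static void Return(Utf8JsonWriter writer) => t_cachedWriter = writer;
}
```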
The big gain we've seen in our JSON web benchmarks is that reducing the allocations of the Utf8JsonWriter (i.e. pooling them) greatly reduces the working set of the application, from over 400 MB to ~60 MB for a test run doing millions of RPS.
Do we know how much of that would also be achieved by changing the GC configuration to be more aggressive, rather than letting it expand to the available memory?
@BrennanConroy could we do a run of the JSON middleware benchmark with Workstation GC to compare?
Could you share the benchmark code? If run on .NET 7, I would expect any perf issues related to Utf8JsonWriter allocations to have dissipated because of #73338.
https://github.com/aspnet/Benchmarks/blob/main/src/Benchmarks/Middleware/JsonMiddleware.cs#L56
It's using the Stream methods, which weren't affected until this PR.
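A simplified sketch of the call that benchmark makes per request (not its exact code; `JsonMessage` and `AppJsonContext` are stand-ins for the benchmark's types):

```csharp
using System.IO;
using System.Text.Json;
using System.Text.Json.Serialization;

// A MemoryStream stands in for the response body stream the middleware writes to.
using var stream = new MemoryStream();

// The Stream-based overload: previously it always took the streaming path;
// with source-gen metadata and the new heuristic it can use the fast path.
await JsonSerializer.SerializeAsync(
    stream,
    new JsonMessage { Message = "Hello, World!" },
    AppJsonContext.Default.JsonMessage);

public struct JsonMessage
{
    public string Message { get; set; }
}

[JsonSerializable(typeof(JsonMessage))]
public partial class AppJsonContext : JsonSerializerContext { }
```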
What about the variant that caches Utf8JsonWriter instances?
Still the streaming API, just the synchronous one: aspnet/Benchmarks#1772
The Utf8JsonWriter overload is not doing streaming serialization; it will try to write everything to the destination buffer in one go. Like I said, this is not an apples-to-apples comparison: streaming mode requires more work and calls into a "slow path" for converters that do support streaming.
Using Utf8JsonWriter instead of Stream is still a valid approach, but it trades off performance between small and large JSON payloads. Hopefully the changes implemented in this PR let you have the best of both worlds by just calling into the streaming APIs (provided you're using source gen).
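To make the comparison concrete, here is a hedged sketch of the two call shapes being contrasted; `Payload` and `PayloadContext` are hypothetical stand-ins for an app's source-generated types:

```csharp
using System.Buffers;
using System.IO;
using System.Text.Json;
using System.Text.Json.Serialization;

var payload = new Payload { Message = "hello" };
using var destination = new MemoryStream(); // stands in for the real output stream

// Utf8JsonWriter overload: the whole value is written into the buffer in one
// go, so a large payload implies a correspondingly large intermediate buffer.
var buffer = new ArrayBufferWriter<byte>();
using (var writer = new Utf8JsonWriter(buffer))
{
    JsonSerializer.Serialize(writer, payload, PayloadContext.Default.Payload);
}
destination.Write(buffer.WrittenSpan);

// Stream overload: flushes in chunks for large payloads, and after this PR can
// still hit the source-gen fast path when serializations stay small.
await JsonSerializer.SerializeAsync(destination, payload, PayloadContext.Default.Payload);

public class Payload
{
    public string? Message { get; set; }
}

[JsonSerializable(typeof(Payload))]
public partial class PayloadContext : JsonSerializerContext { }
```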
The recursion would happen if a gshared type contained a recursive reference to itself. It was triggered by the JsonTypeInfo:JsonTypeInfo<Queue<T>> field added by dotnet#78646. Fixes dotnet#79279.
This PR refactors the root-level serialization methods so that source generated fast-path serialization delegates can be utilized in more places, including object values and in the streaming serialization methods. Specifically for the streaming methods, a heuristic is implemented where fast-path serialization only kicks in for a given type when its serializations are expected to fit within JsonSerializerOptions.DefaultBufferSize / 2 bytes. This is to prevent excessive buffering, since the fast-path serializer doesn't support streaming.
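A rough sketch of the shape of such a size-gated check follows; it is illustrative only (`FastPathGate` is a made-up name) and not the actual implementation in the PR:

```csharp
using System.Text.Json;

// Illustrative gate: allow the fast path while observed payloads stay under
// half of JsonSerializerOptions.DefaultBufferSize; once a serialization
// exceeds the threshold, fall back to (and stay on) streaming serialization.
internal sealed class FastPathGate
{
    private readonly int _threshold;
    private bool _canUseFastPath = true; // optimistic until a large payload is seen

    public FastPathGate(JsonSerializerOptions options)
        => _threshold = options.DefaultBufferSize / 2;

    public bool CanUseFastPath => _canUseFastPath;

    // Called after each serialization with the number of bytes produced.
    public void RecordPayloadSize(int bytesWritten)
    {
        if (bytesWritten > _threshold)
        {
            _canUseFastPath = false;
        }
    }
}
```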
Performance
The change shows notable performance improvements in streaming serialization workloads, including in the following benchmarks:
- WriteJson_LoginViewModel
- WriteJson_LargeStructWithProperties
- WriteJson_Dictionary
- WriteJson_ImmutableDictionary
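For reference, roughly the shape of one of these streaming write benchmarks (a sketch, not the dotnet/performance source; `LoginViewModel` and `MyJsonContext` here are stand-ins for the real benchmark types):

```csharp
using System.IO;
using System.Text.Json;
using System.Text.Json.Serialization;
using System.Threading.Tasks;
using BenchmarkDotNet.Attributes;

[MemoryDiagnoser]
public class WriteJsonStreamBenchmark
{
    private readonly LoginViewModel _value = new()
    {
        Email = "name@example.com",
        Password = "abcdefgh123456!@",
        RememberMe = true,
    };

    private readonly MemoryStream _stream = new();

    [Benchmark]
    public async Task WriteJson_LoginViewModel()
    {
        _stream.Position = 0; // reuse the same destination each iteration
        await JsonSerializer.SerializeAsync(_stream, _value, MyJsonContext.Default.LoginViewModel);
    }
}

public class LoginViewModel
{
    public string? Email { get; set; }
    public string? Password { get; set; }
    public bool RememberMe { get; set; }
}

[JsonSerializable(typeof(LoginViewModel))]
public partial class MyJsonContext : JsonSerializerContext { }
```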