Server Streaming - very poor performance on laggy connections (HttpSys) #1971

Closed
LaurensVergote opened this issue Dec 6, 2022 · 4 comments
Labels: bug (Something isn't working)

LaurensVergote commented Dec 6, 2022

What version of gRPC and what language are you using?

Language: C#
gRPC version:

  • Grpc.AspNetCore (2.50.0)
  • Grpc.AspNetCore.Web (2.50.0)

What operating system (Linux, Windows,...) and version?

Windows 10 Enterprise Version 22H2
OS Build: 19045.2251

What runtime / compiler are you using (e.g. .NET Core SDK version dotnet --info)

.NET version: 6.0.11

What did you do?

If possible, provide a recipe for reproducing the error. Try being specific and include code snippets if helpful.
When simulating a laggy connection, server streaming is extremely slow when "larger" payloads are used (> 4 KB).
In the samples below, I used clumsy to apply a 250 ms lag (inbound and outbound) and a 2% drop (inbound and outbound) on the port I use for my gRPC connection (1080).

In HttpContextSerializationContext.cs we can see that grpc-dotnet uses the .Write() extension method on the PipeWriter. This behaviour is simulated in the reproduction project linked below.
When this method is used in conjunction with HttpSys and a laggy connection, performance takes an enormous hit.
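Roughly speaking, the two patterns being compared look like this (an illustrative sketch only, not the actual code from HttpContextSerializationContext.cs or the repro project):

```csharp
// Illustrative sketch of the two write patterns being compared.
using System;
using System.Buffers;
using System.IO.Pipelines;
using System.Threading.Tasks;

public static class PipeWriterPatterns
{
    // Pattern used by grpc-dotnet: the Write() extension method copies the payload
    // into the PipeWriter's buffer synchronously; data only reaches the server
    // (HTTP.sys or Kestrel) when FlushAsync() is called, which for server streaming
    // happens once per message.
    public static async Task WriteThenFlushAsync(PipeWriter writer, ReadOnlyMemory<byte> payload)
    {
        writer.Write(payload.Span);
        await writer.FlushAsync();
    }

    // Alternative pattern: WriteAsync() copies and flushes in a single call and lets
    // the server's PipeWriter implementation decide how to hand the data to the transport.
    public static async Task WriteAsyncDirect(PipeWriter writer, ReadOnlyMemory<byte> payload)
    {
        await writer.WriteAsync(payload);
    }
}
```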

Some samples recorded locally when requesting a 1 MB payload:

Description | Average speed (kB/s) | Time spent
--- | --- | ---
WriteAsync() + HttpSys | 452 | 0:02
Write() + HttpSys | 6.8 | 2:32
WriteAsync() + Kestrel | 254 | 0:04
Write() + Kestrel | 300 | 0:03

In order to get HttpSys to work you might have to run a netsh command:
netsh http add urlacl url="http://+:1080/" user="DOMAIN\USER"

You can toggle between Kestrel and HttpSys by commenting out the line webBuilder.UseHttpSys(); in Program.cs.
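For reference, the host setup looks roughly like this (a sketch, not the exact Program.cs from the repro):

```csharp
// Sketch of the host setup: commenting out UseHttpSys() makes the app run on Kestrel.
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.Hosting;

public class Program
{
    public static void Main(string[] args) => CreateHostBuilder(args).Build().Run();

    public static IHostBuilder CreateHostBuilder(string[] args) =>
        Host.CreateDefaultBuilder(args)
            .ConfigureWebHostDefaults(webBuilder =>
            {
                webBuilder.UseHttpSys();               // HTTP.sys; comment out to use Kestrel
                webBuilder.UseUrls("http://+:1080");   // port targeted by the curl commands below
                webBuilder.UseStartup<Startup>();
            });
}
```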

Once the application is running, you can use curl to access the API:
curl -o /dev/null 'http://localhost:1080/blobSync?writeSize=1M'
curl -o /dev/null 'http://localhost:1080/blobAsync?writeSize=1M'

The difference should be obvious.
This sample uses localhost only (which exaggerates the problem a bit), but it should be easy enough to run the server and the curl command on different machines. The performance difference will still be very noticeable.
I do know that Kestrel is the preferred server for gRPC, but due to some legacy on our side we have to use HttpSys for the foreseeable future.
Additionally, in Startup.cs you will find a boolean, private bool EnableHack = false;, which activates a "hack" that uses reflection to increase the internal PipeWriter's minimum buffer size. While this does help performance, it is still not on par with what we see when using PipeWriter.WriteAsync() instead.
I'm aware the sample code does not actively use gRPC, but I believe this simulates the behaviour best and is easiest to follow.

What did you expect to see?

Acceptable performance (as seen when using WriteAsync()) when the user has chosen HttpSys.

What did you see instead?

Unacceptably slow performance when using HttpSys.

Anything else we should know about your project / environment?

Reproduction repo: https://github.com/LaurensVergote/HttpSysLaggyPerformance

LaurensVergote added the bug (Something isn't working) label on Dec 6, 2022
LaurensVergote changed the title from "Very poor performance on laggy connections (HttpSys)" to "Server Streaming - very poor performance on laggy connections (HttpSys)" on Dec 7, 2022

adityamandaleeka commented Dec 7, 2022

@LaurensVergote Thanks for the detailed issue. Your analysis makes sense. The server streaming workflow you're using is particularly susceptible to this issue because it will cause a flush to occur on every message.
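For illustration, a typical server-streaming handler looks something like the sketch below; the service and message types here are made up, but each WriteAsync call serializes one message and flushes it to the client:

```csharp
// Illustrative sketch of a server-streaming handler. BlobRequest and BlobChunk
// stand in for proto-generated message types and are not from the repro.
using System.Collections.Generic;
using System.Threading.Tasks;
using Grpc.Core;

public class BlobRequest { public int SizeInKb { get; set; } }
public class BlobChunk { public byte[] Data { get; set; } = new byte[4096]; }

public class BlobStreamingService
{
    // In a real service this method would override the proto-generated base class
    // method; the signature shape is the same.
    public async Task Download(
        BlobRequest request,
        IServerStreamWriter<BlobChunk> responseStream,
        ServerCallContext context)
    {
        foreach (BlobChunk chunk in GetChunks(request))
        {
            // Each WriteAsync serializes one message into the response PipeWriter
            // and flushes it, so a laggy connection pays the flush cost per message.
            await responseStream.WriteAsync(chunk);
        }
    }

    private static IEnumerable<BlobChunk> GetChunks(BlobRequest request)
    {
        // Placeholder chunking: one 4 KB chunk per 4 KB requested.
        for (int i = 0; i < request.SizeInKb / 4; i++)
        {
            yield return new BlobChunk();
        }
    }
}
```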

Can you please create a new GitHub repo with your sample? We are not able to accept zip files.

We have an existing issue in the aspnetcore repo for enabling kernel response buffering for HTTP.sys and it sounds like your example might be a good test case for that: dotnet/aspnetcore#14455

LaurensVergote (Author) commented:

@adityamandaleeka Thanks for the reply. I did see dotnet/aspnetcore#14455 but wasn't entirely sure this was the same thing.
After reading through the comments on that issue, I do believe it is exactly the same problem.

Since the issue on the dotnet/aspnetcore project is already quite old, do you have any idea whether a fix is still in the pipeline? Or do you think this would be better fixed in grpc-dotnet itself, or shelved as legacy with users referred to Kestrel instead?

The original post has been updated with a repo link where you can find a small reproduction project
https://github.com/LaurensVergote/HttpSysLaggyPerformance

adityamandaleeka commented:

Thank you for creating the repo @LaurensVergote! Yes, we still plan to fix that response buffering issue. Just need to find the time 😅.

JamesNK (Member) commented Dec 8, 2022

I'm closing this issue as the fix is in ASP.NET Core http.sys integration. It's not something specific to grpc-dotnet.

JamesNK closed this as completed on Dec 8, 2022