grpc server memory leak #797
Comments
It seems to be caused by hyper.
Seems that I'm stuck with the same issue as well. I tried to debug your example with console-subscriber and found a remarkably increasing number of tasks after some time (~30 secs). After the client finishes, the task count doesn't decrease.
It's a good idea to use console. I'll try to dig into it later.
This is probably a better issue for hyper, so it might be good to create a minimal reproduction.
@LucioFranco by minimal reproduction, do you mean one of the examples? In order to attach the console, I added `console_subscriber::init()` to the helloworld server:

```diff
diff --git a/examples/src/helloworld/server.rs b/examples/src/helloworld/server.rs
index c6398bb..6502e61 100644
--- a/examples/src/helloworld/server.rs
+++ b/examples/src/helloworld/server.rs
@@ -27,6 +27,7 @@ impl Greeter for MyGreeter {
 #[tokio::main]
 async fn main() -> Result<(), Box<dyn std::error::Error>> {
+    console_subscriber::init();
     let addr = "[::1]:50051".parse().unwrap();
     let greeter = MyGreeter::default();
```

The memory usage …
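(A note for anyone reproducing this: `console-subscriber` only emits data when the binary is built with `RUSTFLAGS="--cfg tokio_unstable"`, and the task counts mentioned above are inspected with the separate `tokio-console` CLI, which connects to the instrumented process at `127.0.0.1:6669` by default.)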
Actually, I think I just forgot how to read … My apologies. The memory usage does still seem to grow over time, even with the simple example.
Not sure if this applies to OP, but I think my issue at least was actually due to expected behavior within … Running the example as-is (i.e., without …)
Is there any solution/workaround to this?
Unless someone can actually create a minimal reproducer, I don't think there is actually a memory leak here that needs to be fixed. I'm going to close this issue; feel free to post a reproducer and we can reopen.
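For reference, the kind of minimal reproducer being asked for is essentially a stock helloworld-style tonic server driven hard by several concurrent clients. The sketch below is not the OP's grpc-test code; it is a hypothetical stand-in modeled on tonic's helloworld example (it assumes a `build.rs` that compiles the standard `helloworld.proto` with `tonic-build`):

```rust
// Hypothetical minimal server, modeled on tonic's helloworld example;
// not the original demo code from this issue.
use tonic::{transport::Server, Request, Response, Status};

use hello_world::greeter_server::{Greeter, GreeterServer};
use hello_world::{HelloReply, HelloRequest};

pub mod hello_world {
    // Generated from the standard helloworld.proto shipped with tonic's examples.
    tonic::include_proto!("helloworld");
}

#[derive(Default)]
pub struct MyGreeter {}

#[tonic::async_trait]
impl Greeter for MyGreeter {
    async fn say_hello(
        &self,
        request: Request<HelloRequest>,
    ) -> Result<Response<HelloReply>, Status> {
        // Trivial handler: what matters for the report is memory behavior
        // under sustained concurrent load, not the handler body.
        let reply = HelloReply {
            message: format!("Hello {}!", request.into_inner().name),
        };
        Ok(Response::new(reply))
    }
}

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let addr = "[::1]:50051".parse()?;
    Server::builder()
        .add_service(GreeterServer::new(MyGreeter::default()))
        .serve(addr)
        .await?;
    Ok(())
}
```

Watching resident memory (e.g. in `top`) while several client processes hammer `say_hello` is what the reports above describe.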
Bug Report
Version
Platform
Description
The tonic gRPC server takes a lot of memory under high load and ends up killed by the kernel for OOM. The memory usage didn't drop even after the client stopped.

I made a simple demo with tonic-related code only: https://github.com/whfuyn/grpc-test
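To give a concrete idea of the load pattern involved, a client that keeps many `say_hello` calls in flight at once might look like the hypothetical sketch below (modeled on tonic's helloworld client; the fan-out of 100 concurrent requests per batch is an arbitrary illustrative choice, not taken from the linked demo):

```rust
// Hypothetical load-generating client, modeled on tonic's helloworld example;
// not the code from the linked grpc-test repo.
use hello_world::greeter_client::GreeterClient;
use hello_world::HelloRequest;

pub mod hello_world {
    tonic::include_proto!("helloworld");
}

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // One HTTP/2 connection; each spawned task opens another stream on it.
    let client = GreeterClient::connect("http://[::1]:50051").await?;

    loop {
        let mut tasks = Vec::new();
        for i in 0..100 {
            let mut client = client.clone();
            tasks.push(tokio::spawn(async move {
                let request = tonic::Request::new(HelloRequest {
                    name: format!("client-{i}"),
                });
                // Ignore individual failures; the point is sustained load.
                let _ = client.say_hello(request).await;
            }));
        }
        for task in tasks {
            let _ = task.await;
        }
    }
}
```

Running several instances of something like this at once is what "high load" refers to above.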
Other info:

- It doesn't happen with `#[tokio::main(flavor = "current_thread")]` and a single client instance, but it still occurs when there are multiple clients (2 in my test).
- The problem is mitigated when `max_concurrent_streams` is set, but this results in some client requests timing out (a sketch of this follows the demo code below).

I also post the demo code here:
- proto file
- server
- client (run multiple instances to reproduce this quickly)
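As a rough sketch of the `max_concurrent_streams` mitigation mentioned under "Other info" (the cap of 128 is an arbitrary example value, and `capped_server_builder` is just an illustrative name):

```rust
use tonic::transport::Server;

// Sketch of the mitigation: capping concurrent HTTP/2 streams per connection
// keeps memory bounded under load, but excess requests queue up and may time
// out on the client side, as noted in the report. 128 is an arbitrary cap.
fn capped_server_builder() -> Server {
    Server::builder().max_concurrent_streams(128u32)
}
```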