Memory leak when restarting host on aspnet core 5 #31125
Comments
How much memory after how many cycles? To prove it's a leak we'd want to see that it does not eventually plateau or drop after hundreds of cycles. A GC does not guarantee process memory is returned to the OS. Capturing a dump between cycles would be the best way to identify any leaked objects. |
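One way to capture and compare dumps between cycles (a sketch, assuming the `dotnet-dump` global tool is installed and `<pid>` stands in for the app's process id):

```shell
# Capture a dump after a batch of restart cycles
dotnet-dump collect --process-id <pid>

# Open the dump and list object counts by type; comparing two dumps
# taken N cycles apart shows which types grow in step with N
dotnet-dump analyze <dump-file>
> dumpheap -stat
> gcroot <address>   # show what is rooting a suspect object
```

Repeating `dumpheap -stat` across dumps is usually enough to spot a type whose count tracks the number of cycles.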
I added a new endpoint which does not stop the app and made 100 calls to make sure the app was warmed up, then took a snapshot of the memory. Then I ran 100 more calls, every 5 seconds, to the endpoint that stops the app (and it gets restarted) and took another snapshot at the end. This was the result (I don't know if there is an option to export it?). That is with the call to GC. The same approach but without |
There's a known issue where ASP.NET Core doesn't clean up MemoryPoolSlabs #27394, but your issue I'm guessing is because you're holding onto references of the host on the stack in Main. How many host instances are hanging around? Are any of them rooted? Disposing the host should clean up the entire memory pool, leaving the memory to be reclaimed by the GC (eventually). Beyond that, there's more advanced analysis you can do to figure out why memory is being held onto (getting a memory dump after a GC and using dotnet-dump analyze to look at the heap). |
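A minimal sketch of the restart pattern being discussed (illustrative only, not the reporter's actual code), written so that no local in Main keeps the old host instance rooted between cycles:

```csharp
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;

class Program
{
    static async Task Main(string[] args)
    {
        while (true)
        {
            // Building the host inside the loop means the reference goes
            // out of scope at the end of each iteration, so the previous
            // instance is not rooted by a local in Main.
            using IHost host = Host.CreateDefaultBuilder(args).Build();

            // RunAsync returns after IHostApplicationLifetime.StopApplication()
            // is called; the `using` declaration then disposes the host,
            // releasing its memory pool for the GC to reclaim (eventually).
            await host.RunAsync();
        }
    }
}
```

Keeping each host in a per-iteration scope (rather than reassigning a single variable declared outside the loop) is what lets the GC collect the old instance.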
It sounds like you ran hundreds of iterations but the only thing with hundreds of instances in the Count column is RuntimeType. |
I should have ordered it, sorry about that, but this screenshot shows a comparison between 100 calls to stop and restart and then after 100 more. The server started at 80MB and after 200 calls it is at 160MB. @Tratcher You are right, RuntimeType is the one with the most instances in the Count column. Is that ok? No numbers there actually make sense compared to the number of cycles (200 now). I mean it does not show objects hanging around that match the number of calls. @davidfowl it shows only 1 instance of the Host. I'll do what you suggested and get a memory dump after a GC. |
I don't know what RuntimeType is for, but since it says the count diff is going down rather than up, there's not a persistent leak.
The process can reserve more memory than it's currently consuming. Cleaning up objects and releasing memory back to the OS are decoupled concerns. |
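One way to watch those two numbers separately (a sketch, assuming the `dotnet-counters` global tool is installed and `<pid>` stands in for the process id):

```shell
# "GC Heap Size" shows live managed memory; "Working Set" shows what
# the process currently holds from the OS -- the two can diverge, since
# freed managed memory is not necessarily returned to the OS right away.
dotnet-counters monitor --process-id <pid> System.Runtime
```

If the GC heap plateaus while the working set stays high, that is reserved-but-unused memory rather than a managed leak.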
RuntimeType is just the implementation of the abstraction Type. |
After running
|
Is it a one-to-one growth? After 100 cycles you have 100 file watchers? What are the GC roots for the watchers? |
Where are the object counts?
|
@Tratcher It's a 1-1 growth, but I got 101 instances because of the first run (I'm counting cycles as stop and run again).
@davidfowl These are the files, I ordered it by count. |
Maybe related: #3564. Unfortunately I can't find where that issue ended up. |
I'll take a look |
@Leandro-Albano Are you using IIS express or Kestrel? |
I'm running on Kestrel. I just tried IIS Express and it does not seem to work; the restart never happens because an exception is thrown
Yea, we have a leak here. I'm gonna fix it 😄. |
I'd be surprised if IIS in-proc could be restarted like this, it's too tangled up in the IIS state. |
True, IIS would need to unwind to native code. |
@davidfowl #31170 addresses DiagnosticListener, but what about the FileSystemWatcher? |
I have another PR in runtime coming soon |
Fixed |
Awesome job guys, thanks a lot! |
No thank you for finding that bug. It's been there for ages 🤣 |
Describe the bug
After calling IHostApplicationLifetime.StopApplication(), I am trying to build and start the IHost again. Inspecting the diagnostic tools in Visual Studio, I can see that the memory goes up (a bit) every cycle, even after trying to call System.GC.Collect().
To Reproduce
Further technical details
dotnet --info
VS 2019 (16.8.6)