
[ASPNET Core] Memory Usage / GC Thrashing #1270

Closed
ctolkien opened this issue Oct 24, 2019 · 16 comments
@ctolkien

Fairly basic ASP.NET Core application. One minute after booting, sitting there idling, there is a huge amount of GC thrashing and memory is peaking at nearly 600 MB:

[screenshot]

By comparison, with AI not enabled there is no GC thrashing and memory usage sits at 150 MB:

[screenshot]

Only difference is:
services.AddApplicationInsightsTelemetry();
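For reference, a minimal sketch of the shape of Startup this implies (the MVC registration and class scaffolding are assumptions, not taken from the repro; only the AddApplicationInsightsTelemetry call is from this issue):

// Startup.cs - minimal sketch of the assumed repro shape
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddMvc();

        // The only Application Insights specific line; removing it removes
        // the GC thrashing and brings memory back to the ~150 MB baseline.
        services.AddApplicationInsightsTelemetry();
    }

    public void Configure(IApplicationBuilder app)
    {
        app.UseMvc();
    }
}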

@ctolkien
Author

ctolkien commented Oct 25, 2019

Digging in via PerfView:

[screenshot]


@davidwengier

I see the same thing on my machine with that repro.

The GCs are caused by LOH allocations, which in my case come from System.Diagnostics.PerformanceMonitor.PerformanceCounterLib.GetData(). The byte array coming back is huge:
[screenshot]
(RegistryKey.cs also initializes a few more arrays that end up on the LOH on its way to getting one that big, which is compounding the problem)

The item in the screenshot is "230", which is the Process counter.
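If you want to reproduce the allocation without the SDK in the picture, a rough sketch of the equivalent raw counter read (Windows only; needs the System.Diagnostics.PerformanceCounter package on .NET Core; the specific counter name is just an example, not the SDK's exact set):

// ProcessCounterRepro.cs - rough sketch, not the SDK's actual code.
// Reading any "\Process(...)" counter goes through PerformanceCounterLib.GetData(),
// which copies the whole category blob out of the registry into a byte[]
// big enough to land on the Large Object Heap.
using System;
using System.Diagnostics;
using System.Threading;

class ProcessCounterRepro
{
    static void Main()
    {
        string instance = Process.GetCurrentProcess().ProcessName;
        using var counter = new PerformanceCounter("Process", "Private Bytes", instance, readOnly: true);

        for (int i = 0; i < 5; i++)
        {
            // Each sample re-reads the category data, so a periodic collector
            // produces repeated LOH allocations and induced gen 2 GCs.
            Console.WriteLine($"Private Bytes: {counter.NextValue():N0}");
            Thread.Sleep(1000);
        }
    }
}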

@ScottHolden

Spinning up a new ASP.NET Core WebAPI and running it under Kestrel gives a full repro:

[screenshot]

If we remove just the "\Process" related counters, we don't see this issue:

[screenshot]

@xt0rted

xt0rted commented Oct 25, 2019

With the following changes (based on the discussion in https://github.com/microsoft/ApplicationInsights-aspnetcore/issues/912) I'm not seeing this go over 50 MB, whereas before these changes it was hitting 500 MB.

[screenshot]

app.csproj

 <PropertyGroup>
   <TargetFramework>netcoreapp2.2</TargetFramework>
+  <ServerGarbageCollection>false</ServerGarbageCollection>
 </PropertyGroup>

runtimeconfig.template.json

{
  "configProperties": {
    "System.GC.RetainVM": false
  }
}

Changing each of the System.GC.Server and System.GC.RetainVM settings yields varying results, all of which are improvements, but setting both to false gives the best result.
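If you'd rather keep both switches in one file, the csproj property maps to the System.GC.Server config knob, so the same pair can be expressed in runtimeconfig.template.json alone (a sketch of the combined settings):

runtimeconfig.template.json

{
  "configProperties": {
    "System.GC.Server": false,
    "System.GC.RetainVM": false
  }
}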

@ScottHolden

I've also noted that it is not one particular counter, but in fact anything to do with \Process():
[screenshots]

@TimothyMothra
Member

@cijothomas can you take a look and comment?

@cijothomas cijothomas transferred this issue from another repository Oct 28, 2019
@cijothomas
Contributor

Thanks for reporting the issue with repros. Yes, this is easy to repro, and it reproduces in all SDK versions and in both .NET Core and .NET Framework apps. This is not a bug inside the SDK itself, but comes from the perf counter library. Trying to find owners to get this tracked/fixed appropriately.

@ctolkien
Author

ctolkien commented Nov 4, 2019

Hi @cijothomas - I recognise this isn't an issue in AppInsights directly, but the net result is that it's slowing our devs down, so we're yanking AI out.

@jeffadavidson

@cijothomas do you have any update on this?

@Dmitry-Matveev
Member

@cijothomas, assigned to you to check on the status of your conversation with the code owners.

@evilpilaf

This looks exactly like the issue I'm facing and described here: #1678 (comment). Is there any ETA on it?

@cijothomas
Contributor

@evilpilaf This issue is arising from PerformanceCounter.
The other issue and the screenshot you shared appear to be DiagnosticSource-related. They are separate problems.

@SidShetye

Seeing the same on .NET Core 3.1 and Microsoft.ApplicationInsights.AspNetCore v2.13.1.

[screenshot]

If you don't need the AppInsights-collected metrics, you can disable that module in your Startup.cs's ConfigureServices(IServiceCollection services):

services.AddApplicationInsightsTelemetry(opt => {
	opt.EnablePerformanceCounterCollectionModule = false;
});

Obviously this is a workaround until the real issue is resolved, but it works for us since we're gathering similar metrics at our hosting layer.

@cijothomas
Contributor

This is fixed by #1694, from 2.14.0-beta1 onwards.
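For anyone picking this up: that version refers to the Microsoft.ApplicationInsights.AspNetCore NuGet package mentioned earlier in this thread, not the runtime, so a sketch of the corresponding package bump (any version at or above 2.14.0-beta1 should contain the fix):

app.csproj

  <ItemGroup>
    <PackageReference Include="Microsoft.ApplicationInsights.AspNetCore" Version="2.14.0-beta1" />
  </ItemGroup>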

@ojpbay

ojpbay commented Mar 15, 2021

@cijothomas - can you please confirm the exact version of the framework this fix went into? We have current production apps that suffer from similar issues to those mentioned by @evilpilaf, and they are on runtime 3.1.112 (build agents are on SDK v3.1.404).

[screenshot]
