Sentry upgrade from 7.118.0 to 8.26.0 leaks memory #18
Comments
Hey, thanks for writing in! We'll look into your issue next week, as this week is Hackweek at Sentry (see getsentry#13421).
Hi, would you be able to provide a memory snapshot with the Node/v8 profiler so that we can look at what is holding the references causing the leak? Feel free to also shoot us an email or Twitter DM if you don't want to share it publicly. Thanks!
I can't give you a snapshot because it contains a lot of private information.
I made a snapshot; I'll examine it and show you.
Set, Span and NonRecordingSpan are from Sentry.
Yeah, that looks like Sentry. Would you mind digging around a bit and examining what holds the references to the spans?
This indicates that Sentry objects are retained in memory. It doesn't mean they are causing this retention, though!
I don't think so. Before switching to Sentry v8, someone warned me about the memory leak.
I have a few questions to narrow this down further. I am not ruling out that our SDK is causing the leak:
Would you mind answering the questions I asked? It's important we have answers to them so we can rule out certain things.
I've already double-checked my code.
I am talking about Sentry spans, which you would start with `Sentry.startSpan()`. Do you happen to have very long-running requests in your process? Like server-sent events, or streaming?
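For context, a manually started span in the v8 SDK looks roughly like this (a sketch; the span name and `op` value below are made-up examples, not from this thread):

```javascript
const Sentry = require('@sentry/node');

// The callback form of startSpan ends the span automatically
// when the callback returns, so it cannot be left unfinished.
Sentry.startSpan({ name: 'process-message', op: 'queue.task' }, () => {
  // ... work measured by the span ...
});
```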
On my side I never call `Sentry.startSpan()`.
Do you have any very long-running requests / request handlers in your code?
I wouldn't say long requests, but I have a lot of requests per second, over 110 req/s at times, and I receive a lot of messages via WebSocket.
I believe you that this is correlated with the update. Can you share a bit more about your program architecture? Is Discord opening a WebSocket connection to your server, or is your server making a WebSocket connection to Discord? Also, are you initializing Sentry more than once per process?
From what I can tell from your profiler screenshots, something is creating a span (a root span) that seemingly never finishes, spans keep getting attached to it, and that ends up leaking memory. It would be nice to figure out what is creating this root span.
I'll try to remove the profiler rate line, and we'll see if it comes from that.
If you could somehow also try logging `Sentry.getRootSpan()`, that would help.
`Sentry.getRootSpan()` requires an argument.
Sorry, right. Can you try passing it the active span?
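A sketch of what that logging could look like, assuming there is an active span at the point where this runs:

```javascript
const Sentry = require('@sentry/node');

// getRootSpan() takes a span, so fetch the currently active one first.
// getActiveSpan() returns undefined outside of any span context.
const activeSpan = Sentry.getActiveSpan();
if (activeSpan) {
  console.log(Sentry.getRootSpan(activeSpan));
}
```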
On the active process I used eval():

```js
NonRecordingSpan {
  _spanContext: {
    traceId: 'ba6488d048422cfba347c2a2b9b1eca5',
    spanId: '5df94fad8fd85299',
    traceFlags: 0,
    traceState: [TraceState]
  }
}
```
Thanks! We are struggling to understand what is creating this non-recording span. Usually that shouldn't happen unless you set a traces sample rate of <1 or you manually continue a trace. Can you try enabling debug logging?
Can I enable logs without restarting my process?
I don't think so :/
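Debug logging is an init-time option, which is why a restart is needed. A config sketch (the DSN placeholder is not from this thread):

```javascript
const Sentry = require('@sentry/node');

Sentry.init({
  dsn: '...',   // your project's DSN
  debug: true,  // print SDK-internal debug logs to the console
});
```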
Another thing that we just noticed: have you properly followed the migration guide for how and when to call `Sentry.init()`?
In my case I don't require Sentry first.
Can you try doing so?
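The pattern the migration guide describes is to initialize Sentry before anything else is loaded, typically via a dedicated instrument file. A sketch (file names and the DSN placeholder are illustrative):

```javascript
// instrument.js — initialize Sentry before any other module loads,
// so the SDK can patch modules such as `http` during their import.
const Sentry = require('@sentry/node');
Sentry.init({ dsn: '...' });

// app.js — the very first require must be the instrument file:
//   require('./instrument');
//   const express = require('express');
//   ... rest of the application ...
```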
Yes, there is a set of integrations that are enabled by default, but don't worry: as long as you do not use (i.e. import/require) those packages, these integrations are no-ops.
I can't make out from the screenshot what you did, to be honest. If you are still importing other things inside …
That setup looks good 👌 Can you share more of the debug logs? It would be good to see logs up until and including when you do things like send requests and database queries and similar. Thanks!
Would you mind sharing the start of your application up to a certain point in text format? Thanks!
Do you have Discord?
Yes! Feel free to join https://discord.com/invite/sentry and ping ``
After some back and forth, we have discovered that the memory leak in this issue happens because the `httpIntegration` keeps creating spans for outgoing requests, which are retained and never released. The workaround for now is to do the following:

```js
Sentry.init({
  integrations: [
    Sentry.httpIntegration({
      ignoreOutgoingRequests(url, request) {
        return true;
      },
    }),
  ],
});
```

Thanks for the collaboration!! Action items (varying degrees of possible):
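Returning `true` unconditionally drops spans for all outgoing requests. If the leaking traffic is known, the same hook can target it selectively instead (the host below is a made-up example, not confirmed in this thread):

```javascript
const Sentry = require('@sentry/node');

Sentry.init({
  integrations: [
    Sentry.httpIntegration({
      // Skip span creation only for the suspect host; spans for all
      // other outgoing requests are still recorded.
      ignoreOutgoingRequests(url) {
        return url.startsWith('https://discord.com');
      },
    }),
  ],
});
```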
No problem 👍
Is there an existing issue for this?
How do you use Sentry?
Self-hosted/on-premise
Which SDK are you using?
@sentry/node
SDK Version
8.26.0
Framework Version
No response
Link to Sentry event
No response
Reproduction Example/SDK Setup
my sentry init in 7.118.0
my sentry init in 8.26.0
Steps to Reproduce
I'll try removing some of the integrations to see what's causing the problem.
Expected Result
Normal memory usage
Actual Result
Abnormal memory usage