Add conditional on TraceWebFilterAutoConfiguration #8322

Closed
jkubrynski opened this issue Feb 17, 2017 · 16 comments
Comments

@jkubrynski
Contributor

jkubrynski commented Feb 17, 2017

The WebRequestTraceFilter introduces a big GC/memory footprint, mostly due to Map operations. It's enabled by default as there is no condition on TraceWebFilterAutoConfiguration. It'd be great to be able to disable it in a simple way as it's rarely used.
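
For context, the bluntest way to switch the filter off is to exclude the auto-configuration class entirely. A sketch, assuming the Spring Boot 1.x actuator package layout (the application class name here is hypothetical):

import org.springframework.boot.SpringApplication;
import org.springframework.boot.actuate.autoconfigure.TraceWebFilterAutoConfiguration;
import org.springframework.boot.autoconfigure.SpringBootApplication;

// Sketch only: excluding the auto-configuration removes the trace filter bean along with it.
@SpringBootApplication(exclude = TraceWebFilterAutoConfiguration.class)
public class SampleApplication {

	public static void main(String[] args) {
		SpringApplication.run(SampleApplication.class, args);
	}

}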

@spring-projects-issues spring-projects-issues added the status: waiting-for-triage An issue we've not yet triaged label Feb 17, 2017
@snicoll
Member

snicoll commented Feb 17, 2017

It looks like it duplicates #2921 somehow

@wilkinsona
Member

It's a little clunky, but you can disable the filter by creating a FilterRegistrationBean for it and marking it as disabled. Something like this:

@Bean
public FilterRegistrationBean webRequestTraceFilterRegistration(
		WebRequestTraceFilter filter) {
	// Register the auto-configured filter but mark the registration as
	// disabled so the servlet container never applies it.
	FilterRegistrationBean registration = new FilterRegistrationBean(filter);
	registration.setEnabled(false);
	return registration;
}

@jkubrynski
Contributor Author

Right @wilkinsona, but as there is already a property that enables the /trace endpoint, I think it should be reused for the filter as well.
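
As an illustration of that suggestion, a conditional tied to the existing endpoints.trace.enabled property could look roughly like this (a sketch, not the code that ended up being merged):

import org.springframework.boot.autoconfigure.condition.ConditionalOnProperty;
import org.springframework.context.annotation.Configuration;

// Sketch: guard the auto-configuration with the property that already enables
// the /trace endpoint, keeping the filter on by default when the property is absent.
@Configuration
@ConditionalOnProperty(prefix = "endpoints.trace", name = "enabled", matchIfMissing = true)
public class TraceWebFilterAutoConfiguration {

	// existing bean definitions unchanged

}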

@wilkinsona
Member

I'm not sure I agree with that. You may want to use the repository, which is populated by the filter, without having the endpoint enabled.

@jkubrynski
Contributor Author

It looks like an extremely rare use case. But even then you can just register the filter manually. Enabling it always when it has a big overhead is a bad idea.
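
For completeness, manual registration could look something like the snippet below; the two-argument constructor taking a TraceRepository and TraceProperties is an assumption about the Boot 1.x filter, so verify it against your version:

// Assumes WebRequestTraceFilter(TraceRepository, TraceProperties) exists in
// your Boot version; adjust if the constructor differs.
@Bean
public WebRequestTraceFilter webRequestTraceFilter(TraceRepository repository,
		TraceProperties properties) {
	return new WebRequestTraceFilter(repository, properties);
}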

@wilkinsona
Member

wilkinsona commented Feb 18, 2017

It looks like an extremely rare use case

We don't have any usage data at that level so I have no idea.

Enabling it always when it has a big overhead is a bad idea

Given that you are the first person to raise this, I doubt that it has a big overhead for most users of the Actuator.

as it's rarely used.

How do you know that?

@wilkinsona
Member

Actually, I think we need to wind this one right back to the start.

a big GC/memory footprint, mostly due to Map operations

Can you please quantify "big"? Without some data, it's impossible to know if there's a need to do anything at all here. With some data, we might be able to optimise WebRequestTraceFilter for everyone rather than just providing a switch to turn it off.

@wilkinsona wilkinsona added the status: waiting-for-feedback We need additional information before we can continue label Feb 18, 2017
@marcingrzejszczak
Contributor

@wilkinsona check this issue - spring-cloud/spring-cloud-sleuth#522

@spring-projects-issues spring-projects-issues added status: feedback-provided Feedback has been provided and removed status: waiting-for-feedback We need additional information before we can continue labels Feb 18, 2017
@wilkinsona
Member

@marcingrzejszczak I already have. I only see two lines of data there that directly relate to the filter in question here, and I'm not sure how to interpret them.

@wilkinsona wilkinsona added status: waiting-for-feedback We need additional information before we can continue and removed status: feedback-provided Feedback has been provided labels Feb 18, 2017
@marcingrzejszczak
Contributor

AFAIR from the analysis, it turned out that the trace actuator filter is caching a lot of objects related to request/response header retrieval. Due to this, a lot of objects were created and the result was a much longer garbage collection time.
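
A simplified sketch of the kind of per-request map building being described; this is an illustration of the allocation pattern, not the actual Boot code:

// Illustration only: every traced request allocates a fresh nested map
// structure, one entry per header, which is what drives the GC pressure.
private Map<String, Object> buildTrace(HttpServletRequest request) {
	Map<String, Object> trace = new LinkedHashMap<String, Object>();
	trace.put("method", request.getMethod());
	trace.put("path", request.getRequestURI());
	Map<String, Object> requestHeaders = new LinkedHashMap<String, Object>();
	Enumeration<String> names = request.getHeaderNames();
	while (names.hasMoreElements()) {
		String name = names.nextElement();
		requestHeaders.put(name, request.getHeader(name));
	}
	trace.put("headers", Collections.singletonMap("request", requestHeaders));
	return trace;
}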

@spring-projects-issues spring-projects-issues added status: feedback-provided Feedback has been provided and removed status: waiting-for-feedback We need additional information before we can continue labels Feb 18, 2017
@wilkinsona
Member

Sorry, that still doesn't really quantify things. What, exactly, were the requests involved and what does "much longer" mean? I can see some object creation that could be avoided but I still haven't seen anything that tells us it's worth worrying about.

@wilkinsona wilkinsona added status: waiting-for-feedback We need additional information before we can continue and removed status: feedback-provided Feedback has been provided labels Feb 19, 2017
@marcingrzejszczak
Contributor

marcingrzejszczak commented Feb 19, 2017

As for what the app does, it's a really simple application: https://github.com/marcingrzejszczak/sleuth-memory-benchmarks/blob/master/sleuth-application/src/main/java/com/example/SleuthApplication.java. In the tests we were sending a request to the test endpoint.

As far as the stats are concerned, they're written in the other issue; let me copy them here.

Without the Actuator dependency we don't have the Web Trace Filter, and as you can see the increase in GC time is stable (17-20% regardless of the number of requests).

WITHOUT actuator

These are the results for the application WITHOUT the actuator dependency:

| Requests | 99th pct memory, non-Sleuth | 99th pct memory, Sleuth | Memory increase | GC time, non-Sleuth (s) | GC time, Sleuth (s) | GC time increase |
| --- | --- | --- | --- | --- | --- | --- |
| 100 | 21203.5 | 36477.7 | +15274.2 (72.00%) | 6.782 | 7.966 | +1.184 (17.400%) |
| 500 | 21980.5 | 39391.1 | +17410.6 (79.00%) | 34.114 | 40.840 | +6.726 (19.700%) |
| 2000 | 22013.0 | 39042.9 | +17029.9 (77.00%) | 147.288 | 173.700 | +26.412 (17.900%) |

WITH actuator

These are the results for the application WITH the actuator dependency. With the Web Trace Filter in action, the GC time grows with the number of requests:

| Requests | 99th pct memory, non-Sleuth | 99th pct memory, Sleuth | Memory increase | GC time, non-Sleuth (s) | GC time, Sleuth (s) | GC time increase |
| --- | --- | --- | --- | --- | --- | --- |
| 100 | 20869.7 | 46126.6 | +25256.9 (121.00%) | 6.874 | 10.188 | +3.314 (48.200%) |
| 500 | 21935.7 | 51122.7 | +29187.0 (133.00%) | 34.286 | 49.814 | +15.528 (45.200%) |
| 2000 | 21680.7 | 63588.3 | +41907.6 (193.00%) | 115.881 | 201.294 | +85.413 (73.700%) |

@spring-projects-issues spring-projects-issues added status: feedback-provided Feedback has been provided and removed status: waiting-for-feedback We need additional information before we can continue labels Feb 19, 2017
@wilkinsona
Member

wilkinsona commented Feb 19, 2017

Thanks, but as I said above, I've already read that data and none of it is specifically about the web request trace filter. Can you share the data that led you to make the link between the increase in GC time and the web request trace filter? Right now, for all I know, there could be other parts of Actuator that are to blame, and this issue is very specifically about one particular filter.

It's also interesting that if I compare the with- and without-Actuator numbers without Sleuth, the GC time is pretty much constant (it even drops substantially in the 2000 requests case). Were the two sets of numbers gathered in such a way that it's fair to compare them? If so, it would suggest that a combination of something in Actuator and something in Sleuth causes an increase in GC time. I think that warrants further investigation and is at odds with blanket statements like "enabling it always when it has a big overhead is a bad idea."
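
For reference, the non-Sleuth GC times from the two runs above, side by side:

| Requests | GC time without Actuator (s) | GC time with Actuator (s) |
| --- | --- | --- |
| 100 | 6.782 | 6.874 |
| 500 | 34.114 | 34.286 |
| 2000 | 147.288 | 115.881 |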

@wilkinsona wilkinsona added status: waiting-for-feedback We need additional information before we can continue and removed status: feedback-provided Feedback has been provided labels Feb 19, 2017
@marcingrzejszczak
Contributor

I see. Sure, I'll run some more tests and provide more precise data.

I executed the tests on the same machine, which was doing nothing else at the time, so the conditions were comparable.

@spring-projects-issues spring-projects-issues added the status: feedback-provided Feedback has been provided label Feb 19, 2017
@spring-projects-issues spring-projects-issues removed the status: waiting-for-feedback We need additional information before we can continue label Feb 19, 2017
@wilkinsona wilkinsona added status: waiting-for-feedback We need additional information before we can continue and removed status: feedback-provided Feedback has been provided labels Feb 20, 2017
@spring-projects-issues
Collaborator

If you would like us to look at this issue, please provide the requested information. If the information is not provided within the next 7 days this issue will be closed.

@spring-projects-issues spring-projects-issues added the status: feedback-reminder We've sent a reminder that we need additional information before we can continue label Feb 27, 2017
@spring-projects-issues
Collaborator

Closing due to lack of requested feedback. If you would like us to look at this issue, please provide the requested information and we will re-open the issue.

@spring-projects-issues spring-projects-issues removed status: waiting-for-feedback We need additional information before we can continue status: feedback-reminder We've sent a reminder that we need additional information before we can continue labels Mar 6, 2017
@snicoll snicoll removed the status: waiting-for-triage An issue we've not yet triaged label Mar 13, 2017
ColinHarrington added a commit to ColinHarrington/spring-boot that referenced this issue Mar 16, 2017
…stTraceFilter) following the pattern in MetricFilterAutoConfiguration spring-projects#8322
ColinHarrington added a commit to ColinHarrington/spring-boot that referenced this issue Mar 16, 2017
…stTraceFilter) following the pattern in MetricFilterAutoConfiguration spring-projects#8322
ColinHarrington added a commit to ColinHarrington/spring-boot that referenced this issue Mar 20, 2017
…stTraceFilter) following the pattern in MetricFilterAutoConfiguration spring-projects#8322