
Firefox memory leak investigation Kibana 7.6 #59454

Closed · marius-dr opened this issue Mar 5, 2020 · 5 comments
Assignees: marius-dr
Labels: Meta, performance, Team:QA

marius-dr (Member) commented Mar 5, 2020

This is a large meta issue for investigating Kibana's browser memory usage.
It was started after the Kibana CI tests began running into "Out of memory" errors during the Firefox runs.
Initial runs show a big disparity between Firefox and Chrome, but more important is the slope of the VSZ graph: it keeps climbing for Firefox, while it stays flat for Chrome.
(screenshot: Firefox vs. Chrome memory usage during the test run)

An older Firefox version shows lower usage all around, but there is still an increase, so this points to a memory leak of some kind.

(screenshot: memory usage on the older Firefox)

The difference between the two Firefox versions is at least partly explained by this issue: https://bugzilla.mozilla.org/show_bug.cgi?id=1608501
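As a side note, the VSZ/RSS numbers can also be spot-checked locally with a plain `ps` loop. This is a minimal sketch assuming a Linux host with procps `ps` (not the tooling used in this investigation):

```sh
# Sample VSZ and RSS (both in KiB) for all Firefox processes every 10s.
# Assumes the process is named "firefox"; it may differ (e.g. firefox-bin).
while true; do
  ps -C firefox -o pid,vsz,rss,comm
  sleep 10
done
```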

What we've looked into so far:

  • the Memory tab in the Firefox dev tools: the snapshot it takes appears to cover only RSS memory, so it won't help much here.
  • the about:memory tool: pretty complicated to figure out, but it might be worth using more unless we find a better tool.

What we plan to do next (unless we get better ideas):

  • run the Kibana functional tests for each app, 10 times each, and see which app shows the largest increase between start and end.
  • run about:memory at the start of the tests, run it again at the end, and then compare the two saved reports manually (see the sketch below).
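For the comparison step, about:memory's "Measure and save…" button writes a gzipped JSON report, and its "Load and diff…" button can compare two saved reports directly in the browser. A minimal sketch of the manual route (the report file names here are hypothetical):

```sh
# Save one report at the start of the run and one at the end via
# about:memory -> "Measure and save...", then peek at the raw data;
# "Load and diff..." in about:memory does the comparison in-browser.
gunzip -c memory-report-end.json.gz | python -m json.tool | less
```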

How we've tested it so far:

  • set up Metricbeat on a machine to gather the usage data for Firefox during the run (see the sketch after this list)
  • run the Firefox tests:
    node scripts/functional_tests_server.js
    node scripts/functional_test_runner.js --config test/functional/config.firefox.js
  • look at the data from Metricbeat in another Kibana instance (I used a Cloud deployment so it wouldn't influence the run).
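For reproducibility, a Metricbeat setup along these lines should capture the per-process numbers; this is a minimal sketch using the system module's process metricset, and the exact settings, paths, and period are assumptions rather than the config used in this run:

```sh
# Append a system/process module config that tracks Firefox processes
# (illustrative only, not the config used in this issue).
cat >> metricbeat.yml <<'EOF'
metricbeat.modules:
  - module: system
    metricsets: [process]
    processes: ['.*firefox.*']
    period: 10s
EOF
./metricbeat -e -c metricbeat.yml   # -e logs to stderr for quick inspection
```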
@marius-dr added the Team:Operations, Team:QA, and Meta labels on Mar 5, 2020
@marius-dr self-assigned this on Mar 5, 2020
elasticmachine (Contributor) commented:

Pinging @elastic/kibana-qa (Team:QA)

elasticmachine (Contributor) commented:

Pinging @elastic/kibana-operations (Team:Operations)

LeeDr commented Mar 5, 2020

Here's the commit where the tests were disabled: fe38642
I don't see any other issues showing logs from the Jenkins failures. We might have to open a PR that turns the Firefox tests back on in order to capture details.

marius-dr (Member, Author) commented:

Over the weekend I ran each suite from the functional tests 15 times in order to narrow down which area causes the most trouble on Firefox:

1. Dev console tests. Everything looked decent, within the expected parameters.

(screenshot: console_app_tests)

2. Getting started tests. Good.

(screenshot: getting_started_tests)

3. Log Context tests. Huge spike there.

(screenshot: context_tests)

4. Dashboard tests. Some spikes, but acceptable.

(screenshot: dashboard_tests)

5. Discover tests. Going up consistently; not very good.

(screenshot: discover_tests)

6. Home tests. This was the worst of all: memory usage kept going up until the run crashed with a `Detected an unhandled Promise rejection. out of memory` error.

(screenshot: home_tests_crashed)

7. Management app tests. All good.

(screenshot: management_tests)

8. Status page tests. Too small to be relevant.

(screenshot: status_page_tests)

9. Timelion tests. Decent. (This should be changed to Timelion viz tests, as the app is going away soon.)

(screenshot: timelion_tests)

10. Visualizations tests. Some increases, but it's an extremely long suite, so I think it fared about average.

(screenshot: viz_tests)

lizozom (Contributor) commented Apr 18, 2022

Closing this issue due to inactivity.
Feel free to reopen if needed 🙏🏻

@lizozom closed this as completed Apr 18, 2022