
High memory usage #310

Open · ehames opened this issue Aug 23, 2018 · 13 comments

@ehames commented Aug 23, 2018

I'm using Atom and its go-langserver integration. Memory usage is close to 18GB as I type this. I cannot capture the memory/CPU profiles because the port is not open.

$ go tool pprof -svg $GOPATH/bin/go-langserver http://localhost:6060/debug/pprof/profile > cpu.svg
Fetching profile over HTTP from http://localhost:6060/debug/pprof/profile
http://localhost:6060/debug/pprof/profile: Get http://localhost:6060/debug/pprof/profile: dial tcp [::1]:6060: connect: connection refused
failed to fetch any source profiles
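
For context, the /debug/pprof endpoint only exists if the server starts a debug HTTP listener; in Go that is conventionally done with the net/http/pprof package. A minimal sketch of the general pattern (not go-langserver's actual wiring; see the -pprof flag mentioned later in this thread):

    package main

    import (
        "log"
        "net/http"
        _ "net/http/pprof" // registers /debug/pprof/* handlers on the default mux
    )

    func main() {
        // Without a listener like this, pprof fetches fail with the
        // "connection refused" error shown above.
        go func() {
            log.Println(http.ListenAndServe("localhost:6060", nil))
        }()
        select {} // stand-in for the server's real work
    }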

[two screenshots attached: screen shot 2018-08-23 at 11 12 00 am, screen shot 2018-08-23 at 11 09 02 am]

BTW, this was triggered when I renamed a function, which produced many compilation errors. After 4 or 5 minutes, memory went down to 1GB, but my computer was quite slow in the meantime.

@joshua commented Aug 23, 2018

I'm experiencing the same issue, along with very high CPU usage. It's a lot more noticeable with diagnostics enabled.

@keegancsmith (Member)

Aah, yes, this is very likely related to having diagnostics enabled. Enabling diagnostics turns on the typechecker, which can be a huge memory hog. Unfortunately there is no quick fix here other than disabling diagnostics.

@lloiser (Contributor) commented Aug 27, 2018

Note: as @keegancsmith said, this happens whenever the typechecker is started, which happens in quite a few code paths: diagnostics, references, implementation... see https://sourcegraph.com/github.com/sourcegraph/go-langserver@7df19dc017efdd578d75c81016e0b512f3914cc1/-/blob/langserver/loader.go#L27:23&tab=references
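
For readers following along, the golang.org/x/tools/go/loader API linked above type-checks the whole transitive import graph in memory, which is where the footprint comes from. A minimal illustrative sketch (hypothetical target package, not go-langserver's exact call site):

    package main

    import (
        "fmt"
        "log"

        "golang.org/x/tools/go/loader"
    )

    func main() {
        var conf loader.Config
        conf.Import("net/http") // any non-trivial package pulls in a large import graph

        prog, err := conf.Load()
        if err != nil {
            log.Fatal(err)
        }
        // prog.AllPackages holds fully parsed and type-checked ASTs for the
        // entire transitive import graph, all retained in memory at once.
        fmt.Println(len(prog.AllPackages), "packages type-checked and held in memory")
    }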

@keegancsmith (Member)

Yeah, this is a pretty bad problem for us, since the typechecking is so useful :) I think the future is bright (once we have time to implement it): Go's caching machinery now records a lot more useful information for us, which means we can probably rely on the on-disk caching Go now provides.
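
For context, this presumably refers to Go's on-disk build cache and export data. The newer golang.org/x/tools/go/packages API can load type information from that cache instead of re-type-checking every dependency from source; a rough sketch (illustrative, not what go-langserver did at the time):

    package main

    import (
        "fmt"
        "log"

        "golang.org/x/tools/go/packages"
    )

    func main() {
        // LoadTypes reads type information from export data produced by the
        // build cache where available, avoiding a full in-memory re-typecheck
        // of every dependency.
        cfg := &packages.Config{Mode: packages.LoadTypes}
        pkgs, err := packages.Load(cfg, "net/http")
        if err != nil {
            log.Fatal(err)
        }
        for _, pkg := range pkgs {
            fmt.Println(pkg.ID, "types loaded:", pkg.Types != nil)
        }
    }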

@ehames (Author) commented Aug 28, 2018

This seems to be closely related to #209. Both issues are due to the typechecker.

@harikb commented Nov 1, 2018

[image attached]
I have to periodically kill the language server and live with missing features in VS Code. Is there anything I can help with? What logs/traces can I extract the next time this happens?

@EmpireJones

Having the same issue here. It sometimes takes up all available memory (e.g. 30GB), causing the OS to freeze. Just a guess, but this feels more like a bug than simple inefficiency. Any details I can provide?

@slimsag (Member) commented Nov 15, 2018

You can set "go.languageServerFlags": ["-pprof", ":6060"] in your VS Code settings and then follow the steps in the README to capture a heap profile and upload the SVG. That would tell us where the memory is allocated. I agree this looks more like a regression, but we can't know without more details, I think.
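
Presumably the heap capture mirrors the CPU profile command quoted at the top of this thread, with the heap endpoint substituted (the exact steps are in the README):

$ go tool pprof -svg $GOPATH/bin/go-langserver http://localhost:6060/debug/pprof/heap > heap.svg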

If the memory usage is coming from typechecking and not from a regression such as a memory leak, then we likely cannot do anything yet. The long-term fix will be the official Go language server, which the Go developers are actively working on (it is a difficult problem to solve).

@doxxx commented Nov 16, 2018

I've been using the language server since yesterday and it was relatively well behaved, using only a few hundred MB at most. This morning, I started making some edits and the language server started consuming 80-100% CPU, and memory spiked to 5GB. I managed to capture a heap snapshot: heap.zip

I also managed to catch the tail end of the CPU spike: cpu.zip. It looks like it might just be the garbage collector, though. If it happens again I'll try to collect a CPU profile first.

I should also note that this only lasted a minute or two before CPU and memory usage dropped back down.

@doxxx commented Nov 16, 2018

This is probably a better CPU profile than the previous one: cpu.zip

@doxxx commented Nov 16, 2018

This time the heap grew to 10GB: heap.zip

@slimsag (Member) commented Nov 17, 2018

@doxxx

  • Your first heap profile shows the langserver using 1.3GB, and Go having allocated 2.4GB (1.1GB unused but not yet released back to the OS).
  • Your second heap profile shows the langserver using 4GB, and Go having allocated 8GB (again, 4GB unused but not yet released back to the OS).

Both traces show the memory was allocated in the golang.org/x/tools/go/loader package, which is the entry point for type checking. This is unfortunate and a known issue, but currently expected. It will improve once the official Go language server is released.

However, if you notice the memory usage does not drop after a minute or two, that would indicate a leak, which is a bug we could fix.
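
As an aside, the in-use versus allocated-but-not-yet-released distinction above maps directly onto Go's runtime memory statistics; a small sketch of how to observe it (generic Go, not go-langserver code):

    package main

    import (
        "fmt"
        "runtime"
        "runtime/debug"
    )

    func main() {
        var m runtime.MemStats
        runtime.ReadMemStats(&m)
        // HeapInuse is memory actively used by live heap spans; HeapIdle is
        // memory obtained from the OS but not currently in use, which Go
        // returns to the OS only lazily.
        fmt.Printf("in use: %d MB, idle (not yet released): %d MB\n",
            m.HeapInuse>>20, m.HeapIdle>>20)

        // FreeOSMemory forces a GC and returns as much idle memory to the
        // OS as possible, immediately rather than lazily.
        debug.FreeOSMemory()
    }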

@doxxx commented Nov 26, 2018

I just had another occurrence where the go-langserver process is at ~20GB, with ~100-300MB/s disk I/O and ~100% CPU for about 10 minutes so far.

Here are the heap and CPU graphs:
heap_cpu_2.zip

It still seems to be the loader package, although the second dump appears to involve the build package as well.

Is there nothing that can be done about this in the interim?
