
Investigate apparent memory leak #452

Closed
albrow opened this issue Oct 16, 2019 · 1 comment

albrow commented Oct 16, 2019

I've been closely monitoring memory usage over the past week or so, and it appears that Mesh has a memory leak.

[Screenshot: Mesh memory usage chart, captured Oct 16, 2019 at 11:44 AM]

Looking at the logs, we can see that the sharp drops in memory usage coincide with Mesh restarting. Most likely, the OS is killing the Mesh process, and it then restarts automatically due to `restart: always` in the docker-compose.yml file.
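
For context, that restart behavior comes from the Compose restart policy. Below is a minimal sketch of what the relevant part of docker-compose.yml might look like; the service name and image tag are illustrative assumptions, and only the `restart: always` line is taken from this issue.

```yaml
# Minimal sketch of the relevant docker-compose.yml section.
# The service name and image tag are illustrative assumptions;
# only the `restart: always` policy is referenced in this issue.
version: "3"
services:
  mesh:
    image: 0xorg/mesh:latest
    # Docker restarts the container whenever the process exits,
    # so an OOM kill shows up as a sharp drop back to baseline memory.
    restart: always
```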

albrow added the performance label (Related to improving or measuring performance) on Oct 16, 2019
albrow self-assigned this on Oct 16, 2019
albrow mentioned this issue on Oct 16, 2019

albrow commented Nov 28, 2019

After 9 days of monitoring a Mesh node running uninterrupted, we can conclude that #539 did in fact fix the issue.

Here's a chart of memory usage over the past 7 days:

[Screenshot: Mesh memory usage chart over the past 7 days, captured Nov 27, 2019 at 4:24 PM]

And a recent snapshot obtained via go tool pprof:

File: mesh
Type: inuse_space
Time: Nov 27, 2019 at 2:23am (PST)
Showing nodes accounting for 74.21MB, 100% of 74.21MB total
      flat  flat%   sum%        cum   cum%
   13.41MB 18.07% 18.07%    13.41MB 18.07%  github.com/0xProject/0x-mesh/vendor/github.com/syndtr/goleveldb/leveldb/util.(*BufferPool).Get
   12.81MB 17.26% 35.33%    13.87MB 18.70%  github.com/0xProject/0x-mesh/vendor/github.com/gogo/protobuf/io.(*varintWriter).WriteMsg
   12.67MB 17.07% 52.40%    12.67MB 17.07%  github.com/0xProject/0x-mesh/vendor/github.com/gogo/protobuf/io.(*varintReader).ReadMsg
      12MB 16.18% 68.58%       12MB 16.18%  github.com/0xProject/0x-mesh/vendor/github.com/syndtr/goleveldb/leveldb/memdb.New
   10.08MB 13.58% 82.16%    11.08MB 14.93%  github.com/0xProject/0x-mesh/vendor/github.com/whyrusleeping/timecache.(*TimeCache).Add
    1.58MB  2.13% 84.29%     1.58MB  2.13%  net/http.(*http2ClientConn).frameScratchBuffer
    1.10MB  1.48% 85.77%     1.10MB  1.48%  github.com/0xProject/0x-mesh/vendor/github.com/syndtr/goleveldb/leveldb/memdb.(*DB).Put
    1.06MB  1.43% 87.20%     1.06MB  1.43%  github.com/0xProject/0x-mesh/vendor/github.com/libp2p/go-buffer-pool.(*BufferPool).Get
       1MB  1.35% 88.55%        1MB  1.35%  reflect.unsafe_NewArray
       1MB  1.35% 89.89%        1MB  1.35%  runtime.malg

The top handful of items never grow beyond 12–16 MB, which is expected given the configured sizes of the various buffers.
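
For reference, a snapshot like the one above can be captured from a running node by exposing Go's standard pprof HTTP endpoints. The sketch below shows the generic net/http/pprof pattern; it is an assumption about the setup rather than Mesh's actual instrumentation, and the localhost:6060 address is an arbitrary choice.

```go
// Generic sketch: expose the standard pprof endpoints so heap profiles
// can be pulled from a running process. Not necessarily how Mesh itself
// wires this up; localhost:6060 is an arbitrary example address.
package main

import (
	"log"
	"net/http"
	_ "net/http/pprof" // registers /debug/pprof/* on http.DefaultServeMux
)

func main() {
	go func() {
		// Serve the pprof endpoints on a side port.
		log.Println(http.ListenAndServe("localhost:6060", nil))
	}()

	// ... the rest of the application runs here ...
	select {}
}
```

The in-use heap profile can then be inspected with `go tool pprof -inuse_space http://localhost:6060/debug/pprof/heap` and the `top` command, which produces output in the same format as the table above.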

albrow closed this as completed on Nov 28, 2019