
High RAM usage when proxying WebDAV (& possibly normal traffic as well) #1310

Closed
coderobe opened this issue Dec 27, 2016 · 20 comments · Fixed by #1314
Labels
bug 🐞 Something isn't working

Comments

@coderobe

1. What version of Caddy are you running (caddy -version)?

Caddy 0.9.4

2. What are you trying to do?

Proxy an entire (sub)domain through to another server (which mostly handles WebDAV)

3. What is your entire Caddyfile?

example.com {
  header / -Server
  log / /srv/www/log/access.log "{host}: {scheme} request by {remote}: {method} {proto} {path} {query}" {
    rotate {
      size 256
      age 14
      keep 3
    }
  }
  errors {
    log /srv/www/log/error.log {
      size 256
      age 14
      keep 3
    }
  }
  gzip
  root /srv/www/dummy

  proxy / 10.10.10.10:81 {
    header_upstream Host {host}
    header_upstream X-Real-IP {remote}
    header_upstream X-Forwarded-For {remote}
    header_upstream X-Forwarded-Proto {scheme}
    header_upstream X-Frame-Options SAMEORIGIN
  }
}

4. How did you run Caddy (give the full command and describe the execution environment)?

I'm using systemd to start Caddy, and the command the systemd unit executes is:
/usr/bin/caddy -log stdout -agree=true -conf=/srv/www/Caddyfile -root=/var/tmp -email hostmaster@example.com -quic

5. What did you expect to see?

Caddy serving/proxying the service as usual

6. What did you see instead (give full error messages and/or log)?

High RAM usage
When idle, Caddy uses about 100 MB of RAM.
After starting a big WebDAV operation that went through the proxy, usage jumped to about 1 GB;
10 minutes later it was at 2 GB, and shortly after the (successfully) proxied operation finished, memory usage went back down to about 100 MB.

7. How can someone who is starting from scratch reproduce this behavior as minimally as possible?

Although this can probably be reproduced more easily, my setup was:
nginx serving a (fresh) Nextcloud instance on 10.10.10.10:81 (LAN, plain HTTP, no TLS),
Caddy proxying the connection as shown in the Caddyfile above.
I then mounted the Nextcloud instance locally via WebDAV and pushed about 50 GB through it (over a gigabit uplink).

IIRC the memory usage started climbing after just a few gigabytes.
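
For a rougher reproduction without davfs, a single large WebDAV PUT through the proxy while watching Caddy's resident memory should show the same growth. The URL, credentials and file below are placeholders, not my real setup:

# create a ~10 GB test file and push it through the proxy as a single WebDAV PUT
dd if=/dev/zero of=bigfile bs=1M count=10240
curl -u user:password -T bigfile https://example.com/remote.php/webdav/bigfile

# in a second terminal, watch Caddy's resident set size (Linux, single caddy process)
watch -n 5 'ps -o rss= -C caddy'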


@lhecker

lhecker commented Dec 29, 2016

@coderobe I'm not familiar with WebDAV, but I guess those 50 GB of data you tried to proxy were within a single request, right?
Well, I believe I have a fix for this and will open a PR soon. Using my patch I was able to proxy a 10 GB request, which took about 40 seconds, while the memory usage stayed at 5 MB (not a typo). 🙂
(You'll probably see far higher throughput, as well as a higher baseline memory usage, though.)

Edit: See #1314

@elcore
Collaborator

elcore commented Dec 29, 2016

@lhecker this sounds amazing 👍

@coderobe
Author

@lhecker Nice work! However, I'm not sure how the davfs driver on Linux bundles requests, if at all. I think it does one request per file, and my issue appeared with lots of files from 1 to 50 MB in size each.

@jcgruenhage

The 2 GB for a 50 GB file transfer are cute; for me, Caddy goes up to 14 GB of RAM usage and crashes after 8.6 GB of an 11.2 GB file. Not really what I want :D

@mholt
Member

mholt commented Dec 31, 2016

@jcgruenhage @coderobe Do you have the ability to build from source? Would love it if you could try the PR linked above and see if it resolves the issue.

@jcgruenhage

I should be able to figure that out; I'll see when I find time to try it.

@jcgruenhage

go get github.com/lhecker/caddy/caddy should be enough, right? Because with that, I still get my 14 GB RAM usage...

@mholt
Member

mholt commented Jan 1, 2017 via email

@jcgruenhage

After checking out the branch, go get does not work anymore (with a completely empty GOPATH except for that one repo):

package github.com/mholt/caddy/caddy/https: cannot find package "github.com/mholt/caddy/caddy/https" in any of:
	/usr/lib/golang/src/github.com/mholt/caddy/caddy/https (from $GOROOT)
	/home/jcgruenhage/dev/gopath/src/github.com/mholt/caddy/caddy/https (from $GOPATH)
package github.com/mholt/caddy/caddy/parse: cannot find package "github.com/mholt/caddy/caddy/parse" in any of:
	/usr/lib/golang/src/github.com/mholt/caddy/caddy/parse (from $GOROOT)
	/home/jcgruenhage/dev/gopath/src/github.com/mholt/caddy/caddy/parse (from $GOPATH)
package github.com/mholt/caddy/caddy/setup: cannot find package "github.com/mholt/caddy/caddy/setup" in any of:
	/usr/lib/golang/src/github.com/mholt/caddy/caddy/setup (from $GOROOT)
	/home/jcgruenhage/dev/gopath/src/github.com/mholt/caddy/caddy/setup (from $GOPATH)
package github.com/mholt/caddy/middleware: cannot find package "github.com/mholt/caddy/middleware" in any of:
	/usr/lib/golang/src/github.com/mholt/caddy/middleware (from $GOROOT)
	/home/jcgruenhage/dev/gopath/src/github.com/mholt/caddy/middleware (from $GOPATH)
package github.com/mholt/caddy/server: cannot find package "github.com/mholt/caddy/server" in any of:
	/usr/lib/golang/src/github.com/mholt/caddy/server (from $GOROOT)
	/home/jcgruenhage/dev/gopath/src/github.com/mholt/caddy/server (from $GOPATH)

As soon as I switch back to master, it works again (with go get, that is, not with a binary built from the PR's source). Am I doing something wrong? Could you just send me a Linux x64 binary?

@mholt
Member

mholt commented Jan 1, 2017

After you check out the branch, you should not need to run go get: once you've run go get -u on master, check out the unbuffered_proxy branch with git, then go run main.go to run Caddy. If that still gives you trouble, I can send you a binary.
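
Roughly, assuming you test @lhecker's fork directly (branch name as above; the GOPATH paths are just illustrative):

# fetch Caddy and its dependencies once, on master
go get -u github.com/mholt/caddy/...

# pull in the PR branch from the fork and check it out
cd "$GOPATH/src/github.com/mholt/caddy"
git remote add lhecker https://github.com/lhecker/caddy.git
git fetch lhecker
git checkout -b unbuffered_proxy lhecker/unbuffered_proxy

# run Caddy straight from source
cd caddy
go run main.go -conf /path/to/Caddyfile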

@jcgruenhage

Ahhh, my bad, I was on dynamic-proxy and not unbuffered_proxy. No wonder it did not work :D

@jcgruenhage

jcgruenhage commented Jan 1, 2017

Okay, I still cannot complete the upload, but only because there is another Caddy instance between the uploading computer and Nextcloud that I cannot update (I tried to copy the binary into the Docker container, but it won't start up afterwards). The local Caddy proxy stays at 0.1% RAM usage.

@mholt
Member

mholt commented Jan 1, 2017

I don't know about Docker, but maybe setting the env var CGO_ENABLED=0 and re-compiling will help? 🤷‍♂️ Anyway, it sounds like the patch has helped, though?

@jcgruenhage

Yes, it helped :)
The Docker container told me it could not find the binary after I put it in /usr/bin/caddy (that's where the old binary was, too). I could just switch to another Docker image that builds Caddy from source; that would fix the problem completely for me ^^

@jcgruenhage

When I add ENV CGO_ENABLED 0 to my Dockerfile, Caddy doesn't seem to compile. It does not print any errors, but there is no $GOPATH/bin folder after running go get. Without the environment variable, I get errors about cgo not being there. Any idea?

@mholt
Member

mholt commented Jan 2, 2017

Not sure, but you'd want that env variable set when you compile Caddy, not when you run it.
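
That is, set it for the build step itself, along these lines (paths illustrative):

cd "$GOPATH/src/github.com/mholt/caddy/caddy"
CGO_ENABLED=0 go build -o caddy .   # should give a statically linked binary for minimal containers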

@jcgruenhage

Installing build-base was enough for Caddy to build successfully in the container. If Go needs that, I wonder why it is not a dependency...
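
For reference, build-base is Alpine's meta-package for gcc, make and the C library headers, so assuming an Alpine-based image this amounts to:

# Alpine: install the C toolchain that cgo needs
apk add --no-cache build-base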

@jcgruenhage

jcgruenhage commented Jan 3, 2017

Just switched to the new container; I can now confirm that Caddy uses only about 9.4 MB of RAM during the transfer. Thanks for the PR, @lhecker!

Edit: Should anyone be interested in the container, you can find it at https://hub.docker.com/r/jcgruenhage/docker-source-caddy/ under the tag unbuffered_proxy; latest and v0.9.4 will use v0.9.4, and I will update latest to newer Caddy releases as they come out. While there are smaller containers, I think 50 MB (20 MB compressed) is small enough :D

@lhecker

lhecker commented Jan 4, 2017

Glad to hear it's working for you as well, @jcgruenhage!
