Too many workers #43
Sorry, I realized I wasn't too clear. So, I'm running python manage.py qcluster. Checking the processes, it has five processes running, each using around 90Mb. Right now, I'm only using django-q for sending out mails in the background and it doesn't matter if it's fast or not; I primarily need to reduce the memory footprint. I hope this makes sense! Cheers!
What you are seeing are not workers, but 4 auxiliary processes + 1 worker.
Sure! Here's the result:
Edit: These results are from a freshly restarted process
Ok, this is why Linux memory management is so confusing. Even though it reports around 90Mb per process, these are all forked child processes and do in fact use the same memory space through copy-on-write. The actual memory consumed will more likely be around 100Mb total. Check how much memory is freed up when you stop the cluster.
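The double-counting described above can be checked directly: instead of adding up RSS (which counts shared copy-on-write pages once per process), sum each process's proportional set size (PSS). A minimal sketch, assuming a Linux host where `/proc/<pid>/smaps_rollup` is available (kernel 4.14+); the function name is mine, not part of django-q:

```python
# Sketch (Linux-only assumption): read a process's PSS in kB from /proc.
# PSS divides each shared page's cost among the processes sharing it, so
# summing PSS over the qcluster tree gives a realistic total, unlike RSS.
import os

def pss_kb(pid):
    """Return PSS in kB for `pid`, or None if /proc data is unavailable."""
    try:
        with open(f"/proc/{pid}/smaps_rollup") as f:
            for line in f:
                if line.startswith("Pss:"):
                    return int(line.split()[1])
    except OSError:
        return None
    return None

# Example: measure the current process; substitute the qcluster PIDs
# (e.g. from `ps --ppid <cluster-pid>`) and sum the results.
print(pss_kb(os.getpid()))
```

Summing `pss_kb` over the five qcluster processes should come out far below five times the reported 90Mb, confirming the copy-on-write sharing.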
Another thing you can do is install
Some background info on what to expect: when the cluster is started it will use about the same amount of memory as your Django project, since it is just a copy of it and all the processes (still) share the same memory. There will be a little overhead from spawning the workers, usually about +5%. It looks like you have quite a large Django project (around 85Mb). Realistically, with 2-4 workers and quite a lot of emails, your memory footprint should not exceed 150Mb over time, if you set the recycle option.

The highest memory usage I've seen during testing was about 260Mb for an 8 worker cluster with a recycle of 1000. This was after a 100,000-task complex math load test heating up my CPUs.
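Putting the advice above together, a low-memory setup comes down to a small `Q_CLUSTER` dict in `settings.py`. The `workers` and `recycle` keys are the documented django-q options discussed here; the name, broker choice, and values below are illustrative assumptions, not a recommendation from the thread:

```python
# Hypothetical settings.py fragment for a memory-constrained mail sender.
# "workers" caps the pool size; "recycle" respawns each worker after N
# tasks so memory that drifted away from copy-on-write is returned.
Q_CLUSTER = {
    "name": "mailer",     # arbitrary cluster name (assumption)
    "workers": 2,         # small pool: background mail is not latency-sensitive
    "recycle": 500,       # restart a worker after 500 processed tasks
    "timeout": 60,        # seconds before a task is considered hung
    "orm": "default",     # use the Django ORM as broker (assumption)
}

print(Q_CLUSTER["workers"], Q_CLUSTER["recycle"])
```

Note that even with `workers: 1` you will still see the auxiliary processes (pusher, monitor, sentinel, scheduler) alongside the worker, as explained earlier in the thread.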
Hi again, sorry for the delay (sitting on a deserted island in Thailand right now, so it's the completely wrong timezone :)) Thank you very much! I look forward to following this project!
Maybe you should look into opening a support ticket with Webfaction. I host most of my projects on Heroku, which has the same 512Mb limit. My largest project is about 75Mb and 4 workers don't use more than about 125Mb according to Heroku's own memory reporter. Even my 4 worker Gunicorn setup uses only 150Mb. That said, I do run Django-Q on a separate machine.
Hi again (sorry for bothering you!)
I just pushed django-q to my external server, and saw that with the configuration set to 1 worker, django-q launches five processes. How can I reduce this, since it's a shared server with limited memory?
Thank you!