Need advice on Solid Queue's memory usage #330
Hey @yjchieng, thanks for opening this issue! 🙏 I think it depends a lot on your app. A brand-new Rails app uses around 74.6MB of memory for me after booting (without Solid Queue, just running Puma). I think the consumption you're seeing comes from all the processes together, not just the supervisor: by measuring free memory before and after starting the supervisor, you're capturing every process it forks as well. Are you running multiple workers or just one? Reducing the number of workers there would help. Another thing that might help is using …
There might also be something else going on, because the only changes from version 0.7.0 to 0.8.2 were to Solid Queue's installation; nothing changed besides the initial installation, so the memory footprint shouldn't have changed. I imagine there is other stuff running on your AWS instance at the same time that might be consuming memory as well.
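Worker and thread counts are set in Solid Queue's configuration file. A minimal sketch of that file, assuming it lives at config/solid_queue.yml (newer versions use config/queue.yml) and following the documented workers/dispatchers format; the specific numbers here are illustrative, not recommendations:

```yaml
production:
  dispatchers:
    - polling_interval: 1
      batch_size: 500
  workers:
    # A single worker process with a few threads keeps the footprint close to
    # one extra copy of the booted app; raise these only as throughput demands.
    - queues: "*"
      threads: 3
      processes: 1
      polling_interval: 0.1
```

Each additional worker process the supervisor forks adds roughly another full copy of the booted app to memory, so on a 1 GB instance keeping `processes` at 1 matters more than tuning `threads`.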
Ruby: 3.3.4
Rails: 7.2.1
Solid Queue: 0.7.0, 0.8.2
I run a Rails app on an AWS EC2 instance with 1 GB of memory.
I noticed the Solid Queue process takes up 15-20% of the instance's memory, making it the single largest process by memory usage.
What I checked:

Supervisor (I use supervisord to manage my Solid Queue process), to see if this is something related to it:
stop via supervisorctl: free memory 276MB
start via supervisorctl: free memory 117MB

Running solid_queue:start directly:
before solid_queue:start: free memory 252MB
after solid_queue:start: free memory 109MB

After upgrading to 0.8.2 (was 0.7.0):
stop via supervisorctl: free memory 220MB
start via supervisorctl: free memory 38MB
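Comparing "free memory" before and after is noisy, since that number also reflects page cache and every other process on the box. A more direct measurement is to sum the resident set size (RSS) of the Solid Queue processes themselves. A sketch, assuming Solid Queue's supervisor and workers have "solid-queue" in their process titles (adjust the pattern for your setup):

```shell
# Sum RSS (reported by ps in KB) of all processes whose command line matches
# a pattern, and print the total in MB.
rss_mb_for() {
  # The [-] bracket trick keeps this function's own awk invocation (whose
  # arguments contain the pattern text) from matching itself.
  ps -eo rss=,args= | awk -v pat="$1" \
    '$0 ~ pat { sum += $1 } END { printf "%.1f\n", sum / 1024 }'
}

rss_mb_for 'solid[-]queue'
```

Running this before and after `supervisorctl start` should tell you how much of the drop in free memory is actually attributable to Solid Queue rather than to supervisord or the page cache.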
I need some advice:
And, thanks a lot for making this wonderful gem. :)