Multiple uWSGI workers #535
@EndenDragon the way I work with multiple workers is by running several one-worker servers on different ports, all behind an nginx reverse proxy configured with `ip_hash` load balancing, which implements sticky sessions. I have never tried this, but it looks like uWSGI supports an iphash load balancing algorithm since version 2.1. This, combined with the Redis queue, may allow you to run multiple uWSGI workers from a single uWSGI instance. Give it a try and report back if it works.
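For reference, the nginx side of the approach described above might look like the following. This is a minimal sketch, assuming three single-worker uWSGI instances running in HTTP mode on ports 5000-5002 (the ports and upstream name are illustrative, not from this thread):

```nginx
# Hypothetical nginx config: ip_hash keeps each client IP pinned to
# the same backend, giving the sticky sessions Socket.IO needs.
upstream socketio_nodes {
    ip_hash;
    server 127.0.0.1:5000;
    server 127.0.0.1:5001;
    server 127.0.0.1:5002;
}

server {
    listen 80;

    location / {
        proxy_set_header Host $host;
        proxy_pass http://socketio_nodes;
    }

    location /socket.io {
        # WebSocket upgrades require HTTP/1.1 and the Upgrade headers.
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "Upgrade";
        proxy_pass http://socketio_nodes;
    }
}
```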
@miguelgrinberg I do not know how to set up the iphash load balancing technique for uWSGI. Where do you see that uWSGI 2.1 supports this feature?
Hmm. The reference that I found is this comment: unbit/uwsgi#83. But I cannot find a single example that shows how to use it, and searching through the source code it appears that iphash is an option that is available for use with their fastrouter plugin, which is a load balancer. I think my method is going to give you far fewer problems. Start a bunch of single-worker uWSGI instances, each on its own port, then configure nginx to do the load balancing.
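The "bunch of single-worker instances" part of that suggestion could be started along these lines. This is a deployment sketch, not a command from this thread; the module name `module:app`, the ports, and the gevent concurrency value are all illustrative assumptions:

```shell
# Hypothetical: launch three single-worker uWSGI instances on
# consecutive ports, each serving the same Flask-SocketIO app.
# --http-websockets enables WebSocket support in uWSGI's HTTP router;
# --gevent sets the number of greenlets per worker.
uwsgi --http 127.0.0.1:5000 --http-websockets --gevent 1000 --master --workers 1 --module module:app &
uwsgi --http 127.0.0.1:5001 --http-websockets --gevent 1000 --master --workers 1 --module module:app &
uwsgi --http 127.0.0.1:5002 --http-websockets --gevent 1000 --master --workers 1 --module module:app &
```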
Suppose I have n CPU cores. As you suggested, the best approach would be to run multiple single-worker uWSGI instances, each on a different port. Would you suggest running n single-worker instances, or 2n+1? Will it work with context switching on the CPU cores?
There is no absolute rule about this; the number of workers you run depends on the load that you need to handle and the amount of work each client generates on the server. If your server is going to be doing mostly I/O, then you can run multiple workers per CPU and everything will still work great.
I am currently running my uWSGI with 1 worker process. May I use the `processes` flag in uWSGI and serve my Flask app under multiple workers? If so, do I have to do anything special to make Flask-SocketIO work with multiple workers? (I have a Redis message queue going on.)
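Since the issue mentions a Redis message queue, the per-instance application file tying this together might look like the sketch below. It uses Flask-SocketIO's documented `message_queue` option; the Redis URL and event name are illustrative assumptions, and every instance must point at the same Redis server so emits reach clients connected to other instances:

```python
# Sketch of the app each single-worker uWSGI instance would serve.
# Assumption: a Redis server is reachable at the URL below.
from flask import Flask
from flask_socketio import SocketIO

app = Flask(__name__)

# All instances share the same Redis queue, so a broadcast emitted by
# one worker is relayed to clients connected to the other workers.
socketio = SocketIO(app, message_queue='redis://127.0.0.1:6379/0')

@socketio.on('message')
def handle_message(data):
    # Broadcast to all clients across every instance via the queue.
    socketio.emit('message', data)

if __name__ == '__main__':
    socketio.run(app)
```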