
Multiple uWSGI workers #535

Closed
EndenDragon opened this issue Sep 3, 2017 · 5 comments

@EndenDragon

I am currently running my uWSGI server with one worker process. May I use the `processes` flag in uWSGI and serve my Flask app under multiple workers? If so, do I have to do anything special to make Flask-SocketIO work with multiple workers? (I already have a Redis message queue set up.)

@miguelgrinberg
Owner

@EndenDragon the way I work with multiple workers is to run several one-worker servers on different ports, all behind an nginx reverse proxy configured with ip_hash load balancing, which implements sticky sessions.
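To see why IP-hash load balancing gives sticky sessions, here is a toy stdlib sketch: hashing the client IP deterministically selects a backend, so the same client always reaches the same one-worker server. (This is only an illustration; nginx's actual `ip_hash` algorithm differs in detail, and the addresses below are made up.)

```python
import hashlib

# Hypothetical pool of single-worker Socket.IO servers.
BACKENDS = ["127.0.0.1:5001", "127.0.0.1:5002", "127.0.0.1:5003"]

def pick_backend(client_ip: str) -> str:
    """Map a client IP to a backend deterministically (toy IP-hash)."""
    digest = hashlib.md5(client_ip.encode()).hexdigest()
    return BACKENDS[int(digest, 16) % len(BACKENDS)]

# The same IP always lands on the same server, so a client's long-polling
# requests and WebSocket upgrade all hit the worker that holds its session.
assert pick_backend("203.0.113.7") == pick_backend("203.0.113.7")
```

This determinism is the whole point: without stickiness, a Socket.IO client's follow-up requests could land on a worker that has no knowledge of its session.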

I have never tried this, but it looks like uWSGI has supported the iphash load-balancing algorithm since version 2.1. That, combined with the Redis queue, may allow you to run multiple uWSGI workers from a single uWSGI instance. Give it a try and report back if it works.

@EndenDragon
Author

@miguelgrinberg I do not know how to set up the iphash load-balancing technique for uWSGI. Where do you see that uWSGI 2.1 supports this feature?

@miguelgrinberg
Owner

Hmm. The reference I found is this comment: unbit/uwsgi#83. But I cannot find a single example showing how to use it, and from searching the source code it appears that iphash is an option available with their fastrouter plugin, which is a load balancer.

I think my method is going to cause you far fewer problems. Start a bunch of single-worker uWSGI instances, each on its own port, then configure nginx to do the load balancing.
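The nginx side of that setup might look roughly like this, a sketch assuming three single-worker uWSGI instances serving HTTP on ports 5001-5003 (the ports are made up for illustration). `ip_hash` pins each client IP to one backend, and the `Upgrade`/`Connection` headers let WebSocket connections pass through the proxy:

```nginx
upstream socketio_nodes {
    ip_hash;                    # sticky sessions: same client IP, same server
    server 127.0.0.1:5001;
    server 127.0.0.1:5002;
    server 127.0.0.1:5003;
}

server {
    listen 80;

    location /socket.io {
        proxy_pass http://socketio_nodes/socket.io;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;   # allow WebSocket upgrade
        proxy_set_header Connection "Upgrade";
        proxy_set_header Host $host;
    }

    location / {
        proxy_pass http://socketio_nodes;
    }
}
```

Combined with the shared Redis message queue, this lets the separate instances behave as one Socket.IO server.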

@raushanraj

@miguelgrinberg

the way I work with multiple workers is by running several one-worker servers on different ports, all behind an nginx reverse-proxy configured with iphash load balancing which implements sticky sessions.

Suppose I have n CPU cores. As you suggested, the best approach would be to run multiple single-worker uWSGI instances on different ports. Would you suggest running n single-worker instances or 2n+1? Will it work, given context switching on the CPU cores?

@miguelgrinberg
Owner

There is no absolute rule about this; the number of workers you run depends on the load you need to handle and on the amount of work each client generates on the server. If your server is going to be doing mostly I/O, then you can run multiple workers per CPU and everything will still work great.
