We have multiple channels. Some of them have 500 to 1000 subscribers, others have far fewer.
As a result, some subscribers of the heavily loaded channels suffer from delays.
Is there a way to balance CPU capacity not only between channels but also between subscribers, so that subscribers are distributed evenly across the CPUs?
Maybe there's some kind of configuration to speed up delivery?
Thanks!
Hi @yaroslav-phokus, inside the module we don't have control over which worker process a user is connected to; that is handled at a lower level.
To the best of my knowledge, what you can try to improve is the affinity between the Nginx worker processes and the CPU cores, and then the distribution of new connections across the processes, for example with accept_mutex (see the sketch below).
Together these should improve the distribution and optimize delivery.
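A minimal sketch of what that tuning could look like in nginx.conf, assuming a 4-core machine; the affinity masks, connection limit, and delay value are placeholders to adjust for your hardware and nginx version:

```nginx
# Sketch only -- assumes 4 cores; tune the values for your setup.

# one worker per core
worker_processes 4;

# pin each worker to its own core so the scheduler doesn't migrate them
worker_cpu_affinity 0001 0010 0100 1000;

events {
    worker_connections 10240;

    # serialize accept() so new connections are spread across workers
    # instead of piling up on whichever worker wakes up first
    accept_mutex on;

    # how long a worker waits before retrying the mutex (default 500ms)
    accept_mutex_delay 100ms;
}
```

Keep in mind that accept_mutex only influences how new connections are distributed; long-lived subscriber connections stay on the worker that accepted them, so the balance evens out only as clients reconnect. On newer nginx versions you can also look at `worker_cpu_affinity auto` or `listen ... reuseport`, which let nginx or the kernel handle the spreading.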