Serf Lua RPC Client #865
Some info on arbitrary TCP connections: https://groups.google.com/forum/#!topic/openresty-en/88E7-1CIowM
Doing this involves setting up an outgoing TCP connection from OpenResty to the Serf agent running on the same host. The basics work: connecting to the agent and sending/receiving some simple commands. One tricky part is the MessagePack library; it might need changes to support streaming messages.

A harder issue (at least for a proper solution) is the synchronisation between worker processes. If implemented in OpenResty worker processes, each worker will have its own open TCP connection, so each worker will also receive every event. For example, if 3 worker processes all receive an invalidation event, all three will invalidate the in-memory cache, resulting in 3 expensive DB requests instead of only one. That is undesirable. Having only one worker set up the TCP connection to the Serf agent seems dangerous to me, as events might get lost if that worker process crashes. The problem reduces to: how to collapse multiple instances of the same incoming event (one per worker) into a single one.
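For the MessagePack part, the framing itself is small: per the Serf RPC docs, the first thing sent on the connection is a handshake request, a MessagePack map header followed by a map body carrying the protocol version. As a rough illustration (in Python rather than Lua, and covering only the few MessagePack types the handshake needs), a hand-rolled encoder might look like:

```python
def pack(obj):
    """Minimal MessagePack encoder covering only what the Serf
    handshake needs: small dicts (fixmap), short strings (fixstr)
    and small non-negative integers (positive fixint)."""
    if isinstance(obj, dict):                      # fixmap: up to 15 pairs
        out = bytes([0x80 | len(obj)])
        for k, v in obj.items():
            out += pack(k) + pack(v)
        return out
    if isinstance(obj, str):                       # fixstr: up to 31 bytes
        data = obj.encode("utf-8")
        return bytes([0xA0 | len(data)]) + data
    if isinstance(obj, int) and 0 <= obj <= 127:   # positive fixint
        return bytes([obj])
    raise TypeError("unsupported type for this sketch: %r" % (obj,))

# First request on a Serf RPC connection: a handshake header,
# then a body carrying the protocol version.
header = pack({"Command": "handshake", "Seq": 0})
body = pack({"Version": 1})
```

The real client would of course use a full MessagePack library; the sketch only shows that the wire format is simple enough that a streaming-friendly encoder is feasible.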
I think this can be fixed with locks, the same way we handle reports and cluster keep-alive pings. A possible implementation would look like a basic leader election across the workers.
Closing this in favour of #835 (remove dependencies).
Create a Serf client in Lua that follows Serf RPC protocol: https://www.serfdom.io/docs/agent/rpc.html
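On the wire, responses arrive as a stream of concatenated MessagePack maps, so the decoder must report how much of the buffer it consumed in order to handle partial or back-to-back messages. A minimal illustration in Python (not the real client; it covers only fixmap/fixstr/positive fixint, enough to frame a Serf response header such as `{"Seq": 0, "Error": ""}`):

```python
def unpack(buf, pos=0):
    """Decode one MessagePack value from buf starting at pos and
    return (value, next_pos), so a streaming reader can pick up the
    next message where this one ended."""
    b = buf[pos]
    if b <= 0x7F:                       # positive fixint
        return b, pos + 1
    if 0xA0 <= b <= 0xBF:               # fixstr
        n = b & 0x1F
        return buf[pos + 1:pos + 1 + n].decode("utf-8"), pos + 1 + n
    if 0x80 <= b <= 0x8F:               # fixmap
        n = b & 0x0F
        obj, p = {}, pos + 1
        for _ in range(n):
            k, p = unpack(buf, p)
            v, p = unpack(buf, p)
            obj[k] = v
        return obj, p
    raise ValueError("unsupported byte 0x%02x in this sketch" % b)
```

Because `unpack` returns the next offset, a read loop can buffer socket data and peel off complete messages as they arrive, which is the streaming behaviour the MessagePack library would need to support.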