Prevent concurrent identical outgoing requests #5474
The other feature requests around improving
If the few requests sent before a cache entry is stored cause high load on the server, surely something is configured incorrectly? 🤔
This may be very theoretical, but I am thinking of scenarios with many users with high RPS. We perform an outgoing request to retrieve the user's access information, so my idea was that by enabling caching we can effectively limit the RPS-per-user for OPA requests. For instance, if a user is making 10 RPS and our cache lifetime is 1 second, there is only 1 RPS between OPA and the service containing the user's access information. Likewise, if we have 10 users at 10 RPS each, that would mean 100 RPS against OPA but only 10 RPS against the service containing the users' access information, if the cache worked as described in this feature request. Currently, however, there is no such guarantee: 10 RPS against OPA may very well mean 10 RPS against the service containing the user's access information.
If that's the case, that's totally fair 🙂
It does sound a bit theoretical :) Philosophically, it seems ideal to me to have any request be as isolated from the others as we possibly can make them, and having one policy evaluation "wait" for another seems like it could cause all sorts of unexpected conditions 😅 So while I appreciate the idea, I think we might want to close this for now, and perhaps revisit it once the more pressing issues with
I guess 🤷
What is the underlying problem you're trying to solve?
If an OPA rule using http.send is evaluated several times concurrently, and the interquery cache does not contain a valid response, OPA will perform several concurrent outgoing HTTP requests to the target web server. This can cause unnecessarily high load on the target web server, partially defeating the purpose of caching if enabled.
Describe the ideal solution
If OPA wants to perform http.send and detects that an identical outgoing request is currently in flight, it should not perform a new concurrent http.send, but rather wait for the in-flight request to complete and reuse its response.
Describe a "Good Enough" solution
Additional Context
This proposal was originally discussed in #5359 as a potential fix, and an attempt was made to implement it in #5374. That PR was closed in favor of a quicker fix. I think this is still a nice-to-have feature, so I'm opening this issue so that it's at least tracked.
I can also note that this feature could help mitigate the problem described in #5381, specifically here, as it could reduce the number of keys in the cache.