
Handle expired links better to avoid false positives #562

Open
joshtrichards opened this issue Feb 19, 2024 · 8 comments

Comments

@joshtrichards
Member

Currently, it only takes a handful of users clicking on an expired link for their IP range to land on the naughty list and get blocked very quickly.

Originally posted by @DanScharon in #4 (comment)

@joshtrichards
Member Author

Related: nextcloud/server#42614

@DaphneMuller

@Fenn-CS or @sorbaugh, is there any update here? The bug is reproducible, as communicated by Anupam. The ticket will miss the SLA soon (1 working day left).

@Fenn-CS

Fenn-CS commented Aug 21, 2024

Being looked into currently. @DaphneMuller

@DaphneMuller

@Fenn-CS please confirm in which release the fix will be available. We can then inform the customer. Thanks!

@Fenn-CS

Fenn-CS commented Aug 22, 2024

I’ve been able to somewhat reproduce this issue, where I encounter a 429 "Too Many Requests" response.

[Screenshot from 2024-08-22: 429 "Too Many Requests" response]

However, I’m not entirely sure if this scenario mirrors the situation where tens or hundreds of users might be blocked, particularly when it’s a case of multiple users attempting to access the same URL repeatedly from within the same network.

In my reproduction case, the 429 error occurs when a single user (or, apparently, several users behind a shared IP) repeatedly visits the URL within a short time frame. This suggests that the rate-limiting mechanism can be triggered even under normal usage conditions. Note that this issue may not be specific to sharing itself; it is simply more likely to be observed with sharing because of how often shared links are revisited.

Given that shared links are more likely to be accessed multiple times, especially over time, should we consider making an exception or adjusting the rate-limiting rules specifically for these cases?
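For illustration only (this is not Nextcloud's code, and the LIMIT/WINDOW numbers are invented), a toy sliding-window limiter keyed on (IP, URL) shows why repeated revisits of a single share link from one address eventually draw a 429:

```python
import time
from collections import defaultdict, deque

# Hypothetical limits: at most 10 hits per 60 s for each (ip, url) pair.
LIMIT, WINDOW = 10, 60.0
_hits = defaultdict(deque)  # (ip, url) -> deque of hit timestamps

def check(ip, url, now=None):
    """Return 200 if the request is allowed, 429 once the window fills up."""
    now = time.monotonic() if now is None else now
    q = _hits[(ip, url)]
    # Drop timestamps that have aged out of the window.
    while q and now - q[0] > WINDOW:
        q.popleft()
    if len(q) >= LIMIT:
        return 429  # same user/URL tripped the limit under normal revisiting
    q.append(now)
    return 200
```

With these numbers, the 11th visit inside one minute is rejected even though every request is a legitimate revisit of the same share link.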

cc: @nickvergessen @come-nc @blizzz

@DanScharon

I’ve been able to somewhat reproduce this issue, where I encounter a 429 "Too Many Requests" response.
[...]

However, I’m not entirely sure if this scenario mirrors the situation where tens or hundreds of users might be blocked, particularly when it’s a case of multiple users attempting to access the same URL repeatedly from within the same network.

Please test with IPv6. In case of IPv6, a single address from a /64 segment running into the bruteforce protection blocks the whole /64 segment (in our case: every wifi user on campus).
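To make the /64 point concrete, here is a hypothetical Python sketch (not Nextcloud's actual PHP implementation) of throttling keyed on the enclosing /64 network for IPv6: every client in the same campus wifi segment collapses to one key, so one misbehaving address blocks them all.

```python
import ipaddress

def throttle_key(ip_str):
    """Key used for bruteforce accounting: the full address for IPv4,
    but the enclosing /64 network for IPv6 (illustrative simplification)."""
    ip = ipaddress.ip_address(ip_str)
    if ip.version == 6:
        return str(ipaddress.ip_network(f"{ip}/64", strict=False))
    return str(ip)

# Two distinct campus clients share one throttle key:
print(throttle_key("2001:db8:abcd:1234::aa"))  # 2001:db8:abcd:1234::/64
print(throttle_key("2001:db8:abcd:1234::bb"))  # 2001:db8:abcd:1234::/64
```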

@nickvergessen
Member

@sorbaugh

sorbaugh commented Oct 7, 2024

Talking with @icewind1991, it seems the most pragmatic approach would indeed be to add a grace period for links that used to be valid and to disable bruteforce protection for those.
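A minimal sketch of what such a grace period could look like (the 30-day window and the function name are assumptions for illustration, not anything decided in this thread):

```python
from datetime import datetime, timedelta, timezone

GRACE_PERIOD = timedelta(days=30)  # hypothetical window; the issue names no value

def should_register_bruteforce_attempt(expired_at, now=None):
    """Only count a hit against bruteforce protection once the link has been
    expired for longer than the grace period; a recently expired link is
    treated as a benign stale bookmark rather than a probing attempt."""
    now = now or datetime.now(timezone.utc)
    return now - expired_at > GRACE_PERIOD
```

Under this scheme, users clicking a link that expired last week would simply see the "link expired" page without their network segment accumulating bruteforce points.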
