request help: Will limit-req support nodelay and rate in minute #4382
Comments
PR is welcome!
Dividing the rate by 60 is already enough? IMHO there is no need to add a field, since Nginx also does it that way: https://github.com/nginx/nginx/blob/5eadaf69e394c030056e4190d86dae0262f8617c/src/http/modules/ngx_http_limit_req_module.c#L913
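For reference, a minimal sketch of that conversion on top of lua-resty-limit-traffic, which limit-req builds on (the 200 r/m figure, the burst value, and the shared dict name are placeholders; it assumes a fractional rate is acceptable, which ties into the multiply-by-1000 point below):

```lua
local limit_req = require "resty.limit.req"

-- convert 200 requests/minute to requests/second; the fractional value
-- is intentional
local rate = 200 / 60          -- ~3.33 r/s
local burst = 10               -- placeholder burst

-- "my_limit_req_store" must be a lua_shared_dict declared in nginx.conf
local lim, err = limit_req.new("my_limit_req_store", rate, burst)
if not lim then
    ngx.log(ngx.ERR, "failed to instantiate resty.limit.req: ", err)
    return ngx.exit(500)
end
```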
In my company, many users set 200 r/m on remote_address, or even lower. If that is divided by 60, a limit of 1 r/s or 2 r/s is reached easily: modern browsers support HTTP/2, so a single user hitting one Nginx at more than 1 r/s is very common. Now that we have moved this conf to APISIX, some of those users hit the limit and their requests get delayed in production. I think this happens because we used nodelay in the Nginx conf while APISIX only supports delay; in both cases the rate limit is reached, but the burst is not exceeded. I will test it.
Maybe multiplying by 1000 is the key to handling the decimal point?
It's how Nginx works too. Note that r/m is a speed: it doesn't mean you can have 200 requests in a minute, it means you can't go faster than 200 r/m. Driving a car at 36 km/h is the same as 10 m/s.
In this case we need a good burst and/or to use delay.
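To make the burst/delay interaction concrete, here is a rough sketch of the usual flow around incoming() in lua-resty-limit-traffic (the key choice and the error handling are simplified):

```lua
-- commit = true: record this request in the shared dict
local delay, err = lim:incoming(ngx.var.remote_addr, true)
if not delay then
    if err == "rejected" then
        -- over the rate AND over the burst: reject outright
        return ngx.exit(503)
    end
    ngx.log(ngx.ERR, "failed to limit request: ", err)
    return ngx.exit(500)
end

if delay >= 0.001 then
    -- over the rate but within the burst: hold the request so the
    -- outgoing pace stays at the configured rate
    ngx.sleep(delay)
end
```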
Thanks for the reply. After adding nodelay, we solved the online problem.
So maybe rate should be a float, but I think that is not the key point.
I want to submit a PR to add the nodelay function, with a new param as follows.
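(The original snippet is not preserved here; as a hypothetical illustration, it could be a boolean property in the plugin's Lua schema, with a default chosen so the existing behaviour is unchanged. The existing properties below are abridged; only nodelay is the proposed addition.)

```lua
-- hypothetical sketch of the limit-req plugin schema with the new field;
-- the default value for "nodelay" is an assumption
local schema = {
    type = "object",
    properties = {
        rate = {type = "number", exclusiveMinimum = 0},
        burst = {type = "number", minimum = 0},
        key = {type = "string"},
        nodelay = {type = "boolean", default = false},
    },
    required = {"rate", "burst", "key"},
}
```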
Is it OK?
LGTM
Issue description
Will limit-req support nodelay and rate in minutes?
If not, I find that nodelay can be implemented by adding a condition, but resty.limit.req only supports rate in seconds, so how can I implement rate in minutes?
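For what it's worth, a rough sketch of both points, assuming the rate / 60 conversion discussed above and a conf.nodelay flag whose only job is to skip the sleep (the field names are illustrative):

```lua
-- inside the plugin's access phase, after lim:incoming() has returned a delay
if delay >= 0.001 and not conf.nodelay then
    -- default behaviour: pace the request down to the configured rate
    ngx.sleep(delay)
end
-- with conf.nodelay = true the request is forwarded immediately; it is still
-- counted against the rate, so anything beyond rate + burst is still rejected
```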
Environment
Request-help issues without environment information will be ignored or closed.
- apisix version (cmd: apisix version): 2.5
- OS (cmd: uname -a): ubuntu
- OpenResty / Nginx version (cmd: openresty -V or nginx -V):
- Server info (cmd: curl http://127.0.0.1:9090/v1/server_info to get the info from the server-info API):
- luarocks version (cmd: luarocks --version):