
Latency-based concurrency control #2013

Closed
yanglimingcn opened this issue Nov 24, 2022 · 7 comments

Labels
feature new feature

Comments

@yanglimingcn (Contributor)

brpc has a const concurrency limiter and an auto concurrency limiter for limiting a service's concurrency. The const concurrency limiter requires load-testing the service to pick its configuration, which is cumbersome to operate. The auto concurrency limiter is more flexible and is based on Little's Law, but in practice it frequently reports ELIMIT.
During normal operation, traffic rises and falls and request body sizes vary; for requests that touch disk, disk response times also fluctuate. As a result, the max concurrency and min latency estimated by the auto concurrency limiter's algorithm fluctuate heavily, which causes the frequent ELIMIT errors.
Based on this, here is an idea: could we build a latency concurrency limiter around the user's maximum acceptable request latency, max_latency? Use concurrency * avg_latency as the expected processing time of a request, and report ELIMIT only when it exceeds the user's max_latency. Users' latency requirements are usually much higher than the service's min_latency, so this algorithm would not produce many spurious errors and would be easier to operate.
@wwbmmm could you take a look at whether this algorithm is reasonable?
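The admission check described above can be sketched roughly as follows. This is a minimal illustration, not brpc's actual implementation; the struct and member names (`LatencyConcurrencyLimiter`, `OnRequested`, the `_us` fields) are assumptions made up for this sketch. The core idea matches the proposal: by Little's Law, a new request behind `concurrency` in-flight requests is expected to take about `concurrency * avg_latency`, and the caller returns ELIMIT when that estimate exceeds the user's budget.

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical sketch of the proposed latency-based limiter (names are
// illustrative, not from brpc). The expected time for a new request is
// estimated as concurrency * avg_latency (Little's Law); if it exceeds
// the user-configured max_latency, the caller rejects with ELIMIT.
struct LatencyConcurrencyLimiter {
    int64_t max_latency_us;   // user's latency budget per request
    int64_t avg_latency_us;   // smoothed average latency (e.g. an EMA)
    int current_concurrency;  // requests currently in flight

    // Returns true if the request may proceed; false means the caller
    // should reject it with ELIMIT.
    bool OnRequested() const {
        // Expected processing time of a request arriving now.
        const int64_t expected_latency_us =
            static_cast<int64_t>(current_concurrency) * avg_latency_us;
        return expected_latency_us <= max_latency_us;
    }
};
```

Because max_latency is normally far above the service's min_latency, this check only trips when the backlog genuinely threatens the user's latency budget, rather than on every fluctuation of the measured min_latency.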

@wwbmmm wwbmmm added the feature new feature label Nov 24, 2022
@wwbmmm (Contributor) commented Nov 24, 2022

I think that works.

@yanglimingcn (Contributor, Author)

Then I'll open a PR on my side, and you can take a look when it's ready.

@yanglimingcn (Contributor, Author)

#2027 @wwbmmm I've submitted a PR. It has no unit tests yet; please give it a rough review first and see whether the approach is reasonable. It is all based on modifications to the auto concurrency limiter.

@chenBright (Contributor)

@yanglimingcn could you write documentation for this feature?

@yanglimingcn (Contributor, Author)

OK, I'll add one when I have time.

@yanglimingcn (Contributor, Author)

#2091

@serverglen (Contributor)

closed
