About p99 latency performance with 200 concurrent connections #859
Comments
I will take a look, likely next week.
@spencergibb
@spencergibb Did you get a chance to look at this issue?
No, I have not. I have plans to do it this week.
@spencergibb
@kimmking @spencergibb Hi! Here is the project for testing, along with my hardware details and the benchmark I ran with it.
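(The wrk commands below hit a /demo endpoint that takes delay and length query parameters. A minimal sketch of what such a backend handler could look like in Spring WebFlux is shown here; the controller name and parameter defaults are assumptions for illustration, not code from the linked test project.)

```java
// Hypothetical reconstruction of the /demo endpoint exercised by the wrk
// commands in this thread: wait `delay` milliseconds, then return a body
// of `length` characters. Names and defaults are assumptions, not the
// actual code of the test project.
import java.time.Duration;

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

import reactor.core.publisher.Mono;

@RestController
public class DemoController {

    @GetMapping("/demo")
    public Mono<String> demo(@RequestParam(defaultValue = "0") long delay,
                             @RequestParam(defaultValue = "1") int length) {
        // Build a payload of the requested size and delay the response
        // without blocking the event loop.
        StringBuilder body = new StringBuilder(length);
        for (int i = 0; i < length; i++) {
            body.append('a');
        }
        return Mono.just(body.toString())
                   .delayElement(Duration.ofMillis(delay));
    }
}
```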
@spencergibb @violetagg It seems that the gateway will leak with the example from @goudai |
@goudai Hi, through many tests the latency of spring-cloud-gateway is still unstable. Test project: https://github.com/goudai/gateway-performance-test
My hardware: MacBook Pro (Retina, 15-inch, Mid 2015)

Test response with 100 characters
# direct to app
wrk -t16 -c200 -d30s "http://127.0.0.1:8081/demo?delay=50&length=128" --latency
Running 30s test @ http://127.0.0.1:8081/demo?delay=50&length=128
16 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 54.59ms 1.59ms 78.60ms 71.85%
Req/Sec 220.13 27.08 242.00 84.50%
Latency Distribution
50% 54.70ms
75% 55.56ms
90% 56.43ms
99% 57.99ms
105359 requests in 30.10s, 27.03MB read
Socket errors: connect 0, read 37, write 0, timeout 0
Requests/sec: 3500.33
Transfer/sec: 0.90MB
# direct to app
wrk -t16 -c200 -d5m "http://127.0.0.1:8081/demo?delay=50&length=128" --latency
Running 5m test @ http://127.0.0.1:8081/demo?delay=50&length=128
16 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 54.68ms 1.60ms 98.71ms 74.49%
Req/Sec 220.05 25.81 247.00 84.74%
Latency Distribution
50% 54.80ms
75% 55.61ms
90% 56.39ms
99% 58.11ms
1052161 requests in 5.00m, 269.92MB read
Socket errors: connect 0, read 38, write 0, timeout 0
Requests/sec: 3506.12
Transfer/sec: 0.90MB
# gateway
wrk -t16 -c200 -d30s "http://127.0.0.1:8082/proxy/demo?delay=50&length=128" --latency
Running 30s test @ http://127.0.0.1:8082/proxy/demo?delay=50&length=128
16 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 74.59ms 90.22ms 1.38s 97.58%
Req/Sec 190.68 39.24 242.00 72.58%
Latency Distribution
50% 58.35ms
75% 63.88ms
90% 78.12ms
99% 606.97ms
89132 requests in 30.10s, 20.83MB read
Socket errors: connect 0, read 143, write 0, timeout 0
Requests/sec: 2961.57
Transfer/sec: 708.58KB
# gateway
wrk -t16 -c200 -d30s "http://127.0.0.1:8082/proxy/demo?delay=50&length=128" --latency
Running 30s test @ http://127.0.0.1:8082/proxy/demo?delay=50&length=128
16 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 58.11ms 4.06ms 114.47ms 79.32%
Req/Sec 207.03 24.73 242.00 66.33%
Latency Distribution
50% 57.14ms
75% 59.73ms
90% 63.13ms
99% 72.11ms
99198 requests in 30.10s, 23.18MB read
Socket errors: connect 0, read 58, write 0, timeout 0
Requests/sec: 3295.34
Transfer/sec: 788.45KB
# gateway
wrk -t16 -c200 -d5m "http://127.0.0.1:8082/proxy/demo?delay=50&length=128" --latency
Running 5m test @ http://127.0.0.1:8082/proxy/demo?delay=50&length=128
16 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 59.35ms 12.50ms 537.66ms 96.51%
Req/Sec 203.71 28.72 242.00 77.11%
Latency Distribution
50% 57.19ms
75% 60.04ms
90% 64.25ms
99% 95.41ms
973145 requests in 5.00m, 227.38MB read
Socket errors: connect 0, read 59, write 0, timeout 70
Requests/sec: 3242.90
Transfer/sec: 775.89KB
# gateway
wrk -t16 -c200 -d5m "http://127.0.0.1:8082/proxy/demo?delay=50&length=128" --latency
Running 5m test @ http://127.0.0.1:8082/proxy/demo?delay=50&length=128
16 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 58.74ms 12.98ms 1.76s 97.28%
Req/Sec 204.13 29.73 242.00 77.03%
Latency Distribution
50% 57.12ms
75% 59.90ms
90% 63.79ms
99% 84.39ms
974762 requests in 5.00m, 227.75MB read
Socket errors: connect 0, read 43, write 0, timeout 156
Requests/sec: 3248.28
Transfer/sec: 777.18KB
Test response with 10240 characters
# direct to app
wrk -t16 -c200 -d30s "http://127.0.0.1:8081/demo?delay=50&length=10240" --latency
Running 30s test @ http://127.0.0.1:8081/demo?delay=50&length=10240
16 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 55.31ms 2.05ms 104.88ms 83.04%
Req/Sec 217.39 25.49 242.00 83.81%
Latency Distribution
50% 55.15ms
75% 56.07ms
90% 57.17ms
99% 60.46ms
104115 requests in 30.08s, 1.01GB read
Socket errors: connect 0, read 38, write 0, timeout 0
Requests/sec: 3460.73
Transfer/sec: 34.27MB
# direct to app
wrk -t16 -c200 -d5m "http://127.0.0.1:8081/demo?delay=50&length=10240" --latency
Running 5m test @ http://127.0.0.1:8081/demo?delay=50&length=10240
16 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 55.14ms 1.91ms 119.59ms 77.50%
Req/Sec 218.22 26.11 242.00 83.60%
Latency Distribution
50% 55.10ms
75% 56.05ms
90% 57.12ms
99% 59.91ms
1043460 requests in 5.00m, 10.09GB read
Socket errors: connect 0, read 45, write 0, timeout 0
Requests/sec: 3477.11
Transfer/sec: 34.43MB
# gateway
wrk -t16 -c200 -d30s "http://127.0.0.1:8082/proxy/demo?delay=50&length=10240" --latency
Running 30s test @ http://127.0.0.1:8082/proxy/demo?delay=50&length=10240
16 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 99.65ms 109.23ms 1.99s 93.48%
Req/Sec 140.79 58.62 242.00 63.66%
Latency Distribution
50% 66.85ms
75% 95.96ms
90% 153.18ms
99% 561.19ms
66080 requests in 30.10s, 652.81MB read
Socket errors: connect 0, read 121, write 0, timeout 85
Requests/sec: 2195.41
Transfer/sec: 21.69MB
# gateway
wrk -t16 -c200 -d30s "http://127.0.0.1:8082/proxy/demo?delay=50&length=10240" --latency
Running 30s test @ http://127.0.0.1:8082/proxy/demo?delay=50&length=10240
16 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 69.09ms 27.97ms 545.24ms 92.72%
Req/Sec 178.36 43.57 242.00 71.72%
Latency Distribution
50% 60.37ms
75% 70.52ms
90% 88.61ms
99% 196.79ms
85374 requests in 30.10s, 843.42MB read
Socket errors: connect 0, read 92, write 0, timeout 0
Requests/sec: 2836.10
Transfer/sec: 28.02MB
# gateway
wrk -t16 -c200 -d5m "http://127.0.0.1:8082/proxy/demo?delay=50&length=10240" --latency
Running 5m test @ http://127.0.0.1:8082/proxy/demo?delay=50&length=10240
16 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 72.68ms 49.23ms 1.09s 92.74%
Req/Sec 176.98 56.03 242.00 76.59%
Latency Distribution
50% 58.62ms
75% 64.59ms
90% 99.53ms
99% 303.34ms
823758 requests in 6.21m, 7.95GB read
Socket errors: connect 0, read 107, write 0, timeout 576
Requests/sec: 2211.20
Transfer/sec: 21.84MB
# gateway
wrk -t16 -c200 -d5m "http://127.0.0.1:8082/proxy/demo?delay=50&length=10240" --latency
Running 5m test @ http://127.0.0.1:8082/proxy/demo?delay=50&length=10240
16 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 62.00ms 33.30ms 964.33ms 98.08%
Req/Sec 201.18 31.18 242.00 80.03%
Latency Distribution
50% 57.53ms
75% 60.72ms
90% 66.01ms
99% 136.52ms
958755 requests in 5.00m, 9.25GB read
Socket errors: connect 0, read 31, write 0, timeout 0
Requests/sec: 3194.86
Transfer/sec: 31.56MB
Test response with 40960 characters
# direct to app
wrk -t16 -c200 -d30s "http://127.0.0.1:8081/demo?delay=50&length=40960" --latency
Running 30s test @ http://127.0.0.1:8081/demo?delay=50&length=40960
16 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 57.85ms 3.53ms 91.82ms 74.98%
Req/Sec 207.57 34.03 242.00 77.41%
Latency Distribution
50% 57.32ms
75% 59.85ms
90% 62.23ms
99% 67.67ms
99419 requests in 30.11s, 3.81GB read
Socket errors: connect 0, read 39, write 0, timeout 0
Requests/sec: 3302.33
Transfer/sec: 129.45MB
wrk -t16 -c200 -d5m "http://127.0.0.1:8081/demo?delay=50&length=40960" --latency
Running 5m test @ http://127.0.0.1:8081/demo?delay=50&length=40960
16 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 58.37ms 4.10ms 179.13ms 73.23%
Req/Sec 205.84 36.87 242.00 81.58%
Latency Distribution
50% 57.67ms
75% 60.81ms
90% 63.60ms
99% 69.02ms
983823 requests in 5.00m, 37.66GB read
Socket errors: connect 0, read 48, write 0, timeout 0
Requests/sec: 3278.49
Transfer/sec: 128.51MB
# gateway
wrk -t16 -c200 -d30s "http://127.0.0.1:8082/proxy/demo?delay=50&length=40960" --latency
Running 30s test @ http://127.0.0.1:8082/proxy/demo?delay=50&length=40960
16 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 89.95ms 40.98ms 448.80ms 87.95%
Req/Sec 136.97 47.68 242.00 69.39%
Latency Distribution
50% 75.90ms
75% 101.52ms
90% 139.18ms
99% 249.04ms
65195 requests in 30.05s, 2.49GB read
Socket errors: connect 0, read 111, write 0, timeout 0
Requests/sec: 2169.79
Transfer/sec: 85.00MB
wrk -t16 -c200 -d30s "http://127.0.0.1:8082/proxy/demo?delay=50&length=40960" --latency
Running 30s test @ http://127.0.0.1:8082/proxy/demo?delay=50&length=40960
16 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 68.18ms 23.31ms 410.55ms 93.27%
Req/Sec 179.50 41.13 242.00 71.75%
Latency Distribution
50% 61.88ms
75% 69.95ms
90% 82.59ms
99% 180.85ms
85936 requests in 30.10s, 3.29GB read
Socket errors: connect 0, read 40, write 0, timeout 0
Requests/sec: 2854.96
Transfer/sec: 111.85MB
wrk -t16 -c200 -d5m "http://127.0.0.1:8082/proxy/demo?delay=50&length=40960" --latency
Running 5m test @ http://127.0.0.1:8082/proxy/demo?delay=50&length=40960
16 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 67.48ms 20.86ms 678.85ms 93.38%
Req/Sec 180.58 37.92 242.00 65.77%
Latency Distribution
50% 62.44ms
75% 70.37ms
90% 81.59ms
99% 145.01ms
862514 requests in 5.00m, 33.00GB read
Socket errors: connect 0, read 159, write 0, timeout 0
Requests/sec: 2874.14
Transfer/sec: 112.60MB
wrk -t16 -c200 -d5m "http://127.0.0.1:8082/proxy/demo?delay=50&length=40960" --latency
Running 5m test @ http://127.0.0.1:8082/proxy/demo?delay=50&length=40960
16 threads and 200 connections
Thread Stats Avg Stdev Max +/- Stdev
Latency 72.50ms 34.44ms 1.03s 94.48%
Req/Sec 171.33 38.57 242.00 69.31%
Latency Distribution
50% 64.77ms
75% 74.72ms
90% 90.74ms
99% 198.72ms
815188 requests in 5.00m, 31.19GB read
Socket errors: connect 0, read 170, write 0, timeout 0
Requests/sec: 2716.46
Transfer/sec: 106.42MB
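(For reference, the gateway runs above forward /proxy/** on port 8082 to the backend on port 8081, as the URLs show. A Spring Cloud Gateway route of roughly the following shape would reproduce that topology; this is a minimal sketch and an assumed equivalent of the test project's configuration, not code taken from it.)

```java
// Assumed route configuration matching the URLs used above: requests to
// http://127.0.0.1:8082/proxy/demo have the /proxy prefix stripped and are
// forwarded to the backend at http://127.0.0.1:8081/demo.
import org.springframework.cloud.gateway.route.RouteLocator;
import org.springframework.cloud.gateway.route.builder.RouteLocatorBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class ProxyRouteConfig {

    @Bean
    public RouteLocator proxyRoute(RouteLocatorBuilder builder) {
        return builder.routes()
                .route("demo-proxy", r -> r.path("/proxy/**")
                        .filters(f -> f.stripPrefix(1))
                        .uri("http://127.0.0.1:8081"))
                .build();
    }
}
```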
There's a fundamental flaw in proxy benchmarks where the driver, proxy, and downstream service all run on the same machine competing for the same resources. Posting different results for the same project means something is different with the environment.
Thanks. We have done a spring-cloud-gateway pressure test in a specific environment, but it still shows that the p99 latency is not stable.
I wanted to help test this, but my environment (all pieces on different servers) adds about 8-14 ms in the best case (no proxy). It actually runs faster locally, so I'm not sure how useful the test will be.
@dave-fl don't worry about a few ms. The point would be the claim that the 99th percentile is an order of magnitude worse. |
I am using a different backend service, but it seems stable, and I get slightly better numbers locally. I have wrk on container 1, the gateway on container 2, and the backend on container 3. Normally I use HTTPS for my tests, but I don't want to complicate things here, so I am just doing pure HTTP on the cloud network. I am including multiple gateway tests so you can see how the numbers fluctuate (JVM warm-up, load, etc.).
I'm going to go ahead and close this.
@spencergibb My testing could be flawed (not enough load). I re-tested with Reactor Netty and WebFlux, and when the connection count increased, the latency distributions started to show their colors. More info here.
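(A bare Reactor Netty server of roughly the following shape is the kind of thing such a retest might use. This is a minimal sketch assuming Reactor Netty 1.x; the port, 50 ms delay, and 128-character payload mirror the /demo runs above and are not taken from any actual test code.)

```java
// Minimal stand-alone Reactor Netty server for a comparison test:
// responds on /demo after a fixed 50 ms delay, similar to the backend
// used in the wrk runs above. Port and payload size are assumptions.
import java.time.Duration;

import reactor.core.publisher.Mono;
import reactor.netty.DisposableServer;
import reactor.netty.http.server.HttpServer;

public class PlainReactorNettyServer {

    public static void main(String[] args) {
        // Arbitrary 128-character payload, comparable to length=128 above.
        StringBuilder payload = new StringBuilder();
        for (int i = 0; i < 128; i++) {
            payload.append('a');
        }
        String body = payload.toString();

        DisposableServer server = HttpServer.create()
                .port(8081) // assumed port, matching the direct-to-app tests
                .route(routes -> routes.get("/demo", (request, response) ->
                        response.sendString(Mono.just(body)
                                .delayElement(Duration.ofMillis(50)))))
                .bindNow();

        // Keep the process alive until the server is disposed.
        server.onDispose().block();
    }
}
```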
@spencergibb The reactor-netty project has a huge improvement in a branch and looks like it will ship a new release soon, so I am looking forward to a new version of the gateway.
@spencergibb Hi Spencer, can you retry your wrk test command with the --latency option? I find the p99 latency is 2-3 times that of direct access with -c200. ref: #301