docs: rewrote increasing request rate page
cyberw committed Oct 24, 2024
1 parent 5e800de commit 130835c
docs/increasing-request-rate.rst

Increasing the request rate
===========================

If you're not getting the desired/expected throughput, there are a number of things you can do.

Concurrency
-----------

Increase the number of Users. To fully utilize your target system, you may need a lot of concurrent requests. Note that spawn rate/ramp-up does not change peak load, it only changes how fast you get there. `Wait times <writing-a-locustfile.html#wait-time>`_ and sleeps *do* impact throughput. You can find a whole blog post on this topic `here <https://www.locust.cloud/blog/closed-vs-open-workload-models>`__.
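
For example, wait times cap per-User throughput: with ``wait_time = constant(1)`` and a ~100 ms response time, each User averages at most roughly 0.9 requests/s, so reaching 1000 requests/s would need on the order of 1100 concurrent Users (the numbers here are illustrative assumptions, and the endpoint in this sketch is a placeholder):

.. code-block:: python

    from locust import HttpUser, constant, task

    class CappedUser(HttpUser):
        # Each User sleeps 1 s between tasks, so its request rate can never
        # exceed 1 request/s; total throughput is therefore bounded by the
        # number of running Users, regardless of spawn rate.
        wait_time = constant(1)

        @task
        def index(self):
            self.client.get("/")  # placeholder endpoint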

Load generation performance
---------------------------

If Locust prints a warning about high CPU usage (``WARNING/root: CPU usage above 90%! ...``), try the following:

- Run Locust `distributed <https://docs.locust.io/en/stable/running-locust-distributed.html>`__ to utilize multiple cores & multiple machines
- Try switching to `FastHttpUser <https://docs.locust.io/en/stable/increase-performance.html#increase-performance>`__ to reduce CPU usage (see the sketch after this list)
- Check to see that there are no strange/infinite loops in your code
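
As a rough sketch (class name and endpoint are made up), switching to ``FastHttpUser`` is usually just a matter of changing the base class, since its client API is largely compatible with ``HttpUser``'s for common calls:

.. code-block:: python

    from locust import FastHttpUser, task

    class MyUser(FastHttpUser):  # previously: class MyUser(HttpUser)
        @task
        def index(self):
            # get/post etc. work much like HttpUser's client
            self.client.get("/")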

Also, if you are using a custom client (not HttpUser or FastHttpUser), make sure any client library you are using is `gevent-friendly <https://www.gevent.org/api/gevent.monkey.html>`__; otherwise it will block the entire Python process (essentially limiting you to one User per worker).
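
Here is a minimal sketch of a custom-client User (``some_blocking_lib`` is a hypothetical stand-in for whatever library you use). The important part is that the underlying I/O cooperates with gevent: pure-Python libraries are usually fine because Locust monkey-patches the standard library at import time, while C extensions that do their own I/O may block the whole worker:

.. code-block:: python

    import time

    from locust import User, task

    class CustomClientUser(User):
        @task
        def do_request(self):
            start = time.perf_counter()
            exception = None
            try:
                pass  # some_blocking_lib.request(...) would go here
            except Exception as e:
                exception = e
            # Report the result so it shows up in Locust's statistics
            self.environment.events.request.fire(
                request_type="CUSTOM",  # arbitrary label
                name="do_request",
                response_time=(time.perf_counter() - start) * 1000,  # ms
                response_length=0,
                exception=exception,
            )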

.. note::

    If you're doing really high throughput or using a lot of bandwidth, you may also want to check your network utilization and other OS-level metrics.

If you have issues with load generator performance and would rather pay to make it someone else's problem, you should check out `Locust Cloud <https://locust.cloud/>`__.

Actual issues with the system under test
----------------------------------------

If response times are high and/or increasing as the number of users goes up, then you have probably saturated the system you are testing. This is not a Locust problem, but here are some things you may want to check:

- resource utilization (e.g. CPU, memory & network)
- configuration (e.g. max threads for your web server)
- back end response times (e.g. DB)

There are a few common pitfalls specific to load testing too:

- load balancing (make sure Locust isn't hitting only a few of your instances)
- flood protection (sometimes the high amount of load coming from only a few machines will trigger this)
