
Unclear how to interpret numbers #303

Closed
emiel opened this issue Jul 9, 2015 · 14 comments

Comments

@emiel

emiel commented Jul 9, 2015

The stats displayed in the UI and console do not indicate their units. For median, average, min, and max I'm assuming the value is the response time... Is it in milliseconds? And how about content size, is that in bytes?

The units should be stated in the documentation and also displayed in the tooling.

Thanks! Great tooling so far.

@Abhishekvrshny

+1. Any update on this?

@emiel
Author

emiel commented Oct 7, 2015

It looks like the reported response times are in milliseconds:

https://github.com/locustio/locust/blob/master/locust/clients.py#L116
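
For reference, the measurement there boils down to something like this (a rough sketch, not the actual Locust code; the URL and function name are just for illustration):

```python
import time
import requests

def timed_get(url):
    """Send a GET request and return the response plus the elapsed time in milliseconds."""
    start = time.time()
    response = requests.get(url)
    # Elapsed wall-clock time, converted from seconds to milliseconds.
    response_time_ms = int((time.time() - start) * 1000)
    return response, response_time_ms

response, ms = timed_get("https://example.com/")
print(f"{ms} ms, {len(response.content)} bytes")
```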

@peterbe

peterbe commented Nov 6, 2015

Ideally, I think it should be done with title tooltips. E.g.

<div class="something" title="34.5 milliseconds">34.5</div>

@fatso83

fatso83 commented Jan 18, 2016

@peterbe That looks like a possible pull request from you. Am I right? :-)

@peterbe

peterbe commented Jan 19, 2016

@fatso83 Sure, but I'm not working on (or using) Locust any more, and hence have no time to commit to it.

@square-eyes

These response times don't seem right anyway. I'm seeing reported response times of over 40 seconds, but when I manually hit the web app during a test I get much shorter ones (2-3 seconds). Also, if I stop and restart the swarm, the stats immediately resume showing >40 seconds. How can that be if the test has just started? Is it because it's a rolling average? That still doesn't explain the wild response times. I'm aggregating the stats with the name= parameter; could this affect things?
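
The grouping I'm referring to looks roughly like this (a sketch against a recent Locust API; the class name and paths are made up):

```python
from locust import HttpUser, between, task

class WebsiteUser(HttpUser):
    wait_time = between(1, 2)

    @task
    def view_items(self):
        # Both requests are grouped under the single stats entry "/item",
        # so their response times are averaged into one row in the UI.
        self.client.get("/item?id=1", name="/item")
        self.client.get("/item?id=2", name="/item")
```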

@swordmaster2k

Does this only affect the web interface, or is it also a problem when running with --no-web?

+1 for it to be fixed.

@egherardini

Any news on this?

@cgoldberg
Member

  • stats in the UI are in milliseconds; they are labeled "ms"
  • content size is in bytes (not labeled in the UI)

The docs could definitely be more helpful here.
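
If you want to check the numbers outside the UI, something like the following works (a sketch assuming a recent Locust version and its stats attributes):

```python
from locust import events

@events.quitting.add_listener
def print_summary(environment, **kwargs):
    total = environment.stats.total
    # Response times are reported in milliseconds.
    print(f"avg={total.avg_response_time:.1f} ms, median={total.median_response_time} ms")
    print(f"min={total.min_response_time} ms, max={total.max_response_time} ms")
    # Content size is tracked in bytes.
    print(f"avg content size={total.avg_content_length:.0f} bytes")
```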

@egherardini

Sorry, I was unclear: I understand they are in ms, but they seem unreliable. More often than not, 5-6 seconds after a test starts they rise above 80000 ms, which makes no sense since that's more than 10 times the overall time the test has been running.

I ran the same performance tests with JMeter (on the same system) and didn't see this; I'm just adding information that points to a possible bug.

What do you think? Is it possible there is a bug, or am I missing something?

Thanks and kind regards.

BTW, it is a great library; putting together a stress test is very quick.

@cgoldberg
Member

@egherardini,
that's unrelated to the issue reported by the OP, so I'm closing this one.
Feel free to open a new issue, but keep in mind that vague statements about reliability aren't helpful at all.

@narendraj9

I think those numbers are not reliable at all.

@wolfch-elsevier

I ran headless for 5 minutes. The aggregate report (*_stats.csv) said the "Average Response Time" was 14283 (no units mentioned; assuming ms). However, the "Requests/s" was 39.45, and those two don't jibe at all. The 14-second average response time is WAY off, orders of magnitude off. The 39.45 requests/s seems about right, which would correspond to roughly 25 ms per request, not 14283.
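
(For what it's worth, that requests/s-to-response-time conversion only holds for a single concurrent user; with N requests in flight, throughput is roughly N divided by the average response time, so the two figures can be consistent. A quick Little's Law check, with the concurrency left as the unknown:)

```python
# Little's Law sketch: average concurrency ~= throughput * average response time.
throughput_rps = 39.45          # "Requests/s" from the *_stats.csv report
avg_response_s = 14283 / 1000   # "Average Response Time" in ms, converted to seconds

implied_in_flight = throughput_rps * avg_response_s
print(f"~{implied_in_flight:.0f} requests in flight on average")  # ~563
```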

@heyman
Member

heyman commented Jun 17, 2020

@wolfch-elsevier If you believe you've found a bug, please open a new issue with a minimal reproducible code example. Thanks!
