
Exclude no work executor #1307

Merged: 5 commits from fix/issue-1295 into new-schedulers on Jan 14, 2020

Conversation

@cuonglm (Contributor) commented Jan 10, 2020

For executors that have nothing to do in a given segment, we should not
include them in the list of executors.

To do this, add a new HasWork method to ExecutorConfig. By filtering out
executors with no work when creating a new scheduler, we prevent any
unnecessary work and provide a better UX.

Fixes #1295
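
As a rough illustration of the idea (not the exact k6 API: the HasWork signature, the ExecutionSegment stand-in, and the filter helper below are assumptions made for the sketch), the filtering could look something like this in Go:

    // Sketch only: drop executor configs that have no work in the current
    // execution segment, warning about each one that is skipped. Type names
    // and the HasWork signature are simplified stand-ins for the real
    // definitions in lib/executors.go.
    package sketch

    // ExecutionSegment is a stand-in for lib.ExecutionSegment.
    type ExecutionSegment struct{ From, To float64 }

    // ExecutorConfig is a trimmed-down stand-in for lib.ExecutorConfig,
    // extended with the HasWork method this PR introduces.
    type ExecutorConfig interface {
        GetName() string
        // HasWork reports whether the executor has anything to do
        // in the given execution segment.
        HasWork(es *ExecutionSegment) bool
    }

    // filterConfigs keeps only the configs that have work in the segment.
    // warnf is any printf-style logger, e.g. a logrus entry's Warnf.
    func filterConfigs(
        configs []ExecutorConfig, es *ExecutionSegment,
        warnf func(format string, args ...interface{}),
    ) []ExecutorConfig {
        kept := make([]ExecutorConfig, 0, len(configs))
        for _, sc := range configs {
            if !sc.HasWork(es) {
                warnf("Executor '%s' is disabled for segment %v due to lack of work!",
                    sc.GetName(), es)
                continue
            }
            kept = append(kept, sc)
        }
        return kept
    }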

@codecov-io commented Jan 10, 2020

Codecov Report

Merging #1307 into new-schedulers will increase coverage by 0.07%.
The diff coverage is 72.22%.


@@                Coverage Diff                @@
##           new-schedulers   #1307      +/-   ##
=================================================
+ Coverage           74.92%     75%   +0.07%     
=================================================
  Files                 159     159              
  Lines               12288   12311      +23     
=================================================
+ Hits                 9207    9234      +27     
+ Misses               2590    2587       -3     
+ Partials              491     490       -1
Impacted Files Coverage Δ
lib/executors.go 90.74% <ø> (+7.4%) ⬆️
lib/executor/variable_arrival_rate.go 96.05% <0%> (-0.96%) ⬇️
cmd/run.go 10.04% <0%> (ø) ⬆️
lib/executor/externally_controlled.go 1.06% <0%> (-0.01%) ⬇️
lib/executor/shared_iterations.go 96.93% <100%> (+0.06%) ⬆️
lib/executor/per_vu_iterations.go 94.38% <100%> (+0.12%) ⬆️
lib/executor/constant_looping_vus.go 96.38% <100%> (+0.08%) ⬆️
lib/executor/constant_arrival_rate.go 96.89% <100%> (+0.04%) ⬆️
lib/executor/variable_looping_vus.go 91.7% <100%> (+1%) ⬆️
core/local/local.go 69.66% <88.88%> (+0.36%) ⬆️
... and 3 more

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update f6be35c...e573c02.

Resolved (now outdated) review comments on: lib/executor/externally_controlled.go, core/local/local.go (×4), core/local/local_test.go, lib/executors.go, lib/executor/constant_arrival_rate.go, cmd/run.go
na-- previously approved these changes Jan 13, 2020
imiric previously approved these changes Jan 13, 2020
@imiric (Contributor) left a comment

LGTM!

mstoykov previously approved these changes Jan 14, 2020
@na-- dismissed stale reviews from mstoykov, imiric, and themself via e573c02 on January 14, 2020 09:43
logger.Warnf(
    "Executor '%s' is disabled for segment %s due to lack of work!",
    sc.GetName(), options.ExecutionSegment,
)
A reviewer (Contributor) commented:

I am wondering if instead of
I am wondering if instead of

WARN[0000] Executor 'shared_iters1' is disabled for segment 1/2:3/4 due to lack of work!
   executor: local
     output: -
     script: 1295.js

  execution: (25.00%) 3 executors, 2 max VUs, 10m30s max duration (incl. graceful stop):
           * constant_arr_rate: 0.75 iterations/s for 20s (maxVUs: 1, gracefulStop: 30s)
           * shared_iters1: 0 iterations shared among 0 VUs (maxDuration: 10m0s, gracefulStop: 30s)
           * shared_iters2: 1 iterations shared among 1 VUs (maxDuration: 10m0s, gracefulStop: 30s)


running (00m20.1s), 0/2 VUs, 15 complete and 0 interrupted iterations
constant_arr_rate [======================================] 0.75 iters/s, 1 out of 1 VUs active done!
shared_iters2 [======================================] 1/1 shared iters among 1 VUs done!


    data_received........: 0 B 0 B/s
    data_sent............: 0 B 0 B/s
    iteration_duration...: avg=33.41µs min=4.91µs med=24.81µs max=87.14µs p(90)=74.71µs p(95)=82.18µs
    iterations...........: 15  0.745541/s
    vus..................: 1   min=1 max=1

it would be better for the warning to appear just before the progress bars:

   executor: local
     output: -
     script: 1295.js

  execution: (25.00%) 3 executors, 2 max VUs, 10m30s max duration (incl. graceful stop):
           * constant_arr_rate: 0.75 iterations/s for 20s (maxVUs: 1, gracefulStop: 30s)
           * shared_iters1: 0 iterations shared among 0 VUs (maxDuration: 10m0s, gracefulStop: 30s)
           * shared_iters2: 1 iterations shared among 1 VUs (maxDuration: 10m0s, gracefulStop: 30s)

WARN[0000] Executor 'shared_iters1' is disabled for segment 1/2:3/4 due to lack of work!

running (00m20.1s), 0/2 VUs, 15 complete and 0 interrupted iterations
constant_arr_rate [======================================] 0.75 iters/s, 1 out of 1 VUs active done!
shared_iters2 [======================================] 1/1 shared iters among 1 VUs done!


    data_received........: 0 B 0 B/s
    data_sent............: 0 B 0 B/s
    iteration_duration...: avg=33.41µs min=4.91µs med=24.81µs max=87.14µs p(90)=74.71µs p(95)=82.18µs
    iterations...........: 15  0.745541/s
    vus..................: 1   min=1 max=1

A maintainer (Member) replied:

It will be a bit better, yeah... though that would require some serious refactoring that I'm not sure I want to do right now. Basically, we'd have to reorder a lot of cmd/run.go to achieve this 😞
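
For context, one hypothetical way to get that ordering without reordering all of cmd/run.go would be to buffer the no-work warnings while the scheduler is being built and only print them after the execution plan banner. The sketch below is purely illustrative and is not code from this PR:

    // Hypothetical sketch: accumulate warnings instead of logging them
    // immediately, then flush them just before the progress bars render.
    package sketch

    import "fmt"

    // deferredWarnings accumulates warning messages for later printing.
    type deferredWarnings []string

    // Warnf records a formatted warning instead of printing it right away.
    func (w *deferredWarnings) Warnf(format string, args ...interface{}) {
        *w = append(*w, fmt.Sprintf(format, args...))
    }

    // Flush prints all buffered warnings and clears the buffer.
    func (w *deferredWarnings) Flush(printf func(format string, args ...interface{})) {
        for _, msg := range *w {
            printf("WARN %s\n", msg)
        }
        *w = (*w)[:0]
    }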

The reviewer (Contributor) replied:

Okay, let's leave it for a future PR.

@na-- merged commit e3ac2e0 into new-schedulers on Jan 14, 2020
@na-- deleted the fix/issue-1295 branch on January 14, 2020 10:00