
[RLlib] Moving sampling coordination for batch_mode=complete_episodes to synchronous_parallel_sample. #46321

Conversation

simonsays1980 (Collaborator) commented on Jun 28, 2024

Why are these changes needed?

When sampling complete episodes, each EnvRunner previously sampled `train_batch_size` timesteps before returning. This made sampling inefficient and led to long waiting times when slow environments were used. Furthermore, adding more EnvRunners could not reduce the per-runner workload. This PR moves the coordination of sampling for `complete_episodes` entirely into `synchronous_parallel_sample`, which can coordinate across all EnvRunners. This should reduce sampling duration roughly linearly with the number of EnvRunners chosen.
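
For illustration, here is a minimal, framework-free sketch of the coordination pattern described above: every EnvRunner returns exactly one complete episode per call, and the caller keeps issuing rounds of calls until the accumulated timesteps reach the train batch size. The names EnvRunnerStub, sample_one_episode, and collect_complete_episodes are made up for this sketch and are not RLlib APIs; in RLlib the loop lives in synchronous_parallel_sample and the per-round calls are dispatched to the remote EnvRunners in parallel.

import random

class EnvRunnerStub:
    """Stand-in for an EnvRunner; each call yields one complete episode's length."""
    def sample_one_episode(self) -> int:
        # Pretend an episode lasts a random number of timesteps.
        return random.randint(50, 200)

def collect_complete_episodes(runners, train_batch_size):
    """Request one complete episode from every runner per round and repeat
    until the accumulated timesteps reach the train batch size."""
    total_timesteps = 0
    while total_timesteps < train_batch_size:
        # One round: each runner contributes exactly one complete episode.
        # In RLlib this round would be issued in parallel via Ray remote calls.
        total_timesteps += sum(runner.sample_one_episode() for runner in runners)
    return total_timesteps

runners = [EnvRunnerStub() for _ in range(4)]
print(collect_complete_episodes(runners, train_batch_size=4000))

Because each round spreads the work over all runners, adding runners shortens the wall-clock time per round, which is where the roughly linear speed-up comes from.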

Related issue number

Closes #45826

Checks

  • I've signed off every commit (by using the -s flag, i.e., git commit -s) in this PR.
  • I've run scripts/format.sh to lint the changes in this PR.
  • I've included any doc changes needed for https://docs.ray.io/en/master/.
    • I've added any new APIs to the API Reference. For example, if I added a
      method in Tune, I've added it in doc/source/tune/api/ under the
      corresponding .rst file.
  • I've made sure the tests are passing. Note that there might be a few flaky tests, see the recent failures at https://flakey-tests.ray.io/
  • Testing Strategy
    • Unit tests
    • Release tests
    • This PR is not tested :(

…ot reducing workload when scaled and b) was using 'train_batch_size', neglecting 'train_batch_size_per_learner'.

Signed-off-by: simonsays1980 <simon.zehnder@gmail.com>
simonsays1980 added labels on Jun 28, 2024: bug (Something that is supposed to be working; but isn't), rllib (RLlib related issues), rllib-evaluation (Bug affecting policy evaluation with RLlib), rllib-envrunners (Issues around the sampling backend of RLlib)
simonsays1980 self-assigned this on Jun 28, 2024
sven1977 changed the title from "[RLlib] - Moving sampling coordination for 'batch_mode=complete_episodes' to synchronous_parallel_sample." to "[RLlib] Moving sampling coordination for 'batch_mode=complete_episodes' to synchronous_parallel_sample." on Jun 28, 2024
# Before: for complete episodes mode, sample as long as the number of timesteps
# done is smaller than the `train_batch_size`.
# After: for complete episodes mode, sample a single episode and
# leave coordination of sampling to `synchronous_parallel_sample`.
Contributor:
I very much like this!

Can we add a small TODO comment here that this logic, currently handled by synchronous_parallel_sample, will eventually be moved fully into EnvRunnerGroup? So from the algo, you would do:

if self.config.batch_mode == "complete_episodes":
    self.env_runner_group.sample(num_timesteps=[batch size], complete_episodes=True)

something like this ^. Don't have to do this in this PR!
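
Purely as an illustration of this suggestion (EnvRunnerGroup exposes no such complete_episodes flag today; the class, parameters, and loop below are hypothetical, and the sketch reuses the EnvRunnerStub from the example further above), such a group-level method could hide the same coordination loop behind a single call:

class EnvRunnerGroupStub:
    """Hypothetical group object that owns the runners and the coordination loop."""
    def __init__(self, runners):
        self.runners = runners

    def sample(self, num_timesteps, complete_episodes=False):
        # Hypothetical signature mirroring the proposed API above; only the
        # complete-episodes case is sketched here.
        assert complete_episodes
        total = 0
        while total < num_timesteps:
            total += sum(r.sample_one_episode() for r in self.runners)
        return total

# The algorithm's sampling step would then shrink to a single call, e.g.:
# env_runner_group = EnvRunnerGroupStub([EnvRunnerStub() for _ in range(4)])
# env_runner_group.sample(num_timesteps=4000, complete_episodes=True)

Encapsulating the loop this way would keep the algorithm code declarative and turn the complete-episodes handling into an implementation detail of the EnvRunnerGroup.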

Collaborator Author:

Awesome. I would love this move!

sven1977 (Contributor) left a comment:

Nice elegant cleanup PR. Thanks @simonsays1980 !

Just one comment line to be added.

sven1977 marked this pull request as ready for review on June 28, 2024 11:15
Signed-off-by: simonsays1980 <simon.zehnder@gmail.com>
simonsays1980 changed the title from "[RLlib] Moving sampling coordination for 'batch_mode=complete_episodes' to synchronous_parallel_sample." to "[RLlib] Moving sampling coordination for batch_mode=complete_episodes to synchronous_parallel_sample." on Jun 28, 2024
sven1977 enabled auto-merge (squash) on June 28, 2024 11:59
github-actions bot added the label: go (add ONLY when ready to merge, run all tests) on Jun 28, 2024
…th 'complete_episodes' sampling happens multiple times until the number of timesteps for the 'train_batch_size' is reached.

Signed-off-by: simonsays1980 <simon.zehnder@gmail.com>
github-actions bot disabled auto-merge on July 3, 2024 12:37
Signed-off-by: simonsays1980 <simon.zehnder@gmail.com>
sven1977 merged commit 3bdcab6 into ray-project:master on Jul 4, 2024
6 checks passed
Successfully merging this pull request may close these issues.

RLlib - The batch for module_id default_policy is empty!