# Distributed data collection examples

If your algorithm is bound by data collection speed, consider using a distributed data collector to speed up training. TorchRL offers several distributed data collectors that can increase collection speed tenfold or more.
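As a rough orientation, a minimal sketch of how such a collector might be instantiated is shown below. It is not taken from these examples: the environment name, worker count, frame budgets, and the `launcher` argument are illustrative assumptions, and the exact constructor signature of `DistributedDataCollector` may vary across TorchRL versions, so check the documentation before running it.

```python
# Minimal sketch (illustrative, not from this repo): a distributed collector
# spawning local workers. Parameter values are assumptions; verify against
# the TorchRL documentation for your version.
from torchrl.collectors.distributed import DistributedDataCollector
from torchrl.envs.libs.gym import GymEnv


def make_env():
    # Each remote worker builds its own environment instance.
    return GymEnv("CartPole-v1")


if __name__ == "__main__":
    collector = DistributedDataCollector(
        [make_env] * 4,          # one env constructor per remote worker (assumed layout)
        policy=None,             # None -> random policy; replace with your actor module
        frames_per_batch=1024,   # frames delivered per iteration
        total_frames=1_000_000,  # total frames to collect before stopping
        launcher="mp",           # spawn workers locally; a cluster launcher would go here (assumed flag)
    )
    for batch in collector:
        # batch is a TensorDict of trajectories ready for a training step.
        pass
    collector.shutdown()
```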

These examples are divided into a single-machine series and a multi-node series.

Refer to the documentation for more insight into what you can expect to do and how these tools should be used.