Add docker-compose support for easier and more portable environment set-up #519
base: main
Conversation
```yaml
name: lerobot

services:
  lerobot:
```
In future PRs we could split this service up into `lerobot-x86` and `lerobot-arm` to support multiple platforms (like Jetson).
For now, I think let's see how people use this, improve the experience, then tackle multi-arch support in the future.
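For reference, a rough sketch of how that split could eventually look in the compose file (service names, build contexts, and `platform` values below are assumptions, not part of this PR):

```yaml
# Hypothetical future layout -- everything here is illustrative, not from this PR.
services:
  lerobot-x86:
    build:
      context: .
    platform: linux/amd64

  lerobot-arm:
    build:
      context: .
    platform: linux/arm64   # e.g. Jetson devices
```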
@Cadene I think this is ready for review. I'd love to merge it; I'd be happy to take responsibility, be the code owner for docker issues, and improve it as people bring in feedback. I've spent the last two weeks developing in this branch, using docker exclusively, and I'm pretty happy with the experience. Everything works great, including integration with huggingface and wandb. I've recorded, trained, teleoped, and visualized without issues. Display forwarding works great. That annoying bug with the opencv/ffmpeg/pyav dependency conflict is also fixed! Let me know what you think!
What this does
This PR enables users to easily build and enter a docker container that has lerobot and its dependencies installed.
Installing lerobot is now as easy as running the commands documented in the README, and that plops you right into a container within a tested lerobot environment.
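Roughly, the flow looks something like this (the exact commands are the ones in the README; the snippet below is only an illustrative sketch):

```bash
# Illustrative only -- see the README in this PR for the exact commands.
docker compose build          # build the lerobot image
docker compose run lerobot    # enter the container (assuming the service's default command is a shell)
```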
Features:
Why:
The `control_robot` record functionality expects to save video using `libsvtav1` (see `video_utils.encode_video_frames`). However, for whatever reason my version of ffmpeg doesn't support that encoder, and I had to manually edit code to get off the ground. This froze my development, since I also wanted to run `imshow` to check out the cameras when debugging policies. Anyways, I think it'd be exciting to give users another way to install dependencies.
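As an aside, a quick way to check whether a local ffmpeg build actually ships that encoder (standard ffmpeg usage, not something introduced by this PR):

```bash
# Lists available encoders and filters for the SVT-AV1 one; prints nothing if it's missing.
ffmpeg -hide_banner -encoders | grep -i svtav1
```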
I don't want to get too far into this without some buy-in. Please let me know if this is interesting and wanted. Cheers!
How it was tested
If folks agree this is desired, I'll add a CI pipeline that validates the build.
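A minimal GitHub Actions job along these lines would probably suffice (the workflow name, file path, and trigger paths below are assumptions, not part of this PR):

```yaml
# .github/workflows/docker-build.yml -- hypothetical sketch
name: docker-build
on:
  pull_request:
    paths:
      - "Dockerfile*"
      - "docker-compose*"
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build the compose services
        run: docker compose build
```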
I'm still working on testing this myself, and I'll report back after feedback from reviewers. Here's what I've tested so far:
How to check out & try? (for the reviewer)
Just go ahead and check out my fork, and run the commands documented in the README. I'm most interested in tests on different hardware. I was thinking of supporting GPU and CPU cases, but is it worth supporting ARM images (a la Jetson) as well?
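The checkout itself is standard git; the fork URL and branch name below are placeholders, and the actual compose commands are the ones in the README:

```bash
# Placeholders only -- substitute the real fork URL and branch name.
git clone https://github.com/<fork-owner>/lerobot.git
cd lerobot
git checkout <docker-compose-branch>
# Then follow the README for the docker compose build/run steps.
```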
TODO:
For a future PR: `docker run lerobot` or `docker run lerobot-arm`. It would be nice to have easy set-up on Jetson devices!
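As a sketch of what that future usage could look like once images are published (the image names and flags below are assumptions):

```bash
# Hypothetical once prebuilt images exist -- image names are placeholders.
docker run --gpus all -it lerobot    # x86 host with an NVIDIA GPU
docker run -it lerobot-arm           # ARM host, e.g. a Jetson
```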