This repository is the official implementation of the paper:
Learning Object Properties Using Robot Proprioception via Differentiable Robot-Object Interaction
Peter Yichen Chen, Chao Liu, Pingchuan Ma, John Eastman, Daniela Rus, Dylan Randle, Yuri Ivanov, Wojciech Matusik
MIT CSAIL, Amazon Robotics, University of British Columbia
International Conference on Robotics and Automation (ICRA), 2025
A big shoutout to the NVIDIA Warp team! Warp integrates seamlessly with PyTorch, streamlining the use of differentiable simulation in Torch-based optimization workflows.
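As a minimal illustration of that interop (a generic sketch, not code from this repository; it relies only on Warp's public `wp.from_torch`/`wp.to_torch` utilities), a Torch tensor can be viewed as a Warp array, processed by a Warp kernel, and handed back to Torch without copying:

```python
import torch
import warp as wp

wp.init()

@wp.kernel
def scale(x: wp.array(dtype=float), y: wp.array(dtype=float), s: float):
    i = wp.tid()
    y[i] = s * x[i]

t = torch.arange(4, dtype=torch.float32)
x = wp.from_torch(t)                           # zero-copy view of the Torch tensor
y = wp.zeros(4, dtype=float, device=x.device)
wp.launch(scale, dim=4, inputs=[x, y, 2.0], device=x.device)
print(wp.to_torch(y))                          # back to Torch: tensor([0., 2., 4., 6.])
```

Warp also provides `wp.Tape` for recording kernel launches and computing gradients, which is what enables the differentiable-simulation workflow used here.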
Install the required packages first:
pip install -r requirements.txt
For visualization, install these optional Python packages (a possible pip command follows the list below):
- bpy
- blendertoolbox
and these tools:
- Blender
- ffmpeg
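Both optional Python packages are published on PyPI (assuming a Python version compatible with `bpy`), so one way to install them is:

pip install bpy blendertoolbox

Blender and ffmpeg are installed separately, e.g., through your system package manager or their official downloads.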
To calibrate object properties, use the following command:
python train.py --config-name hard_ball

To evaluate the calibrated object properties, use the following command:
python eval.py --config-name hard_ball ckpt=experiments/log/robotis_2_hard_ball/open_manipulator/open_manipulator_joint2_only_v2/train/training_stats.pt ckpt_idx=8
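The `ckpt` argument points at a Torch checkpoint produced by training. If you want to peek at it before evaluation, here is a quick sketch (it assumes only that the file loads with `torch.load`; the exact contents are defined by `train.py` and may differ):

```python
import torch

# Inspect the saved training statistics; the layout of this object
# depends on how train.py wrote the file.
stats = torch.load(
    "experiments/log/robotis_2_hard_ball/open_manipulator/"
    "open_manipulator_joint2_only_v2/train/training_stats.pt",
    map_location="cpu",
)
print(type(stats))
if isinstance(stats, dict):
    print(list(stats.keys()))
```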
To visualize the robot, use the following command:

python render_usd.py --usd-path experiments/log/robotis_2_hard_ball/open_manipulator/open_manipulator_joint2_only_v2/test/test_ckpt_idx_0008.usd

To cite this work:

@misc{chen2025learningobjectpropertiesusing,
title={Learning Object Properties Using Robot Proprioception via Differentiable Robot-Object Interaction},
author={Peter Yichen Chen and Chao Liu and Pingchuan Ma and John Eastman and Daniela Rus and Dylan Randle and Yuri Ivanov and Wojciech Matusik},
year={2025},
eprint={2410.03920},
archivePrefix={arXiv},
primaryClass={cs.RO},
url={https://arxiv.org/abs/2410.03920},
}
