NormalFlow: Fast, Robust, and Accurate
Contact-based Object 6DoF Pose Tracking
with Vision-based Tactile Sensors
NormalFlow is a tactile-based object tracking algorithm that is significantly more accurate and robust than other approaches and performs well even on low-textured objects such as a table tennis ball, an egg, or a flat table. It runs at 70 Hz on a standard CPU. For additional details and results, please see our website and the RA-L paper.
- Tested on Ubuntu 22.04
- Tested on GelSight Mini and Digit
- Python >= 3.9
- For the demo and example, install gs_sdk.
Clone and install normalflow from source:
git clone git@github.com:rpl-cmu/normalflow.git
cd normalflow
pip install -e .
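If the install succeeded, the package should be importable from Python. A minimal sanity check (assuming the package is importable as `normalflow`, which follows from the repository name):

```python
# Quick check that the editable install is importable; printing the module
# path just confirms it resolves to the cloned source tree.
import normalflow

print(normalflow.__file__)
```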
Real-time tracking demo, filmed live without post-processing
Connect a GelSight Mini sensor (without markers) to your machine and run the command below to start a real-time object tracking demo.
realtime_object_tracking [-d {cpu|cuda}]
After starting, wait a few seconds for a window to appear. Tracking will begin once an object contacts the sensor. Press any key to exit.
- Note: For other GelSight sensors, first calibrate with the GelSight SDK calibration tool, then supply the sensor configuration file and the calibrated model path as arguments when running the demo.
- Note: This demo also serves as an implementation of the long-horizon tracking algorithm presented in the RA-L paper.
This example demonstrates basic usage of NormalFlow. Run the command below to test the tracking algorithm.
test_tracking [-d {cpu|cuda}]
The command reads the tactile video `examples/data/tactile_video.avi`, tracks the touched object, and saves the result to `examples/data/tracked_tactile_video.avi`.
The `normalflow` function in `normalflow/registration.py` implements frame-to-frame NormalFlow tracking, returning the homogeneous transformation from a reference sensor frame to a target sensor frame (see figure below). If tracking fails, it raises `InsufficientOverlapError`. For usage, see `examples/test_tracking.py` and `demos/realtime_object_tracking.py`.
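Below is a minimal frame-to-frame usage sketch. The argument names and order are assumptions for illustration (here, per-pixel surface normal maps and contact masks for the reference and target frames plus an initial transformation guess), as is the import location of `InsufficientOverlapError`; consult `examples/test_tracking.py` for the actual signature.

```python
import numpy as np

# Import location of InsufficientOverlapError is an assumption; it may live elsewhere in the package.
from normalflow.registration import normalflow, InsufficientOverlapError

# Placeholder inputs for illustration: per-pixel surface normal maps and
# contact masks for the reference and target tactile frames, typically
# obtained from a calibrated GelSight reconstruction (e.g., via gs_sdk).
N_ref = np.zeros((240, 320, 3))      # reference normal map (placeholder)
C_ref = np.zeros((240, 320), bool)   # reference contact mask (placeholder)
N_tar = np.zeros((240, 320, 3))      # target normal map (placeholder)
C_tar = np.zeros((240, 320), bool)   # target contact mask (placeholder)
T_init = np.eye(4)                   # initial guess: identity transformation

try:
    # Estimate the 4x4 homogeneous transformation from the reference
    # sensor frame to the target sensor frame (argument list is assumed).
    T_tar_ref = normalflow(N_ref, C_ref, N_tar, C_tar, T_init)
    print(T_tar_ref)
except InsufficientOverlapError:
    # Raised when the contact regions overlap too little to track reliably.
    print("Tracking failed: insufficient overlap between frames")
```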
To reproduce the main results from our paper, which compares NormalFlow with baseline algorithms, please visit the NormalFlow Experiment repository.
If you find this package useful, please consider citing our paper:
@ARTICLE{huang2024normalflow,
author={Huang, Hung-Jui and Kaess, Michael and Yuan, Wenzhen},
journal={IEEE Robotics and Automation Letters},
title={NormalFlow: Fast, Robust, and Accurate Contact-based Object 6DoF Pose Tracking with Vision-based Tactile Sensors},
year={2024},
volume={},
number={},
pages={1-8},
keywords={Force and Tactile Sensing, 6DoF Object Tracking, Surface Reconstruction, Perception for Grasping and Manipulation},
doi={10.1109/LRA.2024.3505815}}