- Event-based Stereo Visual Odometry, Yi Zhou, Guillermo Gallego, Shaojie Shen, IEEE Transactions on Robotics (T-RO), 37(5):1433-1450, 2021.
- Semi-dense 3D Reconstruction with a Stereo Event Camera, Yi Zhou, Guillermo Gallego, Henri Rebecq, Laurent Kneip, Hongdong Li, Davide Scaramuzza, ECCV 2018.
- IMU-Aided Event-based Stereo Visual Odometry, Junkai Niu, Sheng Zhong, Yi Zhou, ICRA 2024. Video link
We improve our previous direct pipeline, Event-based Stereo Visual Odometry (see the ESVO Project Page), in terms of both accuracy and efficiency. In particular, this work achieves a large improvement in trajectory accuracy on the DSEC dataset.
We have tested ESVO on machines with the following configurations (a quick version check is sketched after this list):
- Ubuntu 18.04.5 LTS + ROS melodic + gcc 5.5.0 + cmake (>=3.10) + OpenCV 3.2
- Ubuntu 20.04 LTS + ROS Noetic + OpenCV 4.2
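If you are unsure which versions your machine has, the following commands print them (a minimal sketch, assuming ROS and OpenCV were installed from the standard Ubuntu/ROS apt repositories):
rosversion -d                      # ROS distribution, e.g. melodic or noetic
pkg-config --modversion opencv4    # OpenCV 4.x; use "opencv" instead of "opencv4" on OpenCV 3.x
cmake --version
gcc --version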
To work with event cameras, especially the Dynamic Vision Sensor (DVS/DAVIS), you need to install the corresponding drivers. Please follow the instructions (steps 1-9) at rpg_dvs_ros before moving on to the next step. Note that you need to replace the name of the ROS distribution with the one installed on your computer.
We use catkin tools to build the code; it should already have been installed during the driver installation.
You should have created a catkin workspace in Section 3.1. If not, please go back and create one.
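If you have not created a workspace yet, a minimal sketch is given below (assuming ROS Noetic and the default ~/catkin_ws path used in the commands that follow; replace noetic with your distribution):
source /opt/ros/noetic/setup.bash   # replace with your ROS distribution
mkdir -p ~/catkin_ws/src            # create the workspace and its src folder
cd ~/catkin_ws
catkin_make                         # the first build initializes build/ and devel/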
Clone this repository into the src folder of your catkin workspace.
cd ~/catkin_ws/src
git clone https://github.com/NAIL-HNU/ESVIO_AA.git
Then clone the required dependency packages into the same src folder:
cd ~/catkin_ws/src
git clone https://github.com/catkin/catkin_simple.git
git clone https://github.com/uzh-rpg/rpg_dvs_ros.git
git clone https://github.com/ethz-asl/gflags_catkin.git
git clone https://github.com/ethz-asl/glog_catkin.git
git clone https://github.com/ethz-asl/minkindr.git
git clone https://github.com/ethz-asl/eigen_catkin.git
git clone https://github.com/ethz-asl/eigen_checks.git
git clone https://github.com/ethz-asl/minkindr_ros.git
git clone https://github.com/ethz-asl/catkin_boost_python_buildtool.git
git clone https://github.com/ethz-asl/numpy_eigen.git
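After cloning the dependencies, you can optionally let rosdep resolve any remaining system packages (a hedged sketch; it assumes rosdep was initialized during the ROS installation):
cd ~/catkin_ws
rosdep install --from-paths src --ignore-src -r -y   # installs missing system dependencies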
If you don't have yaml-cpp, please install it as follows. If you already have the yaml-cpp library, do not install it again, as doing so can cause version conflicts.
# if you don't have yaml-cpp
cd ~/catkin_ws/src
git clone https://github.com/jbeder/yaml-cpp.git
cd yaml-cpp
mkdir build && cd build && cmake -DYAML_BUILD_SHARED_LIBS=ON ..
make -j
Finally, build the workspace.
cd ~/catkin_ws
catkin_make
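The pipeline is computationally demanding, so you may want an optimized build; catkin_make forwards CMake arguments, so a Release build can optionally be requested as follows:
cd ~/catkin_ws
catkin_make -DCMAKE_BUILD_TYPE=Release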
Compared with ESVO, we have accelerated the generation of the time surface. You can run it using the following commands.
cd ~/catkin_ws
source devel/setup.bash
roslaunch esvio_image_representation esvio_image_representation_stereo_AA.launch
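To check that the image representation node is up, you can list and visualize its output topics from a second terminal; this is only a sketch, and the actual topic names are defined in esvio_image_representation_stereo_AA.launch:
rostopic list        # look for the time-surface / image representation topics
rqt_image_view       # select the published image topic to visualize it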
Since the DSEC dataset does not provide rosbags containing the event and IMU data, we repackage the required data as input to the system.
You can obtain some of the rosbags we repacked via the zurich_city_04a_download link and the zurich_city_04b_download link. If you need repacked bags for other sequences of zurich_city_04 or zurich_city_11, please contact us.
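Once downloaded, you can inspect a repacked bag to confirm it contains the stereo event and IMU topics (the filename below is a placeholder for whichever sequence you downloaded):
rosbag info zurich_city_04_a.bag   # placeholder filename; prints topics, message counts and duration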
After obtaining the repacked data, you can run the system using the following commands.
cd ~/catkin_ws
source devel/setup.bash
roslaunch esvo_core system_dsec.launch
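If the launch file does not already play the data for you, you can feed the repacked bag from a second terminal (a sketch; the filename is a placeholder, and any required topic remappings are defined in system_dsec.launch):
# in a second terminal
rosbag play zurich_city_04_a.bag   # placeholder filename; add -r <rate> or --pause if needed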
We welcome comparative evaluation before this project is open-sourced. The raw trajectories estimated by ESVO (block matching with 10,000 events) and ESVIO_AA (block matching with 5,000 events) on the DSEC dataset have been uploaded.
Among them, stamped_groundtruth_alignment.txt is the ground-truth trajectory output by the LiDAR algorithm, rotated into the camera frame.
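For a quick comparison against these files, one option is the evo trajectory evaluation toolbox (pip install evo); the sketch below assumes the trajectories are in TUM format and uses a placeholder name for the estimated trajectory file:
pip install evo
evo_ape tum stamped_groundtruth_alignment.txt stamped_traj_estimate.txt -a --plot   # estimate filename is a placeholder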
For questions or inquiries, please feel free to contact us at JunkaiNiu@hnu.edu.cn or eeyzhou@hnu.edu.cn.
We appreciate your interest in our work!