
REMODE assumes that the pose of the camera is known.

Semi-Direct Monocular Visual Odometry (SVO) can be used to estimate the pose of a calibrated camera from a live image stream. The approach is described in the following paper:

http://rpg.ifi.uzh.ch/docs/ICRA14_Forster.pdf

An open source implementation is available here:

https://github.com/pizzoli/rpg_svo.git

Please refer to the wiki at https://github.com/uzh-rpg/rpg_svo/wiki for instructions on how to build and run the code.

NOTE: please make sure you read the license conditions accompanying the source code.

The ROS node running the REMODE pipeline subscribes to the topic

/svo/dense_input

You can learn the details about the message format from the files depthmap_node.cpp and test/publish_dataset.cpp.
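If you need to feed REMODE from your own pose source instead of SVO, the sketch below shows roughly what a publisher of this topic could look like. It is only a sketch: the message type (assumed here to be svo_msgs/DenseInput) and its fields are assumptions, so verify them against the two source files above.

#!/usr/bin/env python
# Sketch of a publisher for REMODE's dense input topic.
# ASSUMPTION: the message type is svo_msgs/DenseInput with image, pose,
# min_depth and max_depth fields; check depthmap_node.cpp and
# test/publish_dataset.cpp for the actual layout.
import rospy
from cv_bridge import CvBridge
from svo_msgs.msg import DenseInput

def make_msg(bridge, gray_image, pose, min_depth, max_depth):
    msg = DenseInput()
    msg.header.stamp = rospy.Time.now()
    msg.image = bridge.cv2_to_imgmsg(gray_image, encoding='mono8')  # grayscale camera frame
    msg.pose = pose              # camera pose in the world frame (geometry_msgs/Pose)
    msg.min_depth = min_depth    # rough scene depth range used to bootstrap the estimation
    msg.max_depth = max_depth
    return msg

if __name__ == '__main__':
    rospy.init_node('dense_input_publisher')
    pub = rospy.Publisher('/svo/dense_input', DenseInput, queue_size=10)
    bridge = CvBridge()
    # Fill in gray_image (uint8 numpy array) and pose from your own source, then:
    # pub.publish(make_msg(bridge, gray_image, pose, 0.5, 10.0))
    rospy.spin()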

SVO publishes the /svo/dense_input topic if you specify the following in the SVO launch file:

<!-- Dense Input -->
<param name="publish_dense_input" value="True" />
<param name="publish_every_nth_dense_input" value="5" />

The latter parameter controls how often the /svo/dense_input topic is published; in this case, once every five processed frames.
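Once SVO is running, you can check that dense input messages are actually being produced at the expected rate:

rostopic hz /svo/dense_input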

Start the camera stream, run and initialize SVO (please refer to the SVO wiki https://github.com/uzh-rpg/rpg_svo/wiki).

Launch the REMODE node using a launch file containing the camera calibration parameters. For instance:

roslaunch rpg_open_remode bluefox_25001262.launch

You can refer to the file launch/bluefox_25001262.launch for an example launch file. The following camera parameters are required to execute REMODE; please replace the values with the result of your own camera calibration. The parameters are consistent with the OpenCV format described at http://docs.opencv.org/2.4/doc/tutorials/calib3d/camera_calibration/camera_calibration.html (see the sketch after the parameter list for how they map onto the OpenCV camera matrix and distortion vector):

<param name="cam_width" value="752" />
<param name="cam_height" value="480" />

<param name="cam_fx" value="350.3755718120993" />
<param name="cam_cx" value="398.2742873637509" />
<param name="cam_fy" value="349.15163791825313" />
<param name="cam_cy" value="247.7956472959842" />

<param name="cam_k1" value="-0.24099444659398922" />
<param name="cam_k2" value="0.04694396667448327" />
<param name="cam_r1" value="0.0007325506238530427" />
<param name="cam_r2" value="0.00044374959782553903" />

The parameter

<param name="ref_compl_perc" value="30.0" />

controls when a new reference frame is taken: in this example, once the number of converged measurements reaches at least 30% of the image pixels (about 108,000 pixels for a 752x480 image), the REMODE node publishes the current depth estimate and takes a new reference image.

The REMODE node publishes the resulting depth map as a 32FC1 image. You can visualize it by running image_view and subscribing to the /remode/depth topic:

rosrun image_view image_view image:=/remode/depth
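Because the depth map is published as 32-bit floating point values, the raw image can be hard to read in image_view. A minimal viewer sketch (a hypothetical helper node, not part of REMODE) that normalizes the depth map for display could look like this:

#!/usr/bin/env python
# Hypothetical viewer node: normalize REMODE's 32FC1 depth map for display.
import rospy
import cv2
import numpy as np
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

bridge = CvBridge()

def callback(msg):
    depth = bridge.imgmsg_to_cv2(msg, desired_encoding='32FC1')
    valid = np.isfinite(depth) & (depth > 0.0)
    if not valid.any():
        return
    # Scale the valid depths into [0, 255] for display purposes only.
    d = depth[valid]
    vis = np.zeros_like(depth)
    vis[valid] = (d - d.min()) / (d.max() - d.min() + 1e-6)
    cv2.imshow('remode depth (normalized)', (vis * 255).astype(np.uint8))
    cv2.waitKey(1)

if __name__ == '__main__':
    rospy.init_node('remode_depth_viewer')
    rospy.Subscriber('/remode/depth', Image, callback)
    rospy.spin()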

A point cloud is also published for every reference frame. You can visualize it in RVIZ by subscribing to the topic /remode/pointcloud. The file open_remode.rviz provides a simple RVIZ configuration. Run RVIZ as follows:

rosrun rviz rviz -d $MY_CATKIN_WORKSPACE/src/rpg_open_remode/open_remode.rviz
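If you want to process the point cloud outside RVIZ, a minimal listener sketch (assuming the topic carries sensor_msgs/PointCloud2 messages with x, y, z fields) could look like this:

#!/usr/bin/env python
# Minimal listener for the REMODE point cloud.
# ASSUMPTION: /remode/pointcloud carries sensor_msgs/PointCloud2 with x, y, z fields.
import rospy
from sensor_msgs.msg import PointCloud2
import sensor_msgs.point_cloud2 as pc2

def callback(msg):
    points = list(pc2.read_points(msg, field_names=('x', 'y', 'z'), skip_nans=True))
    rospy.loginfo('Received point cloud with %d points', len(points))

if __name__ == '__main__':
    rospy.init_node('remode_cloud_listener')
    rospy.Subscriber('/remode/pointcloud', PointCloud2, callback)
    rospy.spin()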

Example: run the SVO + REMODE pipeline on a ROS bag file

Download the dataset from http://rpg.ifi.uzh.ch/datasets/px4_2_fly_over_debris.bag

Launch SVO with the proper parameters

NOTE: the parameters are available from the SVO fork https://github.com/pizzoli/rpg_svo.git

roslaunch svo_ros px4_2.launch

Launch REMODE

roslaunch rpg_open_remode px4_2.launch

Run RVIZ

rosrun rviz rviz -d $MY_CATKIN_WORKSPACE/src/rpg_open_remode/open_remode.rviz

Play the dataset in the px4_2_fly_over_debris.bag file (the test bag file you just downloaded).
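For example:

rosbag play px4_2_fly_over_debris.bag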

You should be able to see in RVIZ the images from the camera (mounted down-looking on a micro aerial vehicle), the image features tracked by SVO to estimate the camera pose, the depth maps computed by REMODE and the accumulated point cloud.

PX4-2 flying over debris

The topic /remode/convergence shows the convergence status for the reference image: blue pixels identify converged measurements. You can display it by executing

rosrun image_view image_view image:=/remode/convergence