This is the official implementation for testing depth estimation with the model proposed in:

R-MSFM: Recurrent Multi-Scale Feature Modulation for Monocular Depth Estimation

Zhongkai Zhou, Xinnan Fan, Pengfei Shi, Yuanxue Xin
R-MSFM can estimate a depth map from a single image.
The paper is available at ICCV 2021.

We will release the training code in the future; if you run into any problems before then, please contact us. You can also refer to this project for training code, which achieves better results than ours. Thanks for the nice code!
The extended paper is now available at T-PAMI 2024. In it, we improve R-MSFM in two ways and achieve SOTA results:
- We propose R-MSFMX, another lightweight convolutional network that evolves from R-MSFM to better address depth estimation. R-MSFMX takes the first three blocks of ResNet50 instead of the ResNet18 blocks used in R-MSFM, further improving depth accuracy.
- We promote geometry-consistent depth learning for both R-MSFM and R-MSFMX, which suppresses depth artifacts at object borders and thus produces more consistent depth. Models that perform geometry-consistent depth estimation are denoted by the postfix (GC).
We show the superiority of our R-MSFMX-GC as follows: the rows (from top to bottom) are the RGB images and the results of Monodepth2, R-MSFM6, and the improved R-MSFMX6-GC.
We have updated all the results: results

We have updated all the pretrained models: models
You can predict scaled disparity for a single image using R-MSFM3 with:

```shell
python test_simple.py --image_path='path_to_image' --model_path='path_to_model' --update=3
```

or R-MSFMX3 with:

```shell
python test_simple.py --image_path='path_to_image' --model_path='path_to_model' --update=3 --x
```

or R-MSFM6 with:

```shell
python test_simple.py --image_path='path_to_image' --model_path='path_to_model' --update=6
```

or R-MSFMX6 with:

```shell
python test_simple.py --image_path='path_to_image' --model_path='path_to_model' --update=6 --x
```
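To run the script over a whole folder of images, the documented flags can be wrapped in a small driver. This is a sketch: `build_command` and `predict_folder` are hypothetical helpers, and only the flags shown above (`--image_path`, `--model_path`, `--update`, `--x`) are assumed to exist.

```python
import subprocess
from pathlib import Path

def build_command(image_path, model_path, updates=3, use_x=False):
    # Assemble a test_simple.py invocation from the documented flags.
    cmd = [
        "python", "test_simple.py",
        "--image_path", str(image_path),
        "--model_path", str(model_path),
        "--update", str(updates),
    ]
    if use_x:
        cmd.append("--x")  # select the R-MSFMX variant instead of R-MSFM
    return cmd

def predict_folder(image_dir, model_path, updates=6, use_x=True):
    # Invoke the script once per .png image in the folder.
    for image in sorted(Path(image_dir).glob("*.png")):
        subprocess.run(build_command(image, model_path, updates, use_x),
                       check=True)
```

For example, `predict_folder("kitti_samples", "R-MSFMX6.pth")` would run R-MSFMX6 over every PNG in `kitti_samples` (paths here are illustrative).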
The code is based on RAFT and Monodepth2; please also follow their licenses. Thanks for their great work.