Revisiting Disparity from Dual-Pixel Images: Physics-Informed Lightweight Depth Estimation (WACV 2025)
Teppei Kurita, Yuhi Kondo, Legong Sun, Takayuki Sasaki, Sho Nitta, Yasuhiro Hashimoto, Yoshinori Muramatsu and Yusuke Moriuchi
Sony Semiconductor Solutions Corporation, Tokyo, Japan
paper (arXiv) | dataset
The dependencies required to run this project are as follows:
- Python 3.7+
- PyTorch 1.9.1
- CUDA 11.1
To build the Docker image and use the program, follow these steps:
- Build the Docker image:
docker build -t dual-pixel-disparity:latest .
- Run the Docker container:
docker run --gpus all -v $(pwd)/data:/workspace/data -it dual-pixel-disparity:latest
- Inside the Docker container, use the same training and evaluation commands described below.
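The build and run steps above can be wrapped in a small helper script. This is only a sketch under the assumptions in the commands above (image tag `dual-pixel-disparity:latest`, data mounted at `/workspace/data`); the `--gpus all` flag additionally requires the NVIDIA Container Toolkit on the host.

```shell
#!/bin/sh
# Sketch: build the image and print the run command for inspection.
IMAGE="dual-pixel-disparity:latest"

docker_cmd() {
    # Print the run command (dry run); pipe to sh to actually execute it.
    echo docker run --gpus all -v "$(pwd)/data:/workspace/data" -it "$IMAGE"
}

# docker build -t "$IMAGE" .   # uncomment to build the image first
docker_cmd
```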
An example of the project's directory structure is as follows:
./dual-pixel-disparity/
./data/
├── models/
│ ├── costdcnet-based.tar
│ └── nlspn-based.tar
├── results/
├── dataset/
│ ├── Punnappurath_ICCP2020/
│ │ └── test/
│ ├── SEDC/
│ │ ├── train/
│ │ └── val/
The Punnappurath_ICCP2020 dataset can be downloaded from the following link: dual-pixel-defocus-disparity
Our Synthetic Edge Depth Completion Dataset (SEDC Dataset) can be downloaded from the following link: SEDC Dataset
*Access to the data requires a Microsoft account. After creating your Microsoft account, please contact us with your Microsoft email address to be granted access. Access is revoked after a certain period of time, and your account information is not retained.
Our pretrained models can be downloaded from the following links.
*Access to the data requires a Microsoft account. After creating your Microsoft account, please contact us with your Microsoft email address to be granted access. Access is revoked after a certain period of time, and your account information is not retained.
The latest models, costdcnet-based_250127.tar and nlspn-based_250127.tar, improve on the quantitative evaluation results reported in the paper.
Use the following command to start training the model:
python main.py --data-type ed --depth-to-phase --add-phase-noise --network-model c --network-variant costdcnet --criterion l1c --epochs 50 --batch-size 8 --data-folder ../data/dataset/SEDC/ --result ../data/results/
Use the following command to evaluate an existing model:
python main.py --evaluate ../data/models/costdcnet-based.tar --data-type cdp --network-model c --network-variant costdcnet --epochs 50 --batch-size 4 --data-folder ../data/dataset/Punnappurath_ICCP2020/ --result ../data/results/ --vis-depth-min 0.0 --vis-depth-max 3.0 --vis-phase-min -8.0 --vis-phase-max 5.0 --test-with-gt --lowres-phase --lowres-pscale 0.5 --lowres-cnn --lowres-scale 0.5 --post-process --post-refine wfgs --wfgs-conf --wfgs-prefill
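Since two pretrained models are provided, evaluation can be scripted over both. This is a hedged sketch with an abbreviated flag set: the README only shows the full CostDCNet invocation, so whether the NLSPN model takes the same `--network-variant` value and flags is an assumption to adjust for your setup. Commands are printed rather than executed.

```shell
#!/bin/sh
# Sketch: print the evaluation command for each pretrained model (dry run).
# NOTE: the nlspn flag set is an assumption; only the costdcnet invocation
# is documented above. Extra flags (--vis-*, --post-process, ...) omitted.
eval_cmd() {
    variant="$1"
    echo "python main.py --evaluate ../data/models/${variant}-based.tar" \
         "--data-type cdp --network-model c --network-variant ${variant}" \
         "--batch-size 4 --data-folder ../data/dataset/Punnappurath_ICCP2020/" \
         "--result ../data/results/"
}

for v in costdcnet nlspn; do
    eval_cmd "$v"          # pipe to sh to actually run the commands
done
```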
For evaluation, please place dp_matching in ./dual-pixel-disparity/utils/. It can be downloaded from the following link: dp_matching
*Access to the data requires a Microsoft account. After creating your Microsoft account, please contact us with your Microsoft email address to be granted access. Access is revoked after a certain period of time, and your account information is not retained.
dp_matching is an executable file.
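Because dp_matching is distributed as a binary, on Linux it may need the execute bit set after download. A minimal sketch (the download itself is manual, `install_dp_matching` is a hypothetical helper, and the destination path follows the instruction above; run from the dual-pixel-disparity repository root):

```shell
#!/bin/sh
# Sketch: copy the manually downloaded dp_matching binary into utils/
# and mark it executable. install_dp_matching is a hypothetical helper.
install_dp_matching() {
    # $1: path to the downloaded dp_matching file
    mkdir -p utils
    cp "$1" utils/dp_matching
    chmod +x utils/dp_matching
}

# install_dp_matching ~/Downloads/dp_matching   # adjust to your download path
```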
This software is released under the MIT License. See LICENSE for details.
@InProceedings{Kurita_2025_WACV,
author = {Kurita, Teppei and Kondo, Yuhi and Sun, Legong and Sasaki, Takayuki and Nitta, Sho and Hashimoto, Yasuhiro and Muramatsu, Yoshinori and Moriuchi, Yusuke},
title = {Revisiting Disparity from Dual-Pixel Images: Physics-Informed Lightweight Depth Estimation},
booktitle = {Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)},
month = {February},
year = {2025},
}