
RoMa 🏛️:
Robust Dense Feature Matching
⭐CVPR 2024⭐

Johan Edstedt · Qiyu Sun · Georg Bökman · Mårten Wadenbäck · Michael Felsberg


RoMa is the robust dense feature matcher capable of estimating pixel-dense warps and reliable certainties for almost any image pair.

Setup/Install

In your Python environment (tested on Linux with Python 3.10), run:

pip install -e .

Demo / How to Use

We provide two demos in the demos folder. Here's the gist of it:

import cv2
import torch
from PIL import Image
from roma import roma_outdoor

device = "cuda" if torch.cuda.is_available() else "cpu"
roma_model = roma_outdoor(device=device)
# Paths to the two images to match (placeholders; point these at your own images)
imA_path, imB_path = "assets/imA.jpg", "assets/imB.jpg"
W_A, H_A = Image.open(imA_path).size
W_B, H_B = Image.open(imB_path).size
# Match
warp, certainty = roma_model.match(imA_path, imB_path, device=device)
# Sample matches for estimation
matches, certainty = roma_model.sample(warp, certainty)
# Convert to pixel coordinates (RoMa produces matches in [-1,1]x[-1,1])
kptsA, kptsB = roma_model.to_pixel_coordinates(matches, H_A, W_A, H_B, W_B)
# Find a fundamental matrix (or anything else of interest)
F, mask = cv2.findFundamentalMat(
    kptsA.cpu().numpy(), kptsB.cpu().numpy(), ransacReprojThreshold=0.2,
    method=cv2.USAC_MAGSAC, confidence=0.999999, maxIters=10000
)
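
The last RANSAC step is interchangeable: kptsA and kptsB are ordinary pixel correspondences, so any robust estimator can be plugged in. As a minimal sketch (reusing the kptsA and kptsB tensors from above, and assuming a roughly planar scene), a homography can be estimated instead:

H, mask = cv2.findHomography(
    kptsA.cpu().numpy(), kptsB.cpu().numpy(), method=cv2.USAC_MAGSAC, ransacReprojThreshold=0.2
)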

New: You can also match arbitrary keypoints with RoMa. A demo for this will be added soon.

Settings

Resolution

By default, RoMa uses an initial (coarse) resolution of (560, 560), which is then upsampled to (864, 864). You can change this at construction (see the roma_outdoor kwargs), or afterwards by setting roma_model.w_resized, roma_model.h_resized, and roma_model.upsample_res.
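
As a minimal sketch (the attribute names are the ones listed above; the values are only illustrative, and kwarg names at construction should be checked against the roma_outdoor signature):

# Reduce the coarse matching resolution and the upsampling resolution,
# e.g. to trade some accuracy for speed and memory.
roma_model.w_resized = 448
roma_model.h_resized = 448
roma_model.upsample_res = (672, 672)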

Sampling

roma_model.sample_thresh controls the thresholding used when sampling matches for estimation. In certain cases a lower or higher threshold may improve results.
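
For example (a minimal sketch; the attribute name is the one given above, and the value 0.05 is only an illustrative choice):

# Lower the certainty threshold used when sampling matches for estimation.
roma_model.sample_thresh = 0.05
matches, certainty = roma_model.sample(warp, certainty)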

Reproducing Results

The experiments in the paper are provided in the experiments folder.

Training

  1. First, follow the instructions at https://github.com/Parskatt/DKM for downloading and preprocessing the datasets.
  2. Run the relevant experiment, e.g.,
torchrun --nproc_per_node=4 --nnodes=1 --rdzv_backend=c10d experiments/roma_outdoor.py

Testing

python experiments/roma_outdoor.py --only_test --benchmark mega-1500

License

All our code, except DINOv2, is under the MIT license. DINOv2 is under the Apache 2.0 license.

Acknowledgement

Our codebase builds on the code in DKM.

BibTeX

If you find our models useful, please consider citing our paper!

@article{edstedt2024roma,
  title   = {{RoMa: Robust Dense Feature Matching}},
  author  = {Edstedt, Johan and Sun, Qiyu and Bökman, Georg and Wadenbäck, Mårten and Felsberg, Michael},
  journal = {IEEE Conference on Computer Vision and Pattern Recognition},
  year    = {2024}
}
