Code for "LGFCTR: Local and Global Feature Convolutional Transformer for Image Matching"
Since the preprocessed undistorted MegaDepth dataset provided by D2-Net is no longer available, we use the original MegaDepth dataset.
Download and unzip the MegaDepth indices following LoFTR.
Symlink the datasets and indices to the data directory following LoFTR.
We provide the outdoor weights of LGFCTR on Google Drive.
Please follow LoFTR to prepare the environment.
Alternatively, you can install the requirements yourself. Specifically, we use pytorch==1.9.1+cu111, pytorch-lightning==1.3.5, opencv-python==4.5.5.64, torchmetrics==0.6.0 and kornia==0.6.11. The remaining requirements can be installed easily via pip.
conda env create -f environment.yaml
conda activate lgfctr
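If you prefer pip over conda, the pinned versions above can be installed roughly as follows; the PyTorch wheel index URL is our assumption for cu111 builds, so adjust it to your CUDA setup.

pip install torch==1.9.1+cu111 -f https://download.pytorch.org/whl/torch_stable.html
pip install pytorch-lightning==1.3.5 opencv-python==4.5.5.64 torchmetrics==0.6.0 kornia==0.6.11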
You can reproduce the training by
sh scripts/reproduce_train/outdoor_ds.sh
You can reproduce the evaluation on the MegaDepth dataset by
sh scripts/reproduce_test/outdoor_ds.sh
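For orientation, LoFTR-style MegaDepth evaluation typically reports the AUC of the pose error at 5°, 10° and 20° thresholds. Below is a minimal sketch of that metric, assuming the same convention as LoFTR's error_auc; it is an illustration, not this repository's exact code.

import numpy as np

def error_auc(errors, thresholds=(5, 10, 20)):
    # AUC of the recall-vs-pose-error curve, cut off at each threshold (degrees).
    errors = np.concatenate(([0.0], np.sort(np.asarray(errors, dtype=float))))
    recall = np.linspace(0, 1, len(errors))
    aucs = {}
    for thr in thresholds:
        idx = np.searchsorted(errors, thr)
        y = np.concatenate((recall[:idx], [recall[idx - 1]]))
        x = np.concatenate((errors[:idx], [thr]))
        aucs[f'auc@{thr}'] = np.trapz(y, x) / thr
    return aucs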
We provide a demo for visualizing a single pair of images. You can specify img_path0 and img_path1 for your images, save_dir for your save directory, topk for the number of matches shown, img_resize for the longer dimension of the resized images, and is_original for outputting the original images as well.
cd vis
python vis_single_pair.py --img_path0 your_img_path0 --img_path1 your_img_path1 --save_dir your_save_dir --topk 1000 --img_resize 640 --is_original True
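Conceptually, the demo's output resembles the following sketch, which draws the top-k most confident matches with OpenCV. The images and match arrays here are random placeholders standing in for a LoFTR-style matcher's output, not this repository's internals.

import cv2
import numpy as np

# Stand-in images; replace with cv2.imread(your_img_path0) etc. for real data.
img0 = np.random.randint(0, 255, (480, 640, 3), dtype=np.uint8)
img1 = np.random.randint(0, 255, (480, 640, 3), dtype=np.uint8)

# Placeholder matches: mkpts0/mkpts1 are (N, 2) matched coordinates and
# mconf their confidences (N,), as a LoFTR-style matcher would return.
N = 2000
mkpts0 = np.random.rand(N, 2) * (img0.shape[1], img0.shape[0])
mkpts1 = np.random.rand(N, 2) * (img1.shape[1], img1.shape[0])
mconf = np.random.rand(N)

# Keep the top-k most confident matches, mirroring the demo's topk argument.
order = np.argsort(-mconf)[:1000]
kpts0 = [cv2.KeyPoint(float(x), float(y), 1) for x, y in mkpts0[order]]
kpts1 = [cv2.KeyPoint(float(x), float(y), 1) for x, y in mkpts1[order]]
matches = [cv2.DMatch(i, i, 0) for i in range(len(kpts0))]

vis = cv2.drawMatches(img0, kpts0, img1, kpts1, matches, None)
cv2.imwrite('matches.jpg', vis)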
We provide a demo for visualizing multi-scale attention weights of a single pair of images. Besides the arguments mentioned above, you can specify dpi for the DPI of the outputs, and edit Line 41 of vis_attention.py to choose which resolution index and CTR module to visualize.
python vis_attention.py
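For orientation, attention weights are typically visualized as heatmaps over image coordinates. Below is a minimal sketch of that idea; the tensor shape and variable names are assumptions for illustration, not this script's actual internals.

import matplotlib.pyplot as plt
import numpy as np

# Suppose attn is an (H*W, H*W) attention map from one transformer layer
# at a given resolution, so attn[q] is the attention of query pixel q
# over all positions. Random data stands in for real weights here.
H, W = 60, 80
attn = np.random.rand(H * W, H * W)
attn /= attn.sum(axis=-1, keepdims=True)  # normalize rows like a softmax

q = (H // 2) * W + (W // 2)  # pick the central query pixel
heat = attn[q].reshape(H, W)

plt.imshow(heat, cmap='jet')
plt.colorbar()
plt.title('Attention of the central query over the image')
plt.savefig('attention.png', dpi=300)  # dpi mirrors the demo's dpi argument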
This repository was developed from LoFTR, and we are grateful for their implementation.
If you find this code useful for your research, please use the following BibTeX entry.
@article{zhong2023lgfctr,
  title={LGFCTR: Local and Global Feature Convolutional Transformer for Image Matching},
  author={Zhong, Wenhao and Jiang, Jie},
  journal={arXiv preprint arXiv:2311.17571},
  year={2023}
}