LoFTR

Code for "LoFTR: Detector-Free Local Feature Matching with Transformers", CVPR 2021

LoFTR: Detector-Free Local Feature Matching with Transformers
Jiaming Sun*, Zehong Shen*, Yu'ang Wang*, Hujun Bao, Xiaowei Zhou
CVPR 2021

(demo GIF: loftr-github-demo)

TODO List and ETA

The entire codebase for data pre-processing, training and validation is under major refactoring and will be released around June.
Please subscribe to this discussion thread if you wish to be notified of the code release.
In the meantime, discussion about the paper is welcome in the discussion panel.

  • [x] Inference code and pretrained models (DS and OT) (2021-4-7)
  • [x] Code for reproducing the test-set results (2021-4-7)
  • [ ] Webcam demo to reproduce the result shown in the GIF above (expected 2021-4-13)
  • [ ] Training code and training data preparation (expected 2021-6-10)

Installation

# For full pytorch-lightning trainer features
conda env create -f environment.yaml
conda activate loftr

# For the LoFTR matcher only
pip install torch einops yacs kornia
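
If you only installed the matcher dependencies via pip, a quick import check (a convenience sketch, not part of the official setup) confirms that the environment is usable:

# Sanity check for the minimal matcher-only install (convenience snippet only)
import torch, einops, yacs, kornia

print("torch:", torch.__version__, "| CUDA available:", torch.cuda.is_available())
print("kornia:", kornia.__version__)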

We provide download links to

  • the scannet-1500-testset (~1GB).
  • the megadepth-1500-testset (~600MB).
  • 4 pretrained models of indoor-ds, indoor-ot, outdoor-ds and outdoor-ot (each ~45MB).

At this point, the LoFTR-DS model is ready to go!
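
The commands further below load the checkpoints from a weights/ directory (e.g. weights/indoor_ds.ckpt). The following sketch checks that they are in place; checkpoint file names other than indoor_ds.ckpt are assumptions inferred from the four models listed above:

# Check that the downloaded checkpoints sit where the example commands below expect them.
# File names other than indoor_ds.ckpt are assumptions inferred from the model list above.
from pathlib import Path

for name in ["indoor_ds.ckpt", "indoor_ot.ckpt", "outdoor_ds.ckpt", "outdoor_ot.ckpt"]:
    ckpt = Path("weights") / name
    print(f"{ckpt}: {'found' if ckpt.exists() else 'missing'}")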

Requirements for LoFTR-OT

We use the code from SuperGluePretrainedNetwork for optimal transport. However, we cannot ship that code directly due to its strict license, so we recommend downloading it with the following commands instead.

cd src/loftr/utils  
wget https://raw.githubusercontent.com/magicleap/SuperGluePretrainedNetwork/master/models/superglue.py 

Run the code

Match image pairs with LoFTR


import torch

from src.loftr import LoFTR, default_cfg

# Initialize LoFTR
matcher = LoFTR(config=default_cfg)
matcher.load_state_dict(torch.load("weights/indoor_ds.ckpt")['state_dict'])
matcher = matcher.eval().cuda()

# Inference
with torch.no_grad():
    matcher(batch)    # batch = {'image0': img0, 'image1': img1}
    mkpts0 = batch['mkpts0_f'].cpu().numpy()
    mkpts1 = batch['mkpts1_f'].cpu().numpy()

An example is given in notebooks/demo_single_pair.ipynb.
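
The snippet above assumes img0 and img1 are grayscale images given as (1, 1, H, W) float tensors in [0, 1] on the same device as the matcher, with H and W divisible by 8 (LoFTR matches at 1/8 resolution in the coarse stage). Below is a minimal sketch of one way to build such a batch; the file paths and the 640x480 resize are placeholders, so see the demo notebook for the exact preprocessing:

# A sketch of one way to prepare an input batch for the snippet above.
# LoFTR expects grayscale images as (1, 1, H, W) float tensors in [0, 1];
# the file paths and the 640x480 resize are placeholders.
import cv2
import torch

def load_image(path, size=(640, 480)):
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)   # single-channel uint8
    img = cv2.resize(img, size)                    # (width, height), multiples of 8
    return torch.from_numpy(img)[None][None].float().cuda() / 255.

batch = {'image0': load_image("assets/img0.png"),
         'image1': load_image("assets/img1.png")}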

Reproduce the testing results with pytorch-lightning

conda activate loftr
# with shell script
bash ./scripts/reproduce_test/indoor_ds.sh

# or
python test.py configs/data/scannet_test_1500.py configs/loftr/loftr_ds.py --ckpt_path weights/indoor_ds.ckpt --profiler_name inference --gpus=1 --accelerator="ddp"

To visualize the dumped results, please refer to notebooks/visualize_dump_results.ipynb.

Citation

If you find this code useful for your research, please use the following BibTeX entry.

@article{sun2021loftr,
  title={{LoFTR}: Detector-Free Local Feature Matching with Transformers},
  author={Sun, Jiaming and Shen, Zehong and Wang, Yuang and Bao, Hujun and Zhou, Xiaowei},
  journal={CVPR},
  year={2021}
}

GitHub

https://github.com/zju3dv/LoFTR