
English | 简体中文 | 日本語


LightGlue ONNX

Open Neural Network Exchange (ONNX) compatible implementation of LightGlue: Local Feature Matching at Light Speed. The ONNX model format allows for interoperability across different platforms with support for multiple execution providers, and removes Python-specific dependencies such as PyTorch. Supports TensorRT and OpenVINO.
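
As a small illustration of the execution-provider flexibility, the sketch below creates an ONNX Runtime session that prefers TensorRT, then CUDA, and finally falls back to the CPU. The model path matches the export example further down; which providers are actually available depends on how your onnxruntime build was compiled, so treat the list as an assumption rather than a requirement.

import onnxruntime as ort

# Hypothetical path; the export command below shows how this file is produced.
MODEL_PATH = "weights/superpoint_lightglue_pipeline.onnx"

# ONNX Runtime skips providers that are unavailable in the installed build,
# so this list degrades gracefully: TensorRT -> CUDA -> CPU.
providers = [
    "TensorrtExecutionProvider",
    "CUDAExecutionProvider",
    "CPUExecutionProvider",
]

session = ort.InferenceSession(MODEL_PATH, providers=providers)
print([i.name for i in session.get_inputs()])   # expected input tensors
print([o.name for o in session.get_outputs()])  # produced output tensors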

What's New: End-to-end parallel dynamic batch size support. Read more in this blog post.

⏱️ Inference Time Comparison

[Figure: LightGlue-ONNX latency comparison]

Changelog
  • 17 July 2024: End-to-end parallel dynamic batch size support. Revamp script UX. Add blog post.
  • 02 November 2023: Introduce TopK-trick to optimize out ArgMax for about 30% speedup.
  • 27 October 2023: LightGlue-ONNX added to Kornia!
  • 04 October 2023: Fused LightGlue ONNX Models with support for FlashAttention-2 via onnxruntime>=1.16.0, up to 80% faster inference on long sequence lengths (number of keypoints).
  • 04 October 2023: Multihead-attention fusion optimization.
  • 19 July 2023: Add support for TensorRT.
  • 13 July 2023: Add support for Flash Attention.
  • 11 July 2023: Add support for mixed precision.
  • 04 July 2023: Add inference time comparisons.
  • 01 July 2023: Add support for extractor max_num_keypoints.
  • 30 June 2023: Add support for DISK extractor.
  • 28 June 2023: Add end-to-end SuperPoint+LightGlue export & inference pipeline.

⭐ ONNX Export & Inference

We provide a Typer-based CLI, dynamo.py, for exporting LightGlue to ONNX and running inference with ONNX Runtime. If you would like to try inference right away, you can download already-exported ONNX models here.

$ python dynamo.py --help

Usage: dynamo.py [OPTIONS] COMMAND [ARGS]...

LightGlue Dynamo CLI

╭─ Commands ────────────────────────────────────────╮
│ export   Export LightGlue to ONNX.                │
│ infer    Run inference for LightGlue ONNX model.  │
│ trtexec  Run pure TensorRT inference using        │
│          Polygraphy.                              │
╰───────────────────────────────────────────────────╯

Pass --help to see the available options for each command. The CLI will export the full extractor-matcher pipeline so that you don't have to worry about orchestrating intermediate steps.

📖 Example Commands

🔥 ONNX Export
python dynamo.py export superpoint \
  --num-keypoints 1024 \
  -b 2 -h 1024 -w 1024 \
  -o weights/superpoint_lightglue_pipeline.onnx
⚡ ONNX Runtime Inference (CUDA)
python dynamo.py infer \
  weights/superpoint_lightglue_pipeline.onnx \
  assets/sacre_coeur1.jpg assets/sacre_coeur2.jpg \
  superpoint \
  -h 1024 -w 1024 \
  -d cuda
🚀 ONNX Runtime Inference (TensorRT)
python dynamo.py infer \
  weights/superpoint_lightglue_pipeline.trt.onnx \
  assets/sacre_coeur1.jpg assets/sacre_coeur2.jpg \
  superpoint \
  -h 1024 -w 1024 \
  -d tensorrt --fp16
🧩 TensorRT Inference
python dynamo.py trtexec \
  weights/superpoint_lightglue_pipeline.trt.onnx \
  assets/sacre_coeur1.jpg assets/sacre_coeur2.jpg \
  superpoint \
  -h 1024 -w 1024 \
  --fp16
🟣 ONNX Runtime Inference (OpenVINO)
python dynamo.py infer \
  weights/superpoint_lightglue_pipeline.onnx \
  assets/sacre_coeur1.jpg assets/sacre_coeur2.jpg \
  superpoint \
  -h 512 -w 512 \
  -d openvino
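
If you would rather call the exported pipeline directly instead of going through the CLI, a minimal ONNX Runtime sketch could look like the following. The preprocessing (grayscale, resize to the export resolution, scaling to [0, 1]) and the assumption that the two images are stacked along the batch axis are illustrative guesses based on the -b 2 export above; inspect session.get_inputs()/get_outputs() or dynamo.py itself for the authoritative input and output names and shapes.

import cv2
import numpy as np
import onnxruntime as ort

def load_image(path: str, size: int = 1024) -> np.ndarray:
    # Assumed preprocessing: grayscale, resize to the export resolution, scale to [0, 1].
    image = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    image = cv2.resize(image, (size, size))
    return image[None].astype(np.float32) / 255.0  # (1, H, W)

session = ort.InferenceSession(
    "weights/superpoint_lightglue_pipeline.onnx",
    providers=["CUDAExecutionProvider", "CPUExecutionProvider"],
)

# Assumption: the pipeline consumes the image pair stacked along the batch axis,
# matching the -b 2 used in the export example above.
images = np.stack([
    load_image("assets/sacre_coeur1.jpg"),
    load_image("assets/sacre_coeur2.jpg"),
])  # (2, 1, H, W)

input_name = session.get_inputs()[0].name
outputs = session.run(None, {input_name: images})
for meta, value in zip(session.get_outputs(), outputs):
    print(meta.name, value.shape)  # e.g. keypoints, matches, match scores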

Credits

If you use any ideas from the papers or code in this repo, please consider citing the authors of LightGlue, SuperPoint, and DISK. Lastly, if the ONNX versions helped you in any way, please also consider starring this repository.

@inproceedings{lindenberger23lightglue,
  author    = {Philipp Lindenberger and
               Paul-Edouard Sarlin and
               Marc Pollefeys},
  title     = {{LightGlue}: Local Feature Matching at Light Speed},
  booktitle = {ArXiv PrePrint},
  year      = {2023}
}
@article{DBLP:journals/corr/abs-1712-07629,
  author       = {Daniel DeTone and
                  Tomasz Malisiewicz and
                  Andrew Rabinovich},
  title        = {SuperPoint: Self-Supervised Interest Point Detection and Description},
  journal      = {CoRR},
  volume       = {abs/1712.07629},
  year         = {2017},
  url          = {http://arxiv.org/abs/1712.07629},
  eprinttype   = {arXiv},
  eprint       = {1712.07629},
  timestamp    = {Mon, 13 Aug 2018 16:47:29 +0200},
  biburl       = {https://dblp.org/rec/journals/corr/abs-1712-07629.bib},
  bibsource    = {dblp computer science bibliography, https://dblp.org}
}
@article{DBLP:journals/corr/abs-2006-13566,
  author       = {Michal J. Tyszkiewicz and
                  Pascal Fua and
                  Eduard Trulls},
  title        = {{DISK:} Learning local features with policy gradient},
  journal      = {CoRR},
  volume       = {abs/2006.13566},
  year         = {2020},
  url          = {https://arxiv.org/abs/2006.13566},
  eprinttype   = {arXiv},
  eprint       = {2006.13566},
  timestamp    = {Wed, 01 Jul 2020 15:21:23 +0200},
  biburl       = {https://dblp.org/rec/journals/corr/abs-2006-13566.bib},
  bibsource    = {dblp computer science bibliography, https://dblp.org}
}