#artificial-intelligence #machine-learning #onnx-runtime

ort

A safe Rust wrapper for ONNX Runtime 1.22 - Optimize and accelerate machine learning inference & training

19 releases (4 stable)

2.0.0-rc.10 Jun 1, 2025
2.0.0-rc.9 Nov 21, 2024
2.0.0-rc.8 Oct 19, 2024
2.0.0-rc.4 Jul 7, 2024
1.13.1 Nov 27, 2022

#2 in Machine learning

Download history (weekly, 2025-09-14 through 2025-12-07): roughly 64K–109K downloads/week.

402,922 downloads per month
Used in 232 crates (137 directly)

MIT/Apache

670KB
13K SLoC



ort is an (unofficial) ONNX Runtime 1.22 wrapper for Rust, based on the now-inactive onnxruntime-rs. ONNX Runtime accelerates ML inference and training on both CPU and GPU.
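For a feel of the API, here is a minimal inference sketch written against the 2.0 release-candidate series. The model path `model.onnx`, the input shape, and the `input`/`output` tensor names are placeholder assumptions about your model, and some signatures (notably `ort::inputs!` and tensor extraction) have shifted between release candidates, so check the documentation for the version you pin.

```rust
use ort::session::{builder::GraphOptimizationLevel, Session};
use ort::value::Tensor;

fn main() -> ort::Result<()> {
    // Build a session from an ONNX model on disk ("model.onnx" is a placeholder path).
    let session = Session::builder()?
        .with_optimization_level(GraphOptimizationLevel::Level3)?
        .with_intra_threads(4)?
        .commit_from_file("model.onnx")?;

    // A dummy 1x3x224x224 f32 input; the shape and the input/output names
    // ("input"/"output") are assumptions about the model, not part of ort.
    let input = Tensor::<f32>::from_array(([1usize, 3, 224, 224], vec![0.0f32; 3 * 224 * 224]))?;

    // Run inference. (On some earlier 2.0 RCs, `ort::inputs!` returns a
    // `Result` and needs a trailing `?`.)
    let outputs = session.run(ort::inputs!["input" => input])?;

    // Extract the output as a shape plus a flat f32 slice; older RCs return
    // an ndarray view here instead.
    let (shape, data) = outputs["output"].try_extract_tensor::<f32>()?;
    println!("output shape {shape:?}, {} values", data.len());

    Ok(())
}
```

GPU execution providers (CUDA, TensorRT, DirectML, and others) are enabled through Cargo features and registered on the session builder; the CPU provider is used when none is configured.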

📖 Documentation

🤔 Support

💖 Projects using ort

Open a PR to add your project here 🌟

  • Bloop uses ort to power their semantic code search feature.
  • edge-transformers uses ort for accelerated transformer model inference at the edge.
  • Ortex uses ort for safe ONNX Runtime bindings in Elixir.
  • Supabase uses ort to remove cold starts for their edge functions.
  • Lantern uses ort to provide embedding model inference inside Postgres.
  • Magika uses ort for content type detection.
  • sbv2-api is a fast implementation of Style-BERT-VITS2 text-to-speech using ort.
  • Ahnlich uses ort to power their AI proxy for semantic search applications.
  • Spacedrive is a cross-platform file manager with AI features powered by ort.
  • BoquilaHUB uses ort for local AI deployment in biodiversity conservation efforts.
  • FastEmbed-rs uses ort for generating vector embeddings and reranking locally.
  • Aftershoot uses ort to power AI-assisted image editing workflows.

🌠 Sponsor ort


Dependencies

~1.8–4MB
~65K SLoC