Build your own custom MLOps orchestration workflows from composable automation recipes adapted to your favorite AI/ML tools, taking you from ML code to inference serving in production as fast as lighting a fuse.
Use FuseML to build a coherent stack of community-shared AI/ML tools to run your ML operations. FuseML is powered by a flexible framework designed for consistent operations and a rich collection of integration formulas reflecting real-world use cases that help you reduce technical debt and avoid vendor lock-in.
- Curious to find out more? Read the FuseML Documentation
- Follow our quickstart guide and have your first MLOps workflow up and running in no time
- Join our community Slack channel to ask questions and receive announcements about upcoming features and releases
- Contemplating becoming a contributor? Find out how
- Watch some of the FuseML Tutorial and Talk Videos or recorded Community Meetings
FuseML originated as a fork of our sister project Epinio, a lightweight open source PaaS built on top of Kubernetes, and has since been gradually transformed and infused with the MLOps concepts that make it the AI/ML orchestration tool it is today.
The project is under heavy development, following these main directions:
- adding features and enhancements to improve flexibility and extensibility
- adding support for more community-shared AI/ML tools
- creating more composable automation blocks adapted to existing as well as new AI/ML tools
Take a look at our Project Board to see what we're working on and what's in store for the next release.
The basic FuseML workflow can be described as an MLOps type of workflow that starts with your ML code and automatically runs all the steps necessary to build and serve your machine learning model. FuseML's job begins when your machine learning code is ready for execution.
- install FuseML in a Kubernetes cluster of your choice (see Installation Instructions)
- write your code using the AI/ML library of your choice (e.g. TensorFlow, PyTorch, SKLearn, XGBoost)
- organize your code using one of the conventions and experiment tracking tools supported by FuseML
- use the FuseML CLI to push your code to the FuseML Orchestrator instance and, optionally, supply parameters to customize the end-to-end MLOps workflow
- from this point onward, the process is completely automated: FuseML takes care of all aspects that involve building and packaging code, creating container images, running training jobs, storing and converting ML models in the right format and finally serving those models
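The build-train-serve automation described above is driven by a workflow definition supplied to the FuseML Orchestrator. The YAML below is a hypothetical sketch of what such a definition covers; the step names, image placeholders, and field names are illustrative only and do not reproduce the exact FuseML workflow schema, which is documented in the FuseML Documentation.

```yaml
# Hypothetical sketch of an end-to-end MLOps workflow definition.
# Field names and values are illustrative; consult the FuseML
# documentation for the actual workflow schema.
name: sklearn-e2e
description: build, train, and serve a scikit-learn model
inputs:
  - name: codeset            # the ML code pushed via the FuseML CLI
    type: codeset
steps:
  - name: builder            # package the pushed code into a container image
    image: <builder-image>
  - name: trainer            # run the training job and log the model
    image: '{{ steps.builder.output }}'
  - name: predictor          # serve the trained model (e.g. via KServe)
    image: <predictor-image>
outputs:
  - name: prediction-url     # endpoint where the served model answers requests
    type: string
```

Once a workflow like this is assigned to a codeset, every subsequent code push can trigger the same automated end-to-end run.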
Experiment tracking and versioning:
- MLflow
- DVC (TBD)

Model training:
- MLflow
- DeterminedAI (TBD)

Prediction serving:
- KServe
- Seldon Core (coming soon)
- Knative Serving (coming soon)
This repository contains the code for the FuseML installer and is the main project repository. Other repositories of interest are:
- https://github.com/fuseml/fuseml-core - the FuseML core service and CLI
- https://github.com/fuseml/examples - collection of sample ML applications and workflows that can be used to showcase FuseML capabilities
- https://github.com/fuseml/extensions - collection of FuseML extensions adapting 3rd party AI/ML tools to the FuseML MLOps workflow mechanisms
- https://github.com/fuseml/docs - main documentation for the project
- https://github.com/fuseml/fuseml.github.io - sources for the FuseML website
- https://github.com/fuseml/wiki - staging for the wiki pages