On June 3, 2020, we made some changes to ML Kit for Firebase to better distinguish the on-device APIs from the cloud-based APIs. The current set of APIs is now split into the following two products:
- A new product, simply called ML Kit, which will contain all the on-device APIs.
- Firebase Machine Learning, focused on cloud-based APIs and custom model deployment.
This change will also make it easier to integrate ML Kit into your app if you only need an on-device solution. This document explains how to migrate your app from the Firebase ML Kit SDK to the new ML Kit SDK.
What's changing?
On-device base APIs
The following APIs have moved to the new standalone ML Kit SDK.
- Barcode scanning
- Face detection
- Image labeling
- Object detection and tracking
- Text recognition
- Language ID
- Smart reply
- Translate
- AutoML Vision Edge inference API
The existing on-device base APIs in the ML Kit for Firebase SDK are deprecated and will no longer receive updates.
If you are using these APIs in your app today, please migrate to the new ML Kit SDK by following the ML Kit migration guide for Android and the ML Kit migration guide for iOS.
Custom model APIs
For downloading models hosted in Firebase, the custom model downloader continues to be offered through the Firebase ML SDK. The SDK fetches the latest available model and passes it to the separate TensorFlow Lite runtime for inference.
The existing custom model interpreter in the ML Kit for Firebase SDK is deprecated and will no longer receive updates. We recommend using the TensorFlow Lite runtime directly for inference. Alternatively, if you only want to use custom models with the image labeling or object detection and tracking APIs, you can now use custom models with these APIs directly in ML Kit.
See the migration guides for Android and iOS for detailed instructions.
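For reference, here is a minimal Kotlin sketch of that flow, assuming the com.google.firebase:firebase-ml-modeldownloader and org.tensorflow:tensorflow-lite dependencies. The model name, output shape, and download conditions are placeholders for illustration; check the migration guide and your model's metadata for the real values.

```kotlin
import com.google.firebase.ml.modeldownloader.CustomModelDownloadConditions
import com.google.firebase.ml.modeldownloader.DownloadType
import com.google.firebase.ml.modeldownloader.FirebaseModelDownloader
import org.tensorflow.lite.Interpreter

// Fetch (or reuse a locally cached copy of) a model hosted in Firebase,
// then hand the model file to the TensorFlow Lite runtime for inference.
fun runInferenceWithHostedModel(input: Array<FloatArray>, onResult: (Array<FloatArray>) -> Unit) {
    val conditions = CustomModelDownloadConditions.Builder()
        .requireWifi() // only download over Wi-Fi; adjust as needed
        .build()

    FirebaseModelDownloader.getInstance()
        .getModel("your_model_name", DownloadType.LOCAL_MODEL_UPDATE_IN_BACKGROUND, conditions)
        .addOnSuccessListener { customModel ->
            val modelFile = customModel.file ?: return@addOnSuccessListener
            // The interpreter comes from the separate TensorFlow Lite dependency;
            // the Firebase SDK only delivers the model file.
            val interpreter = Interpreter(modelFile)
            // Output shape is model-specific; this assumes a single 1x10 float output.
            val output = Array(1) { FloatArray(10) }
            interpreter.run(input, output)
            interpreter.close()
            onResult(output)
        }
}
```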
What hasn't changed?
Cloud-based APIs and services will continue to be offered with Firebase ML:
- The cloud-based image labeling, text recognition, and landmark recognition APIs are still available from the Firebase ML SDK.
- Firebase ML also continues to offer model deployment.
Frequently asked questions
Why this change?
We made this change to clarify which solutions each product offers. With this change, the new ML Kit SDK is fully focused on on-device machine learning, where all data processing happens on the device and the APIs are available to developers at no cost. The cloud services that were previously part of ML Kit for Firebase remain available through Firebase ML, and you can still use them alongside the ML Kit APIs.
For on-device APIs, the new ML Kit SDK makes it easier for developers to integrate ML Kit into their app. Going forward, you just need to add dependencies to the app’s project and then start using the API. There is no need to set up a Firebase project just to use on-device APIs.
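As an example, a single on-device API such as barcode scanning can be added with one Gradle dependency and no google-services configuration. The artifact version below is illustrative; check the ML Kit release notes for the current one.

```kotlin
// app/build.gradle.kts
dependencies {
    // Standalone ML Kit barcode scanning; the model ships inside your app,
    // so no Firebase project or google-services plugin is needed.
    implementation("com.google.mlkit:barcode-scanning:17.2.0")
}
```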
What happens to my models that are being hosted with Firebase?
Firebase Machine Learning will continue to serve your models as before. That functionality isn’t changing. Here are a couple of improvements:
- You can now deploy your models to Firebase programmatically using the Python or Node.js SDKs.
- You can now use the Firebase ML SDK in conjunction with the TensorFlow Lite runtime. The Firebase SDK downloads the model to the device, and the TensorFlow Lite runtime performs the inference. This lets you choose the runtime version you prefer, including a custom build (see the dependency sketch below).
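To illustrate that split, the downloader and the runtime are separate Gradle artifacts, so you can pin whichever TensorFlow Lite version (or a locally built runtime) you prefer next to the Firebase model downloader. Artifact versions below are illustrative assumptions; check the Firebase and TensorFlow release notes for current ones.

```kotlin
// app/build.gradle.kts
dependencies {
    // Downloads models hosted in Firebase (no interpreter included).
    implementation("com.google.firebase:firebase-ml-modeldownloader:24.2.3")
    // Inference runs on the TensorFlow Lite runtime you choose;
    // swap this for a different version or a custom build if needed.
    implementation("org.tensorflow:tensorflow-lite:2.14.0")
}
```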
What benefits do I get from migrating to the new ML Kit SDK?
Migrating to the new SDK will ensure your applications benefit from the latest bug fixes and improvements to the on-device APIs. For example, here are a couple of changes in the first release:
- You can now use the new custom image labeling and custom object detection and tracking APIs to easily integrate custom image classification models in your apps and build real-time, interactive user experiences.
- Android Jetpack Lifecycle support has been added to all APIs. You can now use addObserver to automatically manage the initialization and teardown of ML Kit APIs as the app goes through screen rotation or is closed by the user or the system, which makes integration with CameraX easier. A short Kotlin sketch of both changes follows.
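The sketch below is illustrative, not code from the guide: it assumes a custom TensorFlow Lite classification model bundled in the app's assets (the file name, thresholds, and activity wiring are placeholders), and registers the labeling client as a lifecycle observer so it is cleaned up automatically.

```kotlin
import android.graphics.Bitmap
import android.os.Bundle
import android.util.Log
import androidx.appcompat.app.AppCompatActivity
import com.google.mlkit.common.model.LocalModel
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.label.ImageLabeling
import com.google.mlkit.vision.label.custom.CustomImageLabelerOptions

class LabelerActivity : AppCompatActivity() {

    // Custom TensorFlow Lite classification model bundled in the app's assets
    // folder; the file name is a placeholder.
    private val localModel = LocalModel.Builder()
        .setAssetFilePath("my_custom_model.tflite")
        .build()

    // Custom image labeling client backed by the local model.
    private val labeler = ImageLabeling.getClient(
        CustomImageLabelerOptions.Builder(localModel)
            .setConfidenceThreshold(0.5f)
            .setMaxResultCount(5)
            .build()
    )

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Jetpack Lifecycle support: registering the client as an observer lets
        // ML Kit tear it down automatically when the activity is destroyed.
        lifecycle.addObserver(labeler)
    }

    // Call this with each camera frame (for example from a CameraX analyzer).
    fun labelFrame(bitmap: Bitmap, rotationDegrees: Int) {
        labeler.process(InputImage.fromBitmap(bitmap, rotationDegrees))
            .addOnSuccessListener { labels ->
                labels.forEach { label -> Log.d("MLKit", "${label.text}: ${label.confidence}") }
            }
            .addOnFailureListener { e -> Log.e("MLKit", "Labeling failed", e) }
    }
}
```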
A full list of the latest changes can be found in the ML Kit SDK release notes.
I am using ML Kit for Firebase today. When do I need to migrate?
This depends on which ML Kit for Firebase APIs you currently use in your app.
The on-device base APIs in the ML Kit for Firebase SDK will continue to work for the foreseeable future. However, by delaying the switch to the new ML Kit SDK, you will not benefit from new features and updates. In addition, once you update other components in your app, you risk running into dependency conflicts. This can happen when some of your other dependencies (direct or indirect) are newer than the ones expected by the old ML Kit for Firebase SDK. Examples of libraries for which this may happen are OkHttp and firebase-common.
If you are using Cloud APIs via the ML Kit for Firebase SDK, no change is required at this time.
If you are using Custom Model Deployment, we recommend you upgrade to the latest version, which lets you run inference directly on the TensorFlow Lite runtime.