Build smarter apps with machine learning

Machine learning (ML) is a programming technique that gives your apps the ability to learn and improve from experience automatically, without being explicitly programmed to do so. It is especially well suited to apps that use unstructured data, such as images and text, or to problems with a large number of parameters, such as predicting the winning sports team.

Android supports a wide variety of machine learning tools and methods. Whether you’re an experienced Android developer, or just starting out, here are some ML resources to help you get the best results.

Key ML Development Areas

As an Android developer, you will focus on inference and deployment of ML models. Depending on your circumstances, you may also be involved in building and training models.

Design for Machine Learning

Similar to other technologies, applying machine learning as a solution requires product managers, designers and developers to work together to define product goals, design, build and iterate. Google has produced two guides in this area:

Build and Train a Model

Machine learning requires a model that's trained to perform a particular task, like making a prediction, or classifying or recognizing some input. You can select (and possibly customize) an existing model, or build a model from scratch. Model creation and training can be done on a development machine, or using cloud infrastructure.

Explore pre-trained models

Pre-trained models are available in ML Kit and Google Cloud. Read more about them in the next section.

Learn how to create your own models with TensorFlow

For a deeper hands-on development experience, you can use these TensorFlow resources:

Inference

Inference is the process of using a machine learning model that has already been trained to perform a specific task.

A key decision you’ll face as an Android developer is whether inference runs on the device or uses a cloud service that is accessed remotely. Here are some of the factors to take into account when making this decision:

| | On-device inference | Cloud-based inference |
| --- | --- | --- |
| Latency | Lower latency enhances the real-time experience. | Asynchronous communication and available bandwidth can affect latency. |
| Resources | The device's own resources, such as processing power and storage, can limit performance. | Cloud-based resources are more powerful and storage is more plentiful. |
| Offline/Online | The ability to operate offline is a plus where network infrastructure is poor or nonexistent. | A network connection is required. |
| Cost | Battery usage and model download time for end users. | Bandwidth for data transfer for end users; computing charges for developers. |
| Privacy | User data never leaves the device. | Data may leave the device; additional precautions may be necessary. |

The following table shows the available development options for each kind of inference:

| On-device inference | Cloud inference |
| --- | --- |
| ML Kit | ML Kit |
| Google Cloud: train your own custom vision model on Google Cloud and run the resulting model on Android and other edge devices. | Google Cloud APIs |
| TensorFlow Lite: deliver a trained TensorFlow model as an on-device solution. | |
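
To make the on-device column concrete, here is a minimal Kotlin sketch that runs a pre-trained ML Kit model, on-device text recognition, over a bitmap. It assumes the standalone ML Kit text recognition library is on the classpath; the bitmap source and the logging are placeholders, so check the ML Kit documentation for the exact dependency to add.

```kotlin
import android.graphics.Bitmap
import android.util.Log
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.text.TextRecognition
import com.google.mlkit.vision.text.latin.TextRecognizerOptions

// Minimal sketch: on-device inference with a pre-trained ML Kit model.
// Assumes the standalone ML Kit text recognition dependency is added to the app.
fun recognizeText(bitmap: Bitmap) {
    // Wrap the bitmap; the second argument is the image rotation in degrees.
    val image = InputImage.fromBitmap(bitmap, 0)

    // Create a recognizer for Latin-script text; inference runs entirely on the device.
    val recognizer = TextRecognition.getClient(TextRecognizerOptions.DEFAULT_OPTIONS)

    recognizer.process(image)
        .addOnSuccessListener { result ->
            // result.text contains the full recognized string.
            Log.d("TextRecognition", "Recognized: ${result.text}")
        }
        .addOnFailureListener { e ->
            Log.w("TextRecognition", "Recognition failed", e)
        }
}
```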

Deployment

Deployment is the process of packaging and updating your model for use on Android when doing on-device inference. There are three options available:

Packaging the ML model with your Android app
Your model is deployed with your app like any other asset, so an update to the model requires an update to the app. There are two ways you can package your ML model with your app; a minimal loading sketch follows this list.
Providing the model at runtime
This lets you update your model independently of your app and also makes A/B testing easier. You can serve your custom model with the ML Kit custom models feature, or host the model download on your own infrastructure.
A combination of both
It is not unusual for developers to package an initial version of the model with their Android app, so that users don't have to wait for a download before first use, while still updating the model to new versions at runtime.
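
As a sketch of the first option, the snippet below loads a TensorFlow Lite model packaged in the app's assets and runs a single inference with the TensorFlow Lite Interpreter. The file name model.tflite and the tensor shapes are placeholders, and the model file must be stored uncompressed in the APK for memory-mapping to work.

```kotlin
import android.content.Context
import org.tensorflow.lite.Interpreter
import java.io.FileInputStream
import java.nio.MappedByteBuffer
import java.nio.channels.FileChannel

// Minimal sketch: load a TensorFlow Lite model bundled under assets/ and run it once.
// "model.tflite" and the tensor shapes are placeholders for illustration.
fun runBundledModel(context: Context, input: FloatArray): FloatArray {
    // Memory-map the model straight from the APK. The asset must be stored
    // uncompressed (e.g. aaptOptions noCompress "tflite") for this to work.
    val afd = context.assets.openFd("model.tflite")
    val modelBuffer: MappedByteBuffer = FileInputStream(afd.fileDescriptor).channel
        .map(FileChannel.MapMode.READ_ONLY, afd.startOffset, afd.declaredLength)

    // Output shape depends on the model; a 1 x 10 float tensor is assumed here.
    val output = Array(1) { FloatArray(10) }

    val interpreter = Interpreter(modelBuffer)
    try {
        // Batch of one: wrap the input so the shapes are [1, input.size] and [1, 10].
        interpreter.run(arrayOf(input), output)
    } finally {
        interpreter.close()
    }
    return output[0]
}
```

For the runtime option, the same Interpreter would instead be fed a model file that was downloaded, for example through the ML Kit custom models feature or from your own hosting.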

For selected pre-trained ML Kit models, namely text recognition and barcode scanning, developers can use a shared model provided by Google Play services, resulting in smaller APK sizes.
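
In practice, that choice is typically made through the Gradle dependency you declare. The artifact coordinates and versions below are an assumption based on the standalone ML Kit SDK, not taken from this page, so check the ML Kit documentation for the exact names.

```kotlin
// Module-level build.gradle.kts -- illustrative only; coordinates and versions
// are assumptions, not taken from this page.
dependencies {
    // Unbundled: the barcode-scanning model is shared via Google Play services,
    // which keeps your APK smaller but requires an on-device model download.
    implementation("com.google.android.gms:play-services-mlkit-barcode-scanning:18.3.1")

    // Bundled alternative: the model ships inside your APK.
    // implementation("com.google.mlkit:barcode-scanning:17.2.0")
}
```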

Developer Stories

Making the impossible possible

Adding ML to your Android app opens up new ways to build features that used to be too difficult to get right across a wide variety of conditions (such as reliable barcode scanning) or that were previously not possible at all (for example, image detection and text sentiment analysis).

Lose It!

Lose It! is a calorie-tracking weight loss app. It helps you lose weight by logging the food you eat so you know how many calories you have consumed. Lose It! uses the ML Kit text recognition API to scan nutrition labels and pull in the data when users enter a new food that isn't in its library.

PlantVillage

PlantVillage helps farmers detect diseases in cassava plants. Penn State University and the International Institute of Tropical Agriculture use their custom TensorFlow models, running offline on mobile devices, to help farmers detect early signs of plant disease.

Fishbrain

The Fishbrain app provides local fishing maps and forecasts and connects millions of anglers. Fishbrain uses the ML Kit custom models feature to deliver updated custom TensorFlow Lite models.