
Recognize Flowers with TensorFlow Lite on Android (beta)


TensorFlow is a multipurpose machine learning framework. TensorFlow can be used anywhere from training huge models across clusters in the cloud, to running models locally on an embedded system like your phone.

This codelab uses TensorFlow Lite to run an image recognition model on an Android device.

Install Android Studio 4.1 beta

If you don't have it installed already, download and install Android Studio 4.1 Beta 1 or above while you are training your TensorFlow Lite model.

What you'll learn

  • How to train your own custom image classifier using TensorFlow Lite Model Maker.
  • How to use Android Studio to import a TensorFlow Lite model and integrate the custom model into an Android app using CameraX.
  • How to use the GPU on your phone to accelerate your model.

What you will build

A simple camera app that runs a TensorFlow image recognition program to identify flowers.

License: Free to use

Before kicking off the model training, start downloading and installing Android Studio 4.1 Beta 1 or above.

Open the Colab, which shows how to train a classifier with Keras to recognize flowers using TensorFlow Lite transfer learning.

Clone the Git repository

The following command will clone the Git repository containing the files required for this codelab:

git clone https://github.com/hoitab/TFLClassify.git

Next, go to the directory where you just cloned the repository. This is where you will be working for the rest of this codelab:

cd TFLClassify



Open the project with Android Studio

Open the project in Android Studio by taking the following steps:

  1. Open Android Studio. After it loads, select "Open an Existing Project" from the popup:

  2. In the file selector, choose TFLClassify/build.gradle from your working directory.
  3. The first time you open the project, you will get a "Gradle Sync" popup asking about using the Gradle wrapper. Click "OK".

  4. Enable Developer mode and USB debugging on your phone if you have not already. This is a one-time setup. Follow these instructions.
  5. Once both your project and your phone are ready, you can run the app on a real device by selecting TFL_Classify.start and pressing the run button on the toolbar:

  6. Now allow the TensorFlow Demo to access your camera:

  7. You will see the following screen on your phone, with random numbers taking the place of where real results will be displayed.

  1. Select the start module in the project explorer on the left-hand side:

  2. Right-click on the start module, or click on File, then New > Other > TensorFlow Lite Model.

  3. Select the location where you downloaded the custom trained FlowerModel.tflite earlier.

  4. Click Finish.
  5. You will see the following at the end. The FlowerModel.tflite has been successfully imported, and Android Studio shows high-level information about the model, including its inputs and outputs, as well as some sample code to get you started.

The TODO list makes it easy to navigate to the exact locations you need to update in this codelab. You can also use it in your own Android projects to remind yourself of future work: add TODO items with code comments containing the keyword TODO. To access the list of TODOs:

  1. A great way to see what you are going to do is to check out the TODO list. To do that, select View > Tool Windows > TODO from the top menu bar.

  2. By default, the panel lists all TODOs in all modules, which makes it a little confusing. To show only the start TODOs, click the group-by button on the side of the TODO panel and choose Modules.

  3. Expand all the items under the start modules:

  4. Click on TODO 1 in the TODO list, or open the MainActivity.kt file and locate TODO 1. Initialize the model by adding this line:
private class ImageAnalyzer(ctx: Context, private val listener: RecognitionListener) :
        ImageAnalysis.Analyzer {

  ...
  // TODO 1: Add class variable TensorFlow Lite Model
  private val flowerModel = FlowerModel.newInstance(ctx)

  ...
}
  5. Inside the analyze method of the CameraX Analyzer, convert the camera input ImageProxy into a Bitmap and create a TensorImage object for the inference process:
override fun analyze(imageProxy: ImageProxy) {
  ...
  // TODO 2: Convert Image to Bitmap then to TensorImage
  val tfImage = TensorImage.fromBitmap(toBitmap(imageProxy))
  ...
}

  6. Process the image and perform the following operations on the result:
  • Sort the results by probability (the score attribute) in descending order, so the highest probability comes first.
  • Take the top k results, as defined by the constant MAX_RESULT_DISPLAY. You can vary this value to show more or fewer results.
override fun analyze(imageProxy: ImageProxy) {
  ...
  // TODO 3: Process the image using the trained model, sort and pick out the top results
  val outputs = flowerModel.process(tfImage)
      .probabilityAsCategoryList.apply {
          sortByDescending { it.score } // Sort with highest confidence first
      }.take(MAX_RESULT_DISPLAY) // take the top results

  ...
}
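The sort-and-take logic above can be tried out in plain Kotlin, independent of Android. In this sketch, Category and MAX_RESULT_DISPLAY are simplified stand-ins for the generated model's output type and the app's constant:

```kotlin
// Stand-ins for the generated model's Category type and the app's
// MAX_RESULT_DISPLAY constant -- both simplified for illustration.
data class Category(val label: String, val score: Float)

const val MAX_RESULT_DISPLAY = 3

// Sort by score in descending order, then keep only the top entries,
// mirroring the TODO 3 logic above.
fun topResults(categories: List<Category>): List<Category> =
    categories.sortedByDescending { it.score }.take(MAX_RESULT_DISPLAY)

fun main() {
    val fake = listOf(
        Category("rose", 0.1f),
        Category("tulip", 0.7f),
        Category("daisy", 0.5f),
        Category("sunflower", 0.3f),
    )
    println(topResults(fake).map { it.label }) // [tulip, daisy, rose]
}
```

The codelab code sorts the list in place with apply and sortByDescending; the non-mutating sortedByDescending used here produces the same ordering.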
  7. Convert the sorted and filtered results into Recognition data objects, ready to be consumed by the RecyclerView via Data Binding:
override fun analyze(imageProxy: ImageProxy) {
  ...
  // TODO 4: Converting the top probability items into a list of recognitions
  for (output in outputs) {
      items.add(Recognition(output.label, output.score))
  }
  ...
}
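The for-loop above can also be written functionally. A minimal sketch in plain Kotlin, where Category and Recognition are simplified stand-ins for the app's actual classes:

```kotlin
// Simplified stand-ins for the model's output type and the codelab's
// Recognition data class.
data class Category(val label: String, val score: Float)
data class Recognition(val label: String, val confidence: Float)

// The TODO 4 for-loop, expressed as a single map over the outputs.
fun toRecognitions(outputs: List<Category>): List<Recognition> =
    outputs.map { Recognition(it.label, it.score) }
```

In the app itself, where items is a mutable list, the equivalent would be items.addAll(outputs.map { Recognition(it.label, it.score) }).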
  8. Comment out or delete the following lines, which generate the fake results you saw earlier:
// START - Placeholder code at the start of the codelab. Comment this block of code out.
for (i in 0..MAX_RESULT_DISPLAY-1){
    items.add(Recognition("Fake label $i", Random.nextFloat()))
}
// END - Placeholder code at the start of the codelab. Comment this block of code out.
  9. Run the app on a real device by selecting TFL_Classify.start and pressing the run button on the toolbar:

  10. You will see the fake numbers on your phone replaced by real recognition results from your model:

TensorFlow Lite supports several hardware accelerators to speed up inference on your mobile device. The GPU is one of the accelerators that TensorFlow Lite can leverage through a delegate mechanism, and it is fairly easy to use.
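Not every device has a GPU the delegate can use, so a production app typically checks compatibility first and falls back to the CPU. A minimal sketch of that decision in plain Kotlin, where Device and isGpuSupported are stand-ins for TensorFlow Lite's Model.Device enum and a runtime compatibility check (such as CompatibilityList in the GPU artifact):

```kotlin
// Stand-in for TensorFlow Lite's Model.Device enum.
enum class Device { CPU, GPU }

// Choose the GPU only when the runtime reports that the delegate is
// supported on this device; otherwise fall back to the CPU.
fun chooseDevice(isGpuSupported: Boolean): Device =
    if (isGpuSupported) Device.GPU else Device.CPU
```

For this codelab, the steps below simply select the GPU unconditionally.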

  1. Open build.gradle under the start module (or click on TODO 5 in the TODO list) and add the following dependency:
// TODO 5: Optional GPU Delegates    
implementation 'org.tensorflow:tensorflow-lite-gpu:2.2.0'
  2. Go back to the MainActivity.kt file (or click on TODO 6 in the TODO list) and initialize the GPU model option:
private class ImageAnalyzer(ctx: Context, private val listener: RecognitionListener) :
        ImageAnalysis.Analyzer {
  ...
  // TODO 6. Optional GPU acceleration
  private val options = Model.Options.Builder().setDevice(Model.Device.GPU).build()
  ...
}
  3. Change the model initializer to use the GPU by adding options to the method input:
private class ImageAnalyzer(ctx: Context, private val listener: RecognitionListener) :
        ImageAnalysis.Analyzer {

  ...
  // TODO 1: Add class variable TensorFlow Lite Model
  private val flowerModel = FlowerModel.newInstance(ctx, options)

  ...
}

  4. Run the app on a real device by selecting TFL_Classify.start and pressing the run button on the toolbar:

Here are some links for more information: