Create smarter apps
On-device machine learning (ML) lets you supercharge your app with features that process images, sound, and text. Whether you are a seasoned developer or just getting started, you can add on-device ML features to your app.
TensorFlow Lite for ML runtime: Use TensorFlow Lite via Google Play services, Android's official ML inference runtime, to run high-performance ML inference in your app.
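As a minimal sketch of this setup, the snippet below initializes the TensorFlow Lite runtime distributed through Google Play services and then creates an interpreter bound to that system runtime. The model buffer and the calling context are assumed to come from your app; error handling is elided.

```kotlin
import android.content.Context
import com.google.android.gms.tflite.java.TfLite
import org.tensorflow.lite.InterpreterApi
import org.tensorflow.lite.InterpreterApi.Options.TfLiteRuntime
import java.nio.ByteBuffer

// Initialize the Play services TFLite runtime once, before creating any interpreter.
fun createInterpreter(context: Context, modelBuffer: ByteBuffer) {
    TfLite.initialize(context).addOnSuccessListener {
        // FROM_SYSTEM_ONLY selects the runtime shipped via Google Play services,
        // so the TFLite binaries do not need to be bundled into the APK.
        val options = InterpreterApi.Options()
            .setRuntime(TfLiteRuntime.FROM_SYSTEM_ONLY)
        val interpreter = InterpreterApi.create(modelBuffer, options)
        // interpreter.run(input, output) — run inference with your model's I/O tensors.
    }
}
```

Because initialization is asynchronous, any inference code should run only after the success listener fires.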
Hardware acceleration with TensorFlow Lite Delegates: Use TensorFlow Lite Delegates distributed via Google Play services to run accelerated ML on specialized hardware such as a GPU, NPU, or DSP. Accessing these advanced on-device compute capabilities can help you deliver smoother, lower-latency user experiences.
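A hedged sketch of enabling the GPU delegate through Google Play services is shown below: it checks delegate availability on the device, requests GPU support at initialization, and attaches a delegate factory to the interpreter options. Class names follow the Play services TFLite GPU module; treat the exact package paths as assumptions to verify against your dependency versions.

```kotlin
import android.content.Context
import com.google.android.gms.tflite.client.TfLiteInitializationOptions
import com.google.android.gms.tflite.gpu.GpuDelegateFactory
import com.google.android.gms.tflite.gpu.support.TfLiteGpu
import com.google.android.gms.tflite.java.TfLite
import org.tensorflow.lite.InterpreterApi
import org.tensorflow.lite.InterpreterApi.Options.TfLiteRuntime

fun initWithGpuAcceleration(context: Context) {
    // Not every device supports the GPU delegate, so check availability first.
    TfLiteGpu.isGpuDelegateAvailable(context).addOnSuccessListener { gpuAvailable ->
        val initOptions = TfLiteInitializationOptions.builder()
            .setEnableGpuDelegateSupport(gpuAvailable)
            .build()
        TfLite.initialize(context, initOptions).addOnSuccessListener {
            val options = InterpreterApi.Options()
                .setRuntime(TfLiteRuntime.FROM_SYSTEM_ONLY)
            if (gpuAvailable) {
                // The delegate factory lets Play services construct the GPU delegate
                // that matches the installed runtime version.
                options.addDelegateFactory(GpuDelegateFactory())
            }
            // InterpreterApi.create(modelBuffer, options) — create the interpreter
            // with (or without) GPU acceleration, falling back to CPU when unavailable.
        }
    }
}
```

Falling back to the CPU path when the delegate is unavailable keeps behavior consistent across the device population while still accelerating on capable hardware.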