The Android XR SDK is now available in Developer Preview. We want your feedback! Visit our
support page to reach out.
Develop with OpenXR
Android XR supports apps built with OpenXR through its support
for the OpenXR 1.1 specification and select vendor extensions.
OpenXR is an open standard that lets you create immersive and interactive
experiences using a common set of APIs across a wide range of XR devices.
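Getting started requires an XrInstance that targets the 1.1 API. Below is a minimal sketch using the core OpenXR C API: the application name is a placeholder, error handling is elided, and Android apps typically also perform Android-specific loader initialization (XR_KHR_loader_init_android) before this call.

```c
#include <string.h>
#include <openxr/openxr.h>

// Minimal sketch: create an OpenXR instance targeting the 1.1 API.
// Error handling is elided; a real app should check every XrResult.
XrInstance create_instance(void) {
    XrInstanceCreateInfo create_info = {XR_TYPE_INSTANCE_CREATE_INFO};
    // "MyXrApp" is a placeholder application name.
    strncpy(create_info.applicationInfo.applicationName, "MyXrApp",
            XR_MAX_APPLICATION_NAME_SIZE - 1);
    create_info.applicationInfo.applicationVersion = 1;
    // Request the OpenXR 1.1 API that Android XR supports.
    create_info.applicationInfo.apiVersion = XR_API_VERSION_1_1;

    XrInstance instance = XR_NULL_HANDLE;
    xrCreateInstance(&create_info, &instance);
    return instance;
}
```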
Features
Using OpenXR, Android XR supports features that allow you to build apps that
take full advantage of the unique capabilities of XR devices. These features
include the following; a short sketch for checking that a feature's extension
is available follows the list.
- Trackables
- Supports plane detection, the ability to identify and track flat
surfaces within the environment, which enables the placement of virtual
objects in relation to the real world. Also supports anchors, virtual
points of reference that can be attached to real-world objects or
locations, ensuring that virtual content remains accurately positioned
and oriented even as the user moves around.
- Raycasting
- A technique used to determine the intersection point between a
virtual ray and objects in the scene, facilitating interactions such as
selecting and manipulating virtual elements.
- Anchor persistence
- The capability to save and restore anchors across multiple
sessions, allowing for persistent and consistent placement of virtual
content within the environment.
- Object tracking
- The ability to track mice, keyboards, and other objects in the
real world.
- QR Code tracking
- The ability to track QR Codes in the physical environment and decode
their data.
- Depth textures
- The generation of depth maps that provide information about the
distance between the camera and objects in the scene, enabling more
realistic occlusion and interaction effects.
- Passthrough
- The ability to blend real-world camera footage with virtual
content, creating a mixed reality experience that seamlessly combines the
physical and digital worlds.
- Scene meshing
- The ability to acquire a 3D mesh of the environment, which can be
used for physics, occlusion, and other world-aware interactions.
- Composition layer passthrough
- Allows for a polygon passthrough composition layer cutout, which can
be used to bring real-world objects into a scene.
- Face tracking
- The ability to track the features of the user's face, enabling
the creation of more realistic and expressive avatars and virtual
characters.
- Eye tracking
- Provides the position and orientation of the user's eyes, designed to
make avatar eye poses more realistic.
- Hand tracking
- The ability to track the position and movement of the user's hands.
- Hand mesh
- Provides an accurate representation of the user's hands as a low-poly
mesh, optimized for platform-to-application delivery to make sure you
get the best performance possible. This is an alternative to other
extensions that use a bind pose and blend weights.
- Light estimation
- Used by lighting models to match the user's real-world lighting conditions.
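Each of these features maps to one or more OpenXR extensions, so a common first step is to confirm that the runtime advertises the extension you need. Below is a minimal sketch using core OpenXR calls; it uses the standard XR_EXT_hand_tracking extension as its example, and the names of the Android-specific extensions behind the other features are listed in the OpenXR Feature Overview.

```c
#include <stdbool.h>
#include <stdlib.h>
#include <string.h>
#include <openxr/openxr.h>

// Minimal sketch: returns true if the runtime advertises the named
// OpenXR extension. Call before enabling a feature that depends on it.
bool has_extension(const char *name) {
    uint32_t count = 0;
    xrEnumerateInstanceExtensionProperties(NULL, 0, &count, NULL);

    XrExtensionProperties *props = calloc(count, sizeof(*props));
    for (uint32_t i = 0; i < count; i++) {
        props[i].type = XR_TYPE_EXTENSION_PROPERTIES;
    }
    xrEnumerateInstanceExtensionProperties(NULL, count, &count, props);

    bool found = false;
    for (uint32_t i = 0; i < count; i++) {
        if (strcmp(props[i].extensionName, name) == 0) {
            found = true;
            break;
        }
    }
    free(props);
    return found;
}

// Example: hand tracking is exposed through the standard extension.
// bool hands = has_extension(XR_EXT_HAND_TRACKING_EXTENSION_NAME);
```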
Supported input devices
Android XR also supports the following input devices. A binding sketch for
one of these interaction profiles follows the list.
- Hand Interaction
- The recognition of specific hand gestures, such as
pinching, swiping, and pointing, enabling users to interact with virtual
objects using gestures and hand movements.
- Eye Gaze Interaction
- The ability to track the user's eye movements,
allowing them to select and interact with virtual objects using their gaze.
- 6DoF Motion Controllers
- The ability to track a controller's position and movement, along with
D-pad and button bindings for triggering actions or hover events within
the application.
- Mouse Interaction
- The ability for users to interact with objects through a mouse
pointer in 3D space.
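As an example of wiring up one of these devices, the following sketch suggests a pose binding for the eye gaze interaction profile defined by XR_EXT_eye_gaze_interaction. It assumes `instance` is a valid XrInstance created with that extension enabled and `gaze_action` is a pose action you previously created with xrCreateAction; error handling is elided.

```c
#include <openxr/openxr.h>

// Minimal sketch: suggest a binding for the eye gaze interaction
// profile so the runtime can route gaze input to our pose action.
void suggest_eye_gaze_binding(XrInstance instance, XrAction gaze_action) {
    XrPath profile_path, gaze_pose_path;
    xrStringToPath(instance, "/interaction_profiles/ext/eye_gaze_interaction",
                   &profile_path);
    xrStringToPath(instance, "/user/eyes_ext/input/gaze_ext/pose",
                   &gaze_pose_path);

    XrActionSuggestedBinding binding = {gaze_action, gaze_pose_path};

    XrInteractionProfileSuggestedBinding suggested = {
        XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDING};
    suggested.interactionProfile = profile_path;
    suggested.countSuggestedBindings = 1;
    suggested.suggestedBindings = &binding;
    xrSuggestInteractionProfileBindings(instance, &suggested);
}
```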
Supported performance features
Android XR supports the following performance-related features. A
frame-timing sketch follows the list.
- Eye-tracked foveation
- Allows an app to render higher-resolution content only at the eyes'
focal point.
- Space warp
- Uses velocity vectors and depth texture information to generate tween
frames, effectively boosting the frame rate to keep your users immersed
in your experience.
- Performance metrics
- Provides runtime Android XR performance metrics for the current XR
device, compositor, and XR application. These include CPU frame time,
GPU frame time, GPU utilization, CPU frequency, frames per second, and
more.
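Compositor- and device-level metrics come from the XR_ANDROID_performance_metrics extension; see the OpenXR Feature Overview for its query API. As a minimal app-side baseline, you can time the core frame loop yourself. This sketch assumes a running XrSession and elides rendering and the matching xrEndFrame call.

```c
#include <stdio.h>
#include <time.h>
#include <openxr/openxr.h>

// Minimal sketch: an app-side frames-per-second estimate measured
// across successive iterations of the core OpenXR frame loop.
void frame_loop_iteration(XrSession session) {
    static struct timespec last = {0};

    XrFrameWaitInfo wait_info = {XR_TYPE_FRAME_WAIT_INFO};
    XrFrameState frame_state = {XR_TYPE_FRAME_STATE};
    xrWaitFrame(session, &wait_info, &frame_state);

    XrFrameBeginInfo begin_info = {XR_TYPE_FRAME_BEGIN_INFO};
    xrBeginFrame(session, &begin_info);

    // ... render and submit composition layers, then call xrEndFrame ...

    struct timespec now;
    clock_gettime(CLOCK_MONOTONIC, &now);
    if (last.tv_sec != 0) {
        double dt = (now.tv_sec - last.tv_sec) +
                    (now.tv_nsec - last.tv_nsec) / 1e9;
        printf("app frame time: %.2f ms (%.1f fps)\n", dt * 1000.0, 1.0 / dt);
    }
    last = now;
}
```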
See the OpenXR Feature Overview for a full list of supported features and
extensions.
Supported engines
Note: The Android XR emulator is not supported for Unity or OpenXR apps.
Unity
Android XR's Unity support, built on top of OpenXR, allows developers to create
experiences using Unity 6. Learn more about building XR apps with Unity in the
Unity overview.
OpenXR™ and the OpenXR logo are trademarks owned
by The Khronos Group Inc. and are registered as a trademark in China,
the European Union, Japan and the United Kingdom.