Last updated (UTC): 2025-07-27.

# Foundations

The Android XR system uses interactivity models similar to those in mobile and
large-screen apps to help users understand how to use XR. It includes known
patterns like the home screen, apps overview, back stack, and more.

To help you build integrated and boundless experiences, Android XR provides
natural gesture navigation, multimodal inputs, and new spatial and 3D
capabilities.

Home Space and Full Space modes
-------------------------------

A user can experience your app in two modes: Home Space and Full Space. In Home
Space, a user can multitask, with your app running side by side with other
apps. In Full Space, your app takes center stage as the focus of the user's
experience, with full access to the immersive capabilities of Android XR.

| **Note:** Spatial capabilities can change as users interact with your app or the system. To avoid issues, your app should [check for spatial capabilities](/develop/xr/jetpack-xr-sdk/check-spatial-capabilities) to determine what's supported.

**Home Space**

- Multiple apps run side by side so users can multitask.
- Any [compatible](/develop/xr/get-started) mobile or large-screen Android app can operate in Home Space with no additional development.
- Android apps developed with [large screen-optimized guidance](/guide/topics/large-screens/tier-2-overview) adapt best.
- Home Space supports system environments.
  It does not support [spatial panels](/design/ui/xr/guides/spatial-ui), [3D models](/design/ui/xr/guides/3d-content), or an app's [spatial environments](/design/ui/xr/guides/environments).
- Apps have constrained boundaries.
- Default size: 1024 x 720 dp.
- Minimum size: 385 x 595 dp; maximum size: 2560 x 1800 dp.
- Apps launch 1.75 meters from the user.

**Full Space**

- One app runs at a time, with no space boundaries. All other apps are hidden.
- You can [spatialize an existing Android app](/develop/xr/jetpack-xr-sdk/add-xr-to-existing) in Full Space.
- You can add [spatial panels](/design/ui/xr/guides/spatial-ui), [3D models](/design/ui/xr/guides/3d-content), [spatial environments](/design/ui/xr/guides/environments), or spatial audio to take advantage of the space.
- Play stereoscopic [spatial videos](/design/ui/xr/guides/spatial-ui#spatial-videos).
- Apps can override the [launch position](/design/ui/xr/guides/visual-design#panel-depth) and can be moved and resized.
- Apps can open directly into Full Space.
- [Unity](https://docs.unity3d.com/Manual/index.html), [OpenXR](https://registry.khronos.org/OpenXR/specs/1.0/styleguide.html), and [WebXR](https://immersiveweb.dev/) apps operate in an unmanaged Full Space. Refer to each platform's documentation for specific interaction capabilities.

Video: [switching between Full Space and Home Space](/static/videos/design/ui/xr/xr-foundations-3-opt.mp4)

**Recommendation**: Add clear visual cues to let users quickly switch between Full Space and Home Space.
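One way to provide such a cue is a toggle button that requests the other mode. The sketch below assumes the Jetpack XR Compose APIs (`LocalSpatialCapabilities`, `LocalSpatialConfiguration`); the icon choices are illustrative, and exact API names should be verified against the current SDK release.

```kotlin
import androidx.compose.material.icons.Icons
import androidx.compose.material.icons.filled.CloseFullscreen
import androidx.compose.material.icons.filled.OpenInFull
import androidx.compose.material3.Icon
import androidx.compose.material3.IconButton
import androidx.compose.runtime.Composable
import androidx.xr.compose.platform.LocalSpatialCapabilities
import androidx.xr.compose.platform.LocalSpatialConfiguration

// Sketch, not a definitive implementation: a button that toggles the app
// between Home Space and Full Space using the Jetpack XR Compose
// composition locals. Icon names come from material-icons-extended and
// are illustrative only.
@Composable
fun SpaceModeToggleButton() {
    val spatialConfig = LocalSpatialConfiguration.current
    if (LocalSpatialCapabilities.current.isSpatialUiEnabled) {
        // Spatial UI is available, so the app is in Full Space:
        // offer a "collapse" affordance back to Home Space.
        IconButton(onClick = { spatialConfig.requestHomeSpaceMode() }) {
            Icon(
                Icons.Default.CloseFullscreen,
                contentDescription = "Collapse to Home Space",
            )
        }
    } else {
        // The app is in Home Space: offer an "expand" affordance
        // into Full Space.
        IconButton(onClick = { spatialConfig.requestFullSpaceMode() }) {
            Icon(
                Icons.Default.OpenInFull,
                contentDescription = "Expand to Full Space",
            )
        }
    }
}
```

Reading the capability from `LocalSpatialCapabilities` rather than caching a mode flag keeps the button correct when the system changes spatial capabilities out from under the app.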
For example, you can use [collapse](https://fonts.google.com/icons?icon.query=collapse) and [expand](https://fonts.google.com/icons?icon.query=expand) icons on buttons that trigger the transitions.

Give users control over their environment
-----------------------------------------

In Android XR, an environment is the real or virtual space that a user sees
while wearing an XR device. It is unconstrained by the physical limitations of
mobile and desktop screens.

- A spatial environment simulates a fully immersive virtual space that takes over a user's physical space. Available in Full Space only. For example, a user watches a movie in a virtual luxury cinema.
- A passthrough environment adds digital elements to a user's physical surroundings. For example, a user opens multiple large-screen apps while still seeing their real-life room.

[Learn how to build spatial environments in Full Space](/design/ui/xr/guides/environments).

### System environments

Users can choose environments provided by the Android XR system. These system
environments can be used in Home Space or Full Space. If an app doesn't define a
specific environment, it inherits the system environment, either passthrough or
a virtual environment.

Understanding system gestures
-----------------------------

Android XR extends familiar mobile actions like press, pinch, and swipe into a
gesture-based navigation system.

Users select items by pinching the index finger and thumb of the primary hand,
the spatial equivalent of tapping a touchscreen or pressing a mouse button. A
held pinch is used to scroll, move or resize windows, and select and move UI
elements or objects in 2D and 3D space.

Video: [selecting with a pinch](/static/videos/design/ui/xr/Input9.mp4)
A user selects items by pinching with the index finger and thumb on the primary hand.

To navigate, users face their primary hand's palm inward, pinch and hold their
index finger and thumb, move their hand up, down, left, or right, and release
to select an option. Users can set their primary hand preference in
**Input Settings**.

Video: [gesture navigation](/static/videos/design/ui/xr/xr-foundations-5.webm)

Users can open the gesture navigation menu anywhere, anytime to:

- **Go back**: Operates the same as the [back stack](/guide/components/activities/tasks-and-back-stack) on Android mobile, returning to the previous item.
- **Launcher**: Takes users to the home screen.
- **Recents**: Lets users open, close, and switch apps.

Design with multimodal inputs
-----------------------------

It's essential to design immersive applications that are accessible to a wide
range of users. Allow users to customize input methods to suit their individual
preferences and abilities.

To help you achieve this, Android XR supports a variety of input methods,
including hand and eye tracking, voice commands, Bluetooth-connected keyboards,
traditional and adaptive mice, trackpads, and six degrees of freedom (6DoF)
controllers. Your app should automatically work with these built-in modalities.

Make sure you provide visual or audio feedback to confirm user actions for any
interaction model you choose.

[Learn about design considerations for XR accessibility](/design/ui/xr/guides/get-started#make-app).

**Hand tracking enables natural interactions**. When developing OpenXR apps, you
can request permission from the system to access hand tracking directly and
include your own custom gestures.
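As a sketch of that permission flow, an Activity might check for and request the hand-tracking permission before enabling custom gestures. The permission string `android.permission.HAND_TRACKING` and the helper method names here are assumptions for illustration; confirm the exact permission name in the current Android XR documentation.

```kotlin
import android.content.pm.PackageManager
import androidx.activity.ComponentActivity
import androidx.activity.result.contract.ActivityResultContracts
import androidx.core.content.ContextCompat

// Sketch, assuming "android.permission.HAND_TRACKING" is the permission
// Android XR uses for direct hand-tracking access in OpenXR apps;
// verify against the current platform documentation.
class HandTrackingActivity : ComponentActivity() {
    private val handTrackingPermission = "android.permission.HAND_TRACKING"

    // Standard AndroidX result contract for a single runtime permission.
    private val requestPermission =
        registerForActivityResult(ActivityResultContracts.RequestPermission()) { granted ->
            if (granted) enableCustomGestures() else fallBackToSystemGestures()
        }

    fun ensureHandTracking() {
        if (ContextCompat.checkSelfPermission(this, handTrackingPermission) ==
            PackageManager.PERMISSION_GRANTED
        ) {
            enableCustomGestures()
        } else {
            requestPermission.launch(handTrackingPermission)
        }
    }

    // Hypothetical hooks: wire these to your OpenXR input layer.
    private fun enableCustomGestures() { /* start direct hand-tracking input */ }
    private fun fallBackToSystemGestures() { /* rely on system pinch navigation */ }
}
```

Falling back to the system's pinch navigation when the permission is denied keeps the app usable for users who decline direct hand-tracking access.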
These gestures should be easy to learn and remember, and comfortable to
perform.

When designing gestures, keep in mind that they should be comfortable to perform
repeatedly, and should not require large hand movements or frequent arm lifting,
which can be fatiguing. If you add virtual hands, ensure they are accurately
tracked.

You can also design gestures that mimic real-world actions, such as picking up
or throwing. Familiar gestures may help users understand interactions more
quickly.

Be aware that similarity to system gestures can lead to conflicts or accidental
activation of system functions.

**Voice commands are useful for hands-free interaction**. Users can dictate text
input and perform some app interactions with spoken instructions through
Gemini. For example, a user might say "Open Google Maps" to open that app.

**Eye tracking enables effortless interactions**, such as selecting objects by
looking at them. To minimize eye strain, offer alternative input methods.

| **Note:** To protect user privacy, Android XR doesn't share raw eye tracking data with apps. Instead, the system displays a generic hover effect when it detects that a user is looking at an interactable element, using information from the Android UI framework.

**Peripheral devices**. Android XR supports external devices like Bluetooth
keyboards, mice, and 6DoF controllers. For controllers, ensure intuitive button
mappings, and consider allowing users to remap buttons to suit their
preferences.

Privacy considerations
----------------------

[Android's privacy recommendations](/privacy-and-security/about) apply to building XR apps.
Remember to obtain user consent before collecting any personally identifiable
information, limit user data collection to the essentials, and store the data
securely.

[Follow Android XR's app quality guidelines](/docs/quality-guidelines/android-xr).

*** ** * ** ***

OpenXR™ and the OpenXR logo are trademarks owned by The Khronos Group Inc. and
are registered as a trademark in China, the European Union, Japan, and the
United Kingdom.