Add spatial video to your app

The Jetpack XR SDK supports the playback of stereoscopic side-by-side video onto flat surfaces. With stereoscopic video, each frame consists of a left-eye and a right-eye image to give viewers a sense of depth.

You can render non-stereoscopic 2D video on Android XR apps with the standard media APIs used for Android development on other form factors.
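
For example, the following is a minimal sketch of flat 2D playback with Media3 ExoPlayer and a PlayerView, written the same way you would for a phone or tablet; the view ID and video URL are hypothetical placeholders.

// Standard 2D playback with Media3 ExoPlayer; nothing XR-specific is required.
// R.id.player_view and the URL are placeholders for illustration.
val playerView = findViewById<PlayerView>(R.id.player_view)
val player = ExoPlayer.Builder(this).build()
playerView.player = player
player.setMediaItem(MediaItem.fromUri("https://example.com/flat_video.mp4"))
player.prepare()
player.play()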

Play side-by-side video using the Jetpack XR SDK

With side-by-side video, each stereoscopic frame is presented as two images arranged horizontally adjacent to each other. Top-and-bottom video frames are arranged vertically adjacent to each other.

Side-by-side video is not a codec but rather a way of organizing stereoscopic frames, which means it can be encoded in any of the codecs supported by Android.

Jetpack SceneCore

You can load side-by-side video using Media3 ExoPlayer and then render it using SurfaceEntity. To create a SurfaceEntity, call SurfaceEntity.create, as shown in the following example.

// Create a SurfaceEntity that renders side-by-side stereo frames onto a
// 1 m x 1 m quad placed 1.5 meters along -Z from its parent's origin.
val stereoSurfaceEntity = SurfaceEntity.create(
    xrSession,
    SurfaceEntity.StereoMode.SIDE_BY_SIDE,
    Pose(Vector3(0.0f, 0.0f, -1.5f)),
    SurfaceEntity.CanvasShape.Quad(1.0f, 1.0f)
)
// Build a URI for a side-by-side video bundled with the app.
val videoUri = Uri.Builder()
    .scheme(ContentResolver.SCHEME_ANDROID_RESOURCE)
    .path("sbs_video.mp4")
    .build()
val mediaItem = MediaItem.fromUri(videoUri)

// Attach a Media3 ExoPlayer to the entity's Surface and start playback.
val exoPlayer = ExoPlayer.Builder(this).build()
exoPlayer.setVideoSurface(stereoSurfaceEntity.getSurface())
exoPlayer.setMediaItem(mediaItem)
exoPlayer.prepare()
exoPlayer.play()
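
When playback finishes or the activity goes away, release the resources you created. The following is a minimal sketch, assuming exoPlayer and stereoSurfaceEntity are kept as activity properties and that the entity is cleaned up with its dispose() method.

override fun onDestroy() {
    super.onDestroy()
    // Free the codec and playback resources held by ExoPlayer.
    exoPlayer.release()
    // Remove the SurfaceEntity from the scene and release its resources.
    stereoSurfaceEntity.dispose()
}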

Jetpack Compose for XR

Starting in version 1.0.0-alpha04, Jetpack Compose for XR offers another way to load and render side-by-side video. Use SpatialExternalSurface, a subspace composable that creates and manages the Surface into which the app can draw content, such as an image or video. For more details on SpatialExternalSurface, see the Develop UI with Compose for XR guide.

This example demonstrates how to load side-by-side video using Media3 ExoPlayer and SpatialExternalSurface.

@Composable
fun SpatialExternalSurfaceContent() {
    val context = LocalContext.current
    Subspace {
        SpatialExternalSurface(
            modifier = SubspaceModifier
                .width(1200.dp) // Default width is 400.dp if no width modifier is specified
                .height(676.dp), // Default height is 400.dp if no height modifier is specified
            // Use StereoMode.Mono, StereoMode.SideBySide, or StereoMode.TopBottom, depending
            // upon which type of content you are rendering: monoscopic content, side-by-side stereo
            // content, or top-bottom stereo content
            stereoMode = StereoMode.SideBySide,
        ) {
            val exoPlayer = remember { ExoPlayer.Builder(context).build() }
            val videoUri = Uri.Builder()
                .scheme(ContentResolver.SCHEME_ANDROID_RESOURCE)
                // Represents a side-by-side stereo video, where each frame contains a pair of
                // video frames arranged side-by-side. The frame on the left represents the left
                // eye view, and the frame on the right represents the right eye view.
                .path("sbs_video.mp4")
                .build()
            val mediaItem = MediaItem.fromUri(videoUri)

            // onSurfaceCreated is invoked only one time, when the Surface is created
            onSurfaceCreated { surface ->
                exoPlayer.setVideoSurface(surface)
                exoPlayer.setMediaItem(mediaItem)
                exoPlayer.prepare()
                exoPlayer.play()
            }
            // onSurfaceDestroyed is invoked when the SpatialExternalSurface composable and its
            // associated Surface are destroyed
            onSurfaceDestroyed { exoPlayer.release() }
        }
    }
}
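
To show this content, set the composable as your activity's content the same way you would any other Compose UI. The following is a minimal sketch; the activity class name is a placeholder, and setContent comes from androidx.activity.compose.

class SpatialVideoActivity : ComponentActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // SpatialExternalSurfaceContent already opens its own Subspace,
        // so it can be set directly as the activity content.
        setContent { SpatialExternalSurfaceContent() }
    }
}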

Play 180-degree and 360-degree video using the Jetpack XR SDK

SurfaceEntity supports playback of 180° videos on hemispherical surfaces and 360° videos on spherical surfaces. If the videos are stereoscopic, set the StereoMode that matches the frame layout of the file, such as SIDE_BY_SIDE or TOP_BOTTOM.

The following code shows how to set up SurfaceEntity for playback on a 180° hemisphere and a 360° sphere. When using these canvas shapes, position the surface using the user's head pose to provide an immersive experience.

// Set up the surface for playing a 180° video on a hemisphere.
val hemisphereStereoSurfaceEntity =
    SurfaceEntity.create(
        xrSession,
        SurfaceEntity.StereoMode.SIDE_BY_SIDE,
        xrSession.scene.spatialUser.head?.transformPoseTo(
            Pose.Identity,
            xrSession.scene.activitySpace
        )!!,
        SurfaceEntity.CanvasShape.Vr180Hemisphere(1.0f),
    )
// ... and use the surface for playing the media.

// Set up the surface for playing a 360° video on a sphere.
val sphereStereoSurfaceEntity =
    SurfaceEntity.create(
        xrSession,
        SurfaceEntity.StereoMode.TOP_BOTTOM,
        xrSession.scene.spatialUser.head?.transformPoseTo(
            Pose.Identity,
            xrSession.scene.activitySpace
        )!!,
        SurfaceEntity.CanvasShape.Vr360Sphere(1.0f),
    )
// ... and use the surface for playing the media.
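
Attaching media to these surfaces works the same way as with the quad canvas shape shown earlier: route an ExoPlayer's video output into the entity's Surface. Here is a brief sketch for the hemisphere entity, assuming videoUri points to a 180° side-by-side file.

// Play the 180° side-by-side video on the hemispherical surface.
val exoPlayer = ExoPlayer.Builder(this).build()
exoPlayer.setVideoSurface(hemisphereStereoSurfaceEntity.getSurface())
exoPlayer.setMediaItem(MediaItem.fromUri(videoUri))
exoPlayer.prepare()
exoPlayer.play()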