DefaultVideoFrameProcessor


@UnstableApi
class DefaultVideoFrameProcessor : VideoFrameProcessor


A VideoFrameProcessor implementation that applies GlEffect instances using OpenGL on a background thread.

When using surface input (INPUT_TYPE_SURFACE or INPUT_TYPE_SURFACE_AUTOMATIC_FRAME_REGISTRATION), the surface's format must be supported for sampling as an external texture in OpenGL. When an android.media.MediaCodec decoder is writing to the input surface, the default SDR color format is supported. When another producer (such as an android.media.ImageWriter) is writing to the input surface, RGBA_8888 is supported for SDR data. Support for other formats may be device-dependent.

Summary

Nested types

A factory for DefaultVideoFrameProcessor instances.

A builder for DefaultVideoFrameProcessor.Factory instances.

Releases the output information stored for textures before and at presentationTimeUs.

@Documented
@Retention(value = RetentionPolicy.SOURCE)
@Target(value = TYPE_USE)
@IntDef(value = [WORKING_COLOR_SPACE_DEFAULT, WORKING_COLOR_SPACE_ORIGINAL, WORKING_COLOR_SPACE_LINEAR])
annotation DefaultVideoFrameProcessor.WorkingColorSpace

Specifies the color space that frames passed to intermediate GlShaderPrograms will be represented in.

Constants

const Int

Use BT709 color primaries with the standard SDR transfer function (SMPTE 170m) as the working color space.

const Int

The working color space will have the same primaries as the input and a linear transfer function.

const Int

Use the original color space of the input as the working color space when the input is SDR.

Public functions

Unit

Flushes the VideoFrameProcessor.

Surface!

Returns the input Surface, from which the VideoFrameProcessor consumes input frames.

Int

Returns the number of input frames that have been made available to the VideoFrameProcessor but have not been processed yet.

VideoFrameProcessingTaskExecutor!

Returns the task executor that runs video frame processing tasks.

Boolean
queueInputBitmap(
    inputBitmap: Bitmap!,
    timestampIterator: TimestampIterator!
)

Provides an input Bitmap to the VideoFrameProcessor.

Boolean
queueInputTexture(textureId: Int, presentationTimeUs: Long)

Provides an input texture ID to the VideoFrameProcessor.

Boolean

Informs the VideoFrameProcessor that a frame will be queued to its input surface.

Unit
registerInputStream(
    @VideoFrameProcessor.InputType inputType: Int,
    effects: (Mutable)List<Effect!>!,
    frameInfo: FrameInfo!
)

Informs the VideoFrameProcessor that a new input stream will be queued, along with the list of Effects to apply to it.

Unit

Releases all resources.

Unit
renderOutputFrame(renderTimeNs: Long)

Renders the oldest unrendered output frame that has become available for rendering (see Listener.onOutputFrameAvailableForRendering(long)) at the given renderTimeNs.

Unit
setInputDefaultBufferSize(width: Int, height: Int)

This function is deprecated.

Set the input type to INPUT_TYPE_SURFACE_AUTOMATIC_FRAME_REGISTRATION instead, which sets the default buffer size automatically based on the registered frame info.

Unit

Sets the OnInputFrameProcessedListener.

Unit

Sets a listener that's called when the input surface is ready to use.

Unit
setOutputSurfaceInfo(outputSurfaceInfo: SurfaceInfo?)

Sets the output surface and supporting information.

Unit

Informs the VideoFrameProcessor that no further input frames should be accepted.

Inherited Constants

From androidx.media3.common.VideoFrameProcessor
const Long

Indicates the frame should be dropped after renderOutputFrame is invoked.

const Int

Input frames come from a Bitmap.

const Int

Input frames come from a surface.

const Int

Input frames come from the input surface and don't need to be registered (unlike with INPUT_TYPE_SURFACE).

const Int

Input frames come from a traditional GLES texture.

const Long

Indicates the frame should be rendered immediately after renderOutputFrame is invoked.

const Long

Indicates the frame should preserve the input presentation time when renderOutputFrame is invoked.

Constants

WORKING_COLOR_SPACE_DEFAULT

const val WORKING_COLOR_SPACE_DEFAULT = 0: Int

Use BT709 color primaries with the standard SDR transfer function (SMPTE 170m) as the working color space.

Any SDR content in a different color space will be transferred to this one.

WORKING_COLOR_SPACE_LINEAR

const val WORKING_COLOR_SPACE_LINEAR = 2: Int

The working color space will have the same primaries as the input and a linear transfer function.

This option is not recommended for SDR content because 8-bit colors are used in SDR processing, which may lead to color banding. It may also cause effects that modify a frame's output colors (for example, overlays) to produce incorrect colors.

WORKING_COLOR_SPACE_ORIGINAL

const val WORKING_COLOR_SPACE_ORIGINAL = 1: Int

Use the original color space of the input as the working color space when the input is SDR.

Tonemapped HDR content will be represented with BT709 color primaries and the standard SDR transfer function (SMPTE 170m).

No color transfers will be applied when the input is SDR.
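To make the distinction between these working color spaces concrete, the sketch below implements the standard SDR (SMPTE 170M / BT.709) opto-electronic transfer function; WORKING_COLOR_SPACE_LINEAR corresponds to skipping this curve entirely (an identity transfer). The constants come from BT.709; the class name is illustrative, not part of the Media3 API.

```java
// BT.709 / SMPTE 170M opto-electronic transfer function: a linear segment
// near black and a power-law segment elsewhere. Illustrative only; Media3
// applies this on the GPU, not via a Java method like this.
public class Smpte170m {
    static double oetf(double linear) {
        return linear < 0.018
            ? 4.5 * linear
            : 1.099 * Math.pow(linear, 0.45) - 0.099;
    }
}
```

The piecewise form is why intermediate 8-bit processing in a non-linear space avoids the banding risk noted for WORKING_COLOR_SPACE_LINEAR: code values are spent where the eye is most sensitive.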

Public functions

flush

fun flush(): Unit

Flushes the VideoFrameProcessor.

All frames registered via registerInputFrame prior to calling this method are no longer considered registered when this method returns.

Listener methods invoked prior to calling this method should be ignored.

The downstream frame consumer must be flushed before this instance is flushed, and stop accepting input until this DefaultVideoFrameProcessor instance finishes flushing.

After this method is called, any object consuming texture output must not access any output textures that were rendered before calling this method.

getInputSurface

fun getInputSurface(): Surface!

Returns the input Surface, from which the VideoFrameProcessor consumes input frames.

The frames arriving on the Surface will not be consumed by the VideoFrameProcessor until registerInputStream is called with INPUT_TYPE_SURFACE.

For streams with INPUT_TYPE_SURFACE, the returned surface is ready to use immediately and will not have a default buffer size set on it. This is suitable for configuring a decoder.

For streams with INPUT_TYPE_SURFACE_AUTOMATIC_FRAME_REGISTRATION, set a listener for the surface becoming ready via setOnInputSurfaceReadyListener and wait for the event before using the returned surface. This is suitable for use with non-decoder producers like media projection.

Throws
java.lang.UnsupportedOperationException

If the VideoFrameProcessor does not accept surface input.

getPendingInputFrameCount

fun getPendingInputFrameCount(): Int

Returns the number of input frames that have been made available to the VideoFrameProcessor but have not been processed yet.

getTaskExecutor

@VisibleForTesting
fun getTaskExecutor(): VideoFrameProcessingTaskExecutor!

Returns the task executor that runs video frame processing tasks.

queueInputBitmap

fun queueInputBitmap(
    inputBitmap: Bitmap!,
    timestampIterator: TimestampIterator!
): Boolean

Provides an input Bitmap to the VideoFrameProcessor.

Can be called many times after registering the input stream to put multiple frames in the same input stream.

Parameters
inputBitmap: Bitmap!

The Bitmap queued to the VideoFrameProcessor.

timestampIterator: TimestampIterator!

A TimestampIterator generating the exact timestamps that the bitmap should be shown at.

Returns
Boolean

Whether the Bitmap was successfully queued. A return value of false indicates the VideoFrameProcessor is not ready to accept input.

Throws
java.lang.UnsupportedOperationException

If the VideoFrameProcessor does not accept bitmap input.
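The TimestampIterator supplies the exact presentation timestamps at which the queued bitmap should be shown. Media3 ships implementations such as ConstantRateTimestampIterator; the plain-Java sketch below, with hypothetical names, only mirrors the arithmetic of a constant-rate iterator to show what those timestamps look like.

```java
import java.util.ArrayList;
import java.util.List;

// Plain-Java sketch of a constant-rate timestamp generator: evenly spaced
// presentation timestamps, in microseconds, covering durationUs at
// frameRate fps. Not the Media3 implementation.
public class TimestampSketch {
    static List<Long> constantRateTimestampsUs(long durationUs, float frameRate) {
        long frameDurationUs = (long) (1_000_000f / frameRate);
        List<Long> timestampsUs = new ArrayList<>();
        for (long t = 0; t < durationUs; t += frameDurationUs) {
            timestampsUs.add(t);
        }
        return timestampsUs;
    }
}
```

For example, a one-second duration at 25 fps yields 25 timestamps, 40,000 µs apart, starting at 0.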

queueInputTexture

fun queueInputTexture(textureId: Int, presentationTimeUs: Long): Boolean

Provides an input texture ID to the VideoFrameProcessor.

Must only be called after setOnInputFrameProcessedListener and registerInputStream have been called.

Parameters
textureId: Int

The ID of the texture queued to the VideoFrameProcessor.

presentationTimeUs: Long

The presentation time of the queued texture, in microseconds.

Returns
Boolean

Whether the texture was successfully queued. A return value of false indicates the VideoFrameProcessor is not ready to accept input.

registerInputFrame

fun registerInputFrame(): Boolean

Informs the VideoFrameProcessor that a frame will be queued to its input surface.

Must be called before rendering a frame to the input surface. The caller must not render frames to the input surface when false is returned.

Returns
Boolean

Whether the input frame was successfully registered. If registerInputStream is called, this method returns false until onInputStreamRegistered is called. Otherwise, a return value of false indicates the VideoFrameProcessor is not ready to accept input.

Throws
java.lang.UnsupportedOperationException

If the VideoFrameProcessor does not accept surface input.

java.lang.IllegalStateException

If called after signalEndOfInput or before registerInputStream.
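The contract above can be sketched as a loop that registers each frame before rendering it and skips frames when registration fails. The BooleanSupplier stands in for registerInputFrame, and the commented-out renderToInputSurface call is hypothetical:

```java
import java.util.function.BooleanSupplier;

// Sketch of the register-then-render contract for surface input: each frame
// must be registered successfully before it is rendered to the input surface,
// and a false return means the frame must not be rendered.
public class RegisterBeforeRender {
    static int renderFrames(int frameCount, BooleanSupplier registerInputFrame) {
        int rendered = 0;
        for (int i = 0; i < frameCount; i++) {
            if (registerInputFrame.getAsBoolean()) {
                // renderToInputSurface();  // hypothetical: render only after success
                rendered++;
            }
            // On false, the frame is skipped (or retried later); rendering it
            // anyway would violate the VideoFrameProcessor contract.
        }
        return rendered;
    }
}
```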

registerInputStream

fun registerInputStream(
    @VideoFrameProcessor.InputType inputType: Int,
    effects: (Mutable)List<Effect!>!,
    frameInfo: FrameInfo!
): Unit

Informs the VideoFrameProcessor that a new input stream will be queued, along with the list of Effects to apply to it.

After registering the first input stream, this method must only be called after the last frame of the already-registered input stream has been registered (registerInputFrame), the last bitmap queued (queueInputBitmap), or the last texture ID queued (queueInputTexture).

This method blocks the calling thread until previous calls to this method finish, that is, until Listener.onInputStreamRegistered(int, List, FrameInfo) is called after the underlying processing pipeline has been adapted to the registered input stream.

Using HDR colorInfo requires OpenGL ES 3.0 and the EXT_YUV_target OpenGL extension.

Effects are applied on COLOR_RANGE_FULL colors with a null hdrStaticInfo.

If either the input colorInfo or the outputColorInfo is HDR (per isTransferHdr), textures will use GL_RGBA16F and GL_HALF_FLOAT. Otherwise, textures will use GL_RGBA and GL_UNSIGNED_BYTE.

If the input color is HDR, but outputColorInfo is SDR, then HDR-to-SDR tone-mapping is applied, and outputColorInfo's colorTransfer must be COLOR_TRANSFER_GAMMA_2_2 or COLOR_TRANSFER_SDR. In this case, the actual output transfer function will be COLOR_TRANSFER_GAMMA_2_2, for consistency with other tone-mapping and color behavior in the Android ecosystem (for example, MediaFormat's COLOR_TRANSFER_SDR_VIDEO is defined as SMPTE 170M, but most OEMs process it as Gamma 2.2).
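The gamma 2.2 transfer used for tone-mapped output is a plain power law, unlike SMPTE 170M's piecewise curve. A minimal sketch of the encode direction (the class name is illustrative, not a Media3 type):

```java
// Gamma 2.2 encode: linear light in, display-referred code value out.
// This is the output transfer applied after HDR-to-SDR tone-mapping,
// per the note above. Illustrative only.
public class Gamma22 {
    static double encode(double linear) {
        return Math.pow(linear, 1.0 / 2.2);
    }
}
```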

release

fun release(): Unit

Releases all resources.

If the VideoFrameProcessor is released before it has ended, it will attempt to cancel processing any input frames that have already become available. Input frames that become available after release are ignored.

This method blocks until all resources are released or releasing times out.

This VideoFrameProcessor instance must not be used after this method is called.

renderOutputFrame

fun renderOutputFrame(renderTimeNs: Long): Unit

Renders the oldest unrendered output frame that has become available for rendering (see Listener.onOutputFrameAvailableForRendering(long)) at the given renderTimeNs.

This will either render the output frame to the output surface set via setOutputSurfaceInfo, or drop the frame, per renderTimeNs.

This method must only be called if renderFramesAutomatically was set to false when creating the VideoFrameProcessor, and should be called exactly once for each frame that becomes available for rendering.

The renderTimeNs may be passed to EGLExt.eglPresentationTimeANDROID, depending on the implementation.

If texture output is set, calling this method will be a no-op.

setInputDefaultBufferSize

fun setInputDefaultBufferSize(width: Int, height: Int): Unit

Sets the default size for input buffers, for the case where the producer providing input does not override the buffer size.

When input comes from a media codec it's not necessary to call this method because the codec (producer) sets the buffer size automatically. For the case where input comes from CameraX, call this method after instantiation to ensure that buffers are handled at full resolution. See setDefaultBufferSize for more information.

This method must only be called when the VideoFrameProcessor is created with INPUT_TYPE_SURFACE.

Parameters
width: Int

The default width for input buffers, in pixels.

height: Int

The default height for input buffers, in pixels.

setOnInputFrameProcessedListener

fun setOnInputFrameProcessedListener(
    listener: OnInputFrameProcessedListener!
): Unit

Sets the OnInputFrameProcessedListener.

setOnInputSurfaceReadyListener

fun setOnInputSurfaceReadyListener(listener: Runnable!): Unit

Sets a listener that's called when the input surface is ready to use.

setOutputSurfaceInfo

fun setOutputSurfaceInfo(outputSurfaceInfo: SurfaceInfo?): Unit

Sets the output surface and supporting information. When output frames are rendered and not dropped, they will be rendered to this output surface.

The new output is applied from the next output frame rendered onwards. If the output surface is null, the VideoFrameProcessor will stop rendering pending frames and resume rendering once a non-null surface is set.

If the dimensions given in outputSurfaceInfo do not match the output size after applying the final effect (see Listener.onOutputSizeChanged(int, int)), the frames are resized before rendering to the surface and letter/pillar-boxing is applied.

The caller is responsible for tracking the lifecycle of the SurfaceInfo.surface, including calling this method with a new surface if it is destroyed. When this method returns, the previous output surface is no longer being used and can safely be released by the caller.

If texture output is set, calling this method will be a no-op.
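The letter/pillar-boxing described above amounts to an aspect-ratio-preserving fit: scale the frame to the largest size that fits inside the surface, leaving bars on the remaining axis. A minimal sketch (names are illustrative, not Media3 API):

```java
// Aspect-ratio-preserving fit: returns the scaled {width, height} of a frame
// fitted inside the output surface. The unused surface area becomes the
// letterbox (horizontal bars) or pillarbox (vertical bars).
public class LetterboxFit {
    static int[] fit(int frameWidth, int frameHeight, int surfaceWidth, int surfaceHeight) {
        float scale = Math.min(
            (float) surfaceWidth / frameWidth,
            (float) surfaceHeight / frameHeight);
        return new int[] {Math.round(frameWidth * scale), Math.round(frameHeight * scale)};
    }
}
```

For example, a 1920x1080 frame on a 1080x1080 surface scales to 1080x608, leaving horizontal bars above and below.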

signalEndOfInput

fun signalEndOfInput(): Unit

Informs the VideoFrameProcessor that no further input frames should be accepted.

Throws
java.lang.IllegalStateException

If called more than once.