Image

public abstract class Image extends Object implements AutoCloseable

| java.lang.Object | |
|---|---|
| ↳ | android.media.Image |
A single complete image buffer to use with a media source such as a
 MediaCodec or a
 CameraDevice.
This class allows for efficient direct application access to the pixel
 data of the Image through one or more
 ByteBuffers. Each buffer is encapsulated in a
 Plane that describes the layout of the pixel data in that plane. Due
 to this direct access, and unlike the Bitmap class,
 Images are not directly usable as UI resources.
Since Images are often directly produced or consumed by hardware components, they are a limited resource shared across the system, and should be closed as soon as they are no longer needed.
For example, when using the ImageReader class to read out Images
 from various media sources, not closing old Image objects will prevent the
 availability of new Images once
 the maximum outstanding image count is
 reached. When this happens, the function acquiring new Images will typically
 throw an IllegalStateException.
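A minimal sketch of this acquire-and-close pattern, assuming the app picks its own resolution, format, and maxImages (the values below are illustrative, not required by the API):

```java
import android.graphics.ImageFormat;
import android.media.Image;
import android.media.ImageReader;
import android.os.Handler;
import android.os.Looper;

// Sketch: read Images from an ImageReader and close each one promptly so that
// new Images can still be acquired before the maximum outstanding count is hit.
// The size, format, and maxImages values below are illustrative placeholders.
ImageReader reader = ImageReader.newInstance(
        1920, 1080, ImageFormat.YUV_420_888, /* maxImages= */ 3);
Handler handler = new Handler(Looper.getMainLooper());
reader.setOnImageAvailableListener(r -> {
    Image image = r.acquireLatestImage();   // may be null if no image is ready
    if (image == null) return;
    try {
        // ... read pixel data via image.getPlanes() here ...
    } finally {
        image.close();                      // return the buffer for reuse
    }
}, handler);
```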
Summary
| Nested classes | |
|---|---|
| class | Image.Plane: A single color plane of image data. |
| Public methods | |
|---|---|
| abstract void | close(): Free up this frame for reuse. |
| Rect | getCropRect(): Get the crop rectangle associated with this frame. |
| int | getDataSpace(): Get the dataspace associated with this frame. |
| SyncFence | getFence(): Get the SyncFence object associated with this frame. |
| abstract int | getFormat(): Get the format for this image. |
| HardwareBuffer | getHardwareBuffer(): Get the HardwareBuffer handle of the input image intended for GPU and/or hardware access. |
| abstract int | getHeight(): The height of the image in pixels. |
| abstract Plane[] | getPlanes(): Get the array of pixel planes for this Image. |
| abstract long | getTimestamp(): Get the timestamp associated with this frame. |
| abstract int | getWidth(): The width of the image in pixels. |
| void | setCropRect(Rect cropRect): Set the crop rectangle associated with this frame. |
| void | setDataSpace(int dataSpace): Set the dataspace associated with this frame. |
| void | setFence(SyncFence fence): Set the fence file descriptor with this frame. |
| void | setTimestamp(long timestamp): Set the timestamp associated with this frame. |

| Inherited methods | |
|---|---|
Public methods
close
public abstract void close ()
Free up this frame for reuse.
 After calling this method, calling any methods on this Image will
 result in an IllegalStateException, and attempting to read from
 or write to ByteBuffers returned by an earlier
 Plane.getBuffer call will have undefined behavior. If the image
 was obtained from ImageWriter via
 ImageWriter.dequeueInputImage(), after calling this method, any
 image data filled by the application will be lost and the image will be
 returned to ImageWriter for reuse. Images given to
 queueInputImage() are automatically
 closed.
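Because Image implements AutoCloseable, a try-with-resources block is one way to guarantee close() runs even if processing throws. A sketch, assuming an ImageReader named reader:

```java
// "reader" is assumed to be an ImageReader supplying this Image.
try (Image image = reader.acquireNextImage()) {
    if (image != null) {
        int format = image.getFormat();
        // ... process the image before the block ends ...
    }
}
// The Image is closed here; using it or its ByteBuffers now would be invalid.
```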
 
getCropRect
public Rect getCropRect ()
Get the crop rectangle associated with this frame.
The crop rectangle specifies the region of valid pixels in the image, using coordinates in the largest-resolution plane.
| Returns | |
|---|---|
| Rect | |
getDataSpace
public int getDataSpace ()
Get the dataspace associated with this frame.
| Returns | |
|---|---|
| int | Value is either 0 or a combination of DataSpace.DATASPACE_DEPTH, DataSpace.DATASPACE_DYNAMIC_DEPTH, DataSpace.DATASPACE_HEIF, DataSpace.DATASPACE_HEIF_ULTRAHDR, DataSpace.DATASPACE_JPEG_R, DataSpace.DATASPACE_UNKNOWN, DataSpace.DATASPACE_SCRGB_LINEAR, DataSpace.DATASPACE_SRGB, DataSpace.DATASPACE_SCRGB, DataSpace.DATASPACE_DISPLAY_P3, DataSpace.DATASPACE_BT2020_HLG, DataSpace.DATASPACE_BT2020_PQ, DataSpace.DATASPACE_ADOBE_RGB, DataSpace.DATASPACE_JFIF, DataSpace.DATASPACE_BT601_625, DataSpace.DATASPACE_BT601_525, DataSpace.DATASPACE_BT2020, DataSpace.DATASPACE_BT709, DataSpace.DATASPACE_DCI_P3, DataSpace.DATASPACE_SRGB_LINEAR, and android.hardware.DataSpace.DATASPACE_DISPLAY_BT2020 |
getFence
public SyncFence getFence ()
Get the SyncFence object associated with this frame.
This function returns an invalid SyncFence after getPlanes() on the image
 dequeued from ImageWriter via ImageWriter.dequeueInputImage().
| Returns | |
|---|---|
| SyncFence | The SyncFence for this frame. This value cannot be null. |
| Throws | |
|---|---|
| IOException | if there is an error when retrieving the SyncFence object. |
getFormat
public abstract int getFormat ()
Get the format for this image. This format determines the number of ByteBuffers needed to represent the image, and the general layout of the pixel data in each ByteBuffer.
 The format is one of the values from
 ImageFormat,
 PixelFormat, or
 HardwareBuffer. The mapping between the
 formats and the planes is as follows (any formats not listed will have 1 plane):
 
| Format | Plane count | Layout details | 
|---|---|---|
| JPEG | 1 | Compressed data, so row and pixel strides are 0. To uncompress, use BitmapFactory#decodeByteArray. | 
| YUV_420_888 | 3 | A luminance plane followed by the Cb and Cr chroma planes. The chroma planes have half the width and height of the luminance plane (4:2:0 subsampling). Each pixel sample in each plane has 8 bits. Each plane has its own row stride and pixel stride. | 
| YUV_422_888 | 3 | A luminance plane followed by the Cb and Cr chroma planes. The chroma planes have half the width and the full height of the luminance plane (4:2:2 subsampling). Each pixel sample in each plane has 8 bits. Each plane has its own row stride and pixel stride. | 
| YUV_444_888 | 3 | A luminance plane followed by the Cb and Cr chroma planes. The chroma planes have the same width and height as that of the luminance plane (4:4:4 subsampling). Each pixel sample in each plane has 8 bits. Each plane has its own row stride and pixel stride. | 
| FLEX_RGB_888 | 3 | A R (red) plane followed by the G (green) and B (blue) planes. All planes have the same widths and heights. Each pixel sample in each plane has 8 bits. Each plane has its own row stride and pixel stride. | 
| FLEX_RGBA_8888 | 4 | A R (red) plane followed by the G (green), B (blue), and A (alpha) planes. All planes have the same widths and heights. Each pixel sample in each plane has 8 bits. Each plane has its own row stride and pixel stride. | 
| RAW_SENSOR | 1 | A single plane of raw sensor image data, with 16 bits per color sample. The details of the layout need to be queried from the source of the raw sensor data, such as CameraDevice. | 
| RAW_PRIVATE | 1 | A single plane of raw sensor image data with a private layout. The details of the layout are implementation specific. Row stride and pixel stride are undefined for this format. Calling Plane.getRowStride() or Plane.getPixelStride() on a RAW_PRIVATE image will cause an UnsupportedOperationException to be thrown. | 
| HEIC | 1 | Compressed data, so row and pixel strides are 0. To uncompress, use BitmapFactory#decodeByteArray. | 
| YCBCR_P010 | 3 | P010 is a 4:2:0 YCbCr semiplanar format comprised of a WxH Y plane followed by a Wx(H/2) Cb and Cr planes. Each sample is represented by a 16-bit little-endian value, with the lower 6 bits set to zero. Since this is guaranteed to be a semi-planar format, the Cb plane can also be treated as an interleaved Cb/Cr plane. | 
| YCBCR_P210 | 3 | P210 is a 4:2:2 YCbCr semiplanar format comprised of a WxH Y plane followed by a WxH Cb and Cr planes. Each sample is represented by a 16-bit little-endian value, with the lower 6 bits set to zero. Since this is guaranteed to be a semi-planar format, the Cb plane can also be treated as an interleaved Cb/Cr plane. | 
| Returns | |
|---|---|
| int | |
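As a rough illustration of how the format drives plane handling, assuming an open Image named image (the JPEG/HEIC branch follows the BitmapFactory.decodeByteArray suggestion in the table above):

```java
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.ImageFormat;
import java.nio.ByteBuffer;

// Sketch: branch on getFormat() to pick a plane layout, per the table above.
int format = image.getFormat();
if (format == ImageFormat.JPEG || format == ImageFormat.HEIC) {
    // Compressed single-plane formats: copy the bytes out and decode them.
    ByteBuffer buffer = image.getPlanes()[0].getBuffer();
    byte[] bytes = new byte[buffer.remaining()];
    buffer.get(bytes);
    Bitmap bitmap = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);
} else if (format == ImageFormat.YUV_420_888) {
    // Three 8-bit planes (Y, Cb, Cr); see getPlanes() below for stride handling.
}
```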
getHardwareBuffer
public HardwareBuffer getHardwareBuffer ()
Get the HardwareBuffer handle of the input image
 intended for GPU and/or hardware access.
 
 The returned HardwareBuffer shall not be used
 after  Image.close() has been called.
 
| Returns | |
|---|---|
| HardwareBuffer | the HardwareBuffer associated with this Image or null if this Image doesn't support this feature. (Unsupported use cases include Image instances obtained through MediaCodec, and on versions prior to Android P, ImageWriter). | 
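A brief sketch, assuming image is an open Image whose source supports this feature:

```java
import android.hardware.HardwareBuffer;

// "image" is assumed to be an open Image from a source that supports hardware
// buffers (for example, not an Image obtained through MediaCodec).
HardwareBuffer hardwareBuffer = image.getHardwareBuffer();
if (hardwareBuffer != null) {
    // ... hand the buffer to GPU / hardware consumers here ...
}
// Per the note above, stop using hardwareBuffer once image.close() is called.
```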
getHeight
public abstract int getHeight ()
The height of the image in pixels. For formats where some color channels are subsampled, this is the height of the largest-resolution plane.
| Returns | |
|---|---|
| int | |
getPlanes
public abstract Plane[] getPlanes ()
Get the array of pixel planes for this Image. The number of planes is
 determined by the format of the Image. The application will get an empty
 array if the image format is PRIVATE, because the image pixel data is not directly accessible. The
 application can check the image format by calling
 Image.getFormat().
| Returns | |
|---|---|
| Plane[] | |
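A sketch of reading the luminance plane of a YUV_420_888 Image while honoring the plane's row and pixel strides (image is assumed to be open and non-null):

```java
import android.media.Image;
import java.nio.ByteBuffer;

// Read one 8-bit luma sample per pixel from the Y plane of a YUV_420_888 Image.
Image.Plane yPlane = image.getPlanes()[0];   // plane 0 is Y for YUV_420_888
ByteBuffer yBuffer = yPlane.getBuffer();
int rowStride = yPlane.getRowStride();
int pixelStride = yPlane.getPixelStride();
for (int y = 0; y < image.getHeight(); y++) {
    for (int x = 0; x < image.getWidth(); x++) {
        int luma = yBuffer.get(y * rowStride + x * pixelStride) & 0xFF;
        // ... use the 8-bit luma sample ...
    }
}
```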
getTimestamp
public abstract long getTimestamp ()
Get the timestamp associated with this frame.
 The timestamp is measured in nanoseconds, and is normally monotonically
 increasing. The timestamps for images from different sources may have
 different timebases and therefore may not be comparable. The specific meaning and
 timebase of the timestamp depend on the source providing images. See
 Camera,
 CameraDevice,
 MediaPlayer and MediaCodec for more details.
 
| Returns | |
|---|---|
| long | |
getWidth
public abstract int getWidth ()
The width of the image in pixels. For formats where some color channels are subsampled, this is the width of the largest-resolution plane.
| Returns | |
|---|---|
| int | |
setCropRect
public void setCropRect (Rect cropRect)
Set the crop rectangle associated with this frame.
The crop rectangle specifies the region of valid pixels in the image, using coordinates in the largest-resolution plane.
| Parameters | |
|---|---|
| cropRect | Rect | 
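For illustration, marking only a 1280x720 region as valid (the size here is an arbitrary example):

```java
import android.graphics.Rect;

// Coordinates are in the largest-resolution plane, as noted above.
image.setCropRect(new Rect(0, 0, 1280, 720));
```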
setDataSpace
public void setDataSpace (int dataSpace)
Set the dataspace associated with this frame.
 If the dataspace for an image is not set, the dataspace value depends on the Surface
 that is provided in the ImageWriter constructor.
 
| Parameters | |
|---|---|
| dataSpace | int: The Dataspace to be set for this image. Value is either 0 or a combination of DataSpace.DATASPACE_DEPTH, DataSpace.DATASPACE_DYNAMIC_DEPTH, DataSpace.DATASPACE_HEIF, DataSpace.DATASPACE_HEIF_ULTRAHDR, DataSpace.DATASPACE_JPEG_R, DataSpace.DATASPACE_UNKNOWN, DataSpace.DATASPACE_SCRGB_LINEAR, DataSpace.DATASPACE_SRGB, DataSpace.DATASPACE_SCRGB, DataSpace.DATASPACE_DISPLAY_P3, DataSpace.DATASPACE_BT2020_HLG, DataSpace.DATASPACE_BT2020_PQ, DataSpace.DATASPACE_ADOBE_RGB, DataSpace.DATASPACE_JFIF, DataSpace.DATASPACE_BT601_625, DataSpace.DATASPACE_BT601_525, DataSpace.DATASPACE_BT2020, DataSpace.DATASPACE_BT709, DataSpace.DATASPACE_DCI_P3, DataSpace.DATASPACE_SRGB_LINEAR, and android.hardware.DataSpace.DATASPACE_DISPLAY_BT2020 |
setFence
public void setFence (SyncFence fence)
Set the fence file descriptor with this frame.
| Parameters | |
|---|---|
| fence | SyncFence: The fence file descriptor to be set for this frame. This value cannot be null. |
| Throws | |
|---|---|
| IOException | if there is an error when setting a SyncFence. | 
setTimestamp
public void setTimestamp (long timestamp)
Set the timestamp associated with this frame.
 The timestamp is measured in nanoseconds, and is normally monotonically
 increasing. The timestamps for images from different sources may have
 different timebases and therefore may not be comparable. The specific meaning and
 timebase of the timestamp depend on the source providing images. See
 Camera,
 CameraDevice,
 MediaPlayer and MediaCodec for more details.
 
 For images dequeued from ImageWriter via
 ImageWriter.dequeueInputImage(), it's up to the application to
 set the timestamps correctly before sending them back to the
 ImageWriter, or the timestamp will be generated automatically when
 queueInputImage() is called.
 
| Parameters | |
|---|---|
| timestamp | long: The timestamp to be set for this image. | 
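A sketch of the ImageWriter flow described above; writer and frameTimestampNs are assumed to exist in the surrounding code:

```java
import android.media.Image;
import android.media.ImageWriter;

// "writer" is assumed to be an ImageWriter, and "frameTimestampNs" a timestamp
// in nanoseconds obtained from the source producing the frame.
Image inputImage = writer.dequeueInputImage();
// ... fill inputImage's planes with pixel data ...
inputImage.setTimestamp(frameTimestampNs);
writer.queueInputImage(inputImage);   // queueInputImage() also closes the Image
```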
