Providing audio playback for Auto

Drivers want to access their music and other audio content on the road. Audio books, podcasts, sports commentary, and recorded talks can make a long trip educational, inspirational, and enjoyable. The Android framework allows you to extend your audio app so users can listen to their favorite tunes and audio content using a simple, yet customizable user interface.

Apps running on mobile devices with Android 5.0 or higher can provide audio services for Android Auto. By configuring your app with a few settings and implementing a service for accessing music tracks, you can enable Android Auto to discover your app and provide a browse and playback interface for your app's audio content.

This class assumes that you have built an app that plays audio through an Android device's integrated speakers or connected headphones. It describes how to extend your app to allow Android Auto to browse your content listings and play them through a car stereo system.

Provide audio services

Audio apps do not directly control a car dashboard device or a phone that runs Android Auto. When the user connects an Android mobile device to a dashboard system or launches Android Auto on a phone, Android Auto discovers your app through manifest entries that indicate what audio services your app can provide. The Android Auto user interface displays a launcher icon for your app as a music provider, and the user can choose to use your app's services. If the user launches your app, Android Auto queries your app to see what content is available, displays your content items to the user, and sends requests to your app to control playback with actions such as play, pause, or skip track.

To enable your app to provide audio content for Android Auto, you need to:

  • Configure your app manifest to do the following:
    • Declare that your app can provide audio content for Android Auto.
    • Define a service that provides a browsable list of your audio tracks.
  • Build a service that provides audio track listing information by extending MediaBrowserServiceCompat.
  • Register a MediaSessionCompat object and implement the MediaSessionCompat.Callback object to enable playback controls.

Configure your manifest

When a user connects an Android mobile device to a dashboard device running Android Auto, the system looks for installed apps whose manifest entries declare that they support Android Auto services and describe how to access them. This section describes how to configure your app manifest to indicate that your app supports audio services for Android Auto and to allow Android Auto to connect with your app.

Declare Auto audio support

You indicate that your app supports Android Auto capabilities using the following manifest entry:

<application>
    ...
    <meta-data android:name="com.google.android.gms.car.application"
        android:resource="@xml/automotive_app_desc"/>
    ...
</application>

This manifest entry refers to a secondary XML file in which you declare the Auto capabilities your app supports. For an app that supports audio for Android Auto, add an XML file named automotive_app_desc.xml to the res/xml/ resources directory, with the following content:

<automotiveApp>
    <uses name="media"/>
</automotiveApp>

For more information about declaring capabilities for Android Auto, see Get started with Auto.

Declare your media browser service

Android Auto expects to connect to a service in order to browse audio track listings. You declare this service in your manifest to allow the dashboard system to discover this service and connect to your app.

The following code example shows how to declare this listing browser service in your manifest:

<application>
    ...
    <service android:name=".MyMediaBrowserService"
                android:exported="true">
        <intent-filter>
            <action android:name=
                "android.media.browse.MediaBrowserService"/>
        </intent-filter>
    </service>
    ...
</application>

The service your app provides for browsing audio tracks must extend MediaBrowserServiceCompat. The implementation of this service is discussed in the Build a browser service section.

Note: Clients other than Android Auto can also contact your app's browser service. These media clients might be other apps on a user's mobile device, or they might be remote clients.

Specify a notification icon

The Auto user interface shows notifications about your audio app to the user during the course of operation. For example, if the user has a navigation app running, and one song finishes and a new song starts, Android Auto shows the user a notification to indicate the change with an icon from your app. You can specify an icon that is used to represent your app for these notifications using the following manifest declaration:

<application>
    ...
    <meta-data android:name="com.google.android.gms.car.notification.SmallIcon"
        android:resource="@drawable/ic_notification" />
    ...
</application>

Note: The icon you provide should have transparency enabled, so the icon's background gets filled in with the app's primary color.

Detect car mode

To prevent driver distraction, Android Auto media apps must not start playing audio through the car speakers unless the user consciously starts playback (such as when the user presses play in your app). Even a user-scheduled alarm from the media app must not start playing music through the car speakers. Your app should determine if the phone is in car mode before playing any audio. Your app can do this by calling UiModeManager.getCurrentModeType(), which checks whether the device is running in car mode.

If the device is in car mode, media apps that support alarms must do one of the following things:

  • Disable the alarm.
  • Play the alarm over STREAM_ALARM, and provide a UI on the phone screen to disable the alarm (see the sketch after this list).
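
The following sketch illustrates the second option, assuming a MediaPlayer-based alarm; the method and the alarmSoundUri parameter are illustrative names. USAGE_ALARM routes the audio to the alarm stream (STREAM_ALARM) rather than the media stream:

// Plays a media app alarm over the alarm stream instead of the media stream,
// so it is not routed to the car speakers as regular media playback.
private void playAlarmOverAlarmStream(Context context, Uri alarmSoundUri) throws IOException {
    MediaPlayer alarmPlayer = new MediaPlayer();
    alarmPlayer.setAudioAttributes(
            new AudioAttributes.Builder()
                    .setUsage(AudioAttributes.USAGE_ALARM)
                    .setContentType(AudioAttributes.CONTENT_TYPE_SONIFICATION)
                    .build());
    alarmPlayer.setDataSource(context, alarmSoundUri);
    alarmPlayer.prepare();
    alarmPlayer.start();
    // Also present a UI on the phone screen that lets the user dismiss the alarm.
}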

The following code snippet checks whether an app is running in car mode:

Kotlin

fun isCarUiMode(c: Context): Boolean {
    val uiModeManager = c.getSystemService(Context.UI_MODE_SERVICE) as UiModeManager
    return if (uiModeManager.currentModeType == Configuration.UI_MODE_TYPE_CAR) {
        LogHelper.d(TAG, "Running in Car mode")
        true
    } else {
        LogHelper.d(TAG, "Running on a non-Car mode")
        false
    }
}

Java

public static boolean isCarUiMode(Context c) {
    UiModeManager uiModeManager = (UiModeManager) c.getSystemService(Context.UI_MODE_SERVICE);
    if (uiModeManager.getCurrentModeType() == Configuration.UI_MODE_TYPE_CAR) {
        LogHelper.d(TAG, "Running in Car mode");
        return true;
    } else {
        LogHelper.d(TAG, "Running on a non-Car mode");
        return false;
    }
}

Handle media advertisements

By default, Android Auto displays a notification when the media metadata changes during an audio playback session. When a media app switches from playing music to running an advertisement, it is distracting (and unnecessary) to display a notification to the user. To prevent Android Auto from displaying a notification in this case, you must set the media metadata key android.media.metadata.ADVERTISEMENT to 1, as shown in the code snippet below:

Kotlin

const val EXTRA_METADATA_ADVERTISEMENT = "android.media.metadata.ADVERTISEMENT"
...
override fun onPlayFromMediaId(mediaId: String, extras: Bundle?) {
    MediaMetadataCompat.Builder().apply {
        // ...
        if (isAd(mediaId)) {
            putLong(EXTRA_METADATA_ADVERTISEMENT, 1)
        }
        // ...
        mediaSession.setMetadata(build())
    }
}

Java

public static final String EXTRA_METADATA_ADVERTISEMENT =
            "android.media.metadata.ADVERTISEMENT";

@Override
public void onPlayFromMediaId(String mediaId, Bundle extras) {
    MediaMetadataCompat.Builder builder = new MediaMetadataCompat.Builder();
    // ...
    if (isAd(mediaId)) {
        builder.putLong(EXTRA_METADATA_ADVERTISEMENT, 1);
    }
    // ...
    mediaSession.setMetadata(builder.build());
}

Build a browser service

Android Auto interacts with your app by contacting its implementation of MediaBrowserServiceCompat, which you declare in your app manifest. This service allows Android Auto to find out what content your app provides. Android Auto can also query your app's media browser service to obtain a token for the MediaSessionCompat provided by your app, which handles content playback commands.

You create a media browser service by extending the MediaBrowserServiceCompat class. Android Auto can contact your service to do the following:

  • Browse your app's content hierarchy, in order to present a menu to the user
  • Get the token for your app's MediaSessionCompat object, in order to control audio playback

Media browser service workflow

  1. When your app's audio services are requested by a user through Android Auto, Android Auto contacts your app's media browser service. In your implementation of the onCreate() method, you must create and register a MediaSessionCompat object and its callback object.
  2. Android Auto calls the browser service's onGetRoot() method to get the top node of your content hierarchy. The node retrieved by this call is not used as a menu item. It is only used to retrieve its child nodes, which are subsequently displayed as the top menu items.
  3. Auto invokes the onLoadChildren() method to get the children of the root node, and uses this information to present a menu to the user.
  4. If the user selects a submenu, Auto invokes onLoadChildren() again to retrieve the child nodes of the selected menu item.
  5. If the user begins playback, Auto invokes the appropriate media session callback method to perform that action. For more information, see the section about how to Implement playback controls.
  6. Optionally, add support for displaying search results that satisfy a user’s query. For example, if your app plays audio in response to a user’s voice query, you can also display a list of other results that may interest the user using the onSearch() method. To learn more, go to the section about how to Display search results.
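
Taken together, these steps suggest a service skeleton along the following lines. This is a minimal sketch: the class name, root ID, and MyMediaSessionCallback (your MediaSessionCompat.Callback implementation) are illustrative, and the real logic for each method is covered in the sections that follow.

public class MyMediaBrowserService extends MediaBrowserServiceCompat {

    private static final String MY_MEDIA_ROOT_ID = "media_root_id";

    private MediaSessionCompat session;

    @Override
    public void onCreate() {
        super.onCreate();
        // Step 1: create and register the media session and its callback.
        session = new MediaSessionCompat(this, "MyMediaBrowserService");
        session.setCallback(new MyMediaSessionCallback());
        setSessionToken(session.getSessionToken());
    }

    @Override
    public BrowserRoot onGetRoot(String clientPackageName, int clientUid, Bundle rootHints) {
        // Step 2: return the root of your content hierarchy.
        // Validate the caller here; see "Building your content hierarchy".
        return new BrowserRoot(MY_MEDIA_ROOT_ID, null);
    }

    @Override
    public void onLoadChildren(String parentMediaId,
            Result<List<MediaBrowserCompat.MediaItem>> result) {
        // Steps 3 and 4: return the children of the requested node.
        result.sendResult(new ArrayList<MediaBrowserCompat.MediaItem>());
    }
}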

To help users quickly browse your app's content, Android Auto includes a browsing capability that lets users select a letter from an on-screen keyboard. The user is then presented with a list of items beginning with that letter in the current drawer list. This works on both sorted and unsorted content, and is currently available only in English.

Figure 1. Alpha picker on car screen.

Figure 2. Alpha picker on phone screen.

Figure 3. Alpha keyboard on phone screen.

Building your content hierarchy

Android Auto, when acting as an audio client, calls your app's MediaBrowserServiceCompat to find out what content you have available. You need to implement two methods in your browser service to support this: onGetRoot() and onLoadChildren().

Each node in your content hierarchy is represented by a MediaBrowserCompat.MediaItem object. Each of these objects is identified by a unique ID string. The client treats these ID strings as opaque tokens. When a client wants to browse to a submenu, or play a content item, it passes the ID token. Your app is responsible for associating the ID token with the appropriate menu node or content item.

Note: You should consider providing different content hierarchies depending on what client is making the query. In particular, Auto applications have strict limits on how large a menu they can display. This is intended to minimize distracting the driver, and to make it easy for the driver to operate the app via voice commands. To encourage drivers to focus on driving, Android Auto triggers a speed bump notification on the phone screen (a temporary lock on content browsing), if the user exceeds the expected threshold for situationally aware driving. For more information on the Auto user experience restrictions, see the Auto audio apps guidelines.

Your implementation of onGetRoot() returns information about the root node of the menu hierarchy. This root node is the parent of the top items of your browse hierarchy. The method is passed information about the calling client. You can use this information to decide whether the client should have access to your content at all. For example, if you want to limit your app's content to a list of approved clients, you can compare the passed clientPackageName to your list of approved packages and verify the certificate used to sign the caller's APK. If the caller can't be verified as an approved package, return null to deny access to your content. For an example of an app that validates that the caller is an approved app, see the PackageValidator class in the Universal Android Music Player sample app.

A typical implementation of onGetRoot() might look like this:

Kotlin

override fun onGetRoot(
        clientPackageName: String,
        clientUid: Int,
        rootHints: Bundle?
): BrowserRoot? =
        // Verify that the specified package is allowed to access your
        // content! You'll need to write your own logic to do this.
        if (!isValid(clientPackageName, clientUid)) {
            // If the request comes from an untrusted package, return null.
            // No further calls will be made to other media browsing methods.

            null
        } else MediaBrowserServiceCompat.BrowserRoot(MY_MEDIA_ROOT_ID, null)

Java

@Override
public BrowserRoot onGetRoot(String clientPackageName, int clientUid,
    Bundle rootHints) {

    // Verify that the specified package is allowed to access your
    // content! You'll need to write your own logic to do this.
    if (!isValid(clientPackageName, clientUid)) {
        // If the request comes from an untrusted package, return null.
        // No further calls will be made to other media browsing methods.

        return null;
    }

    return new BrowserRoot(MY_MEDIA_ROOT_ID, null);
}

The Auto device client builds the top-level menu by calling onLoadChildren() with the root node object and getting its children. The client builds submenus by calling the same method with other child nodes. The following example code shows a simple implementation of onLoadChildren() method:

Kotlin

override fun onLoadChildren(
        parentMediaId: String,
        result: Result<List<MediaBrowserCompat.MediaItem>>
) {
    // Assume for example that the music catalog is already loaded/cached.

    val mediaItems: MutableList<MediaBrowserCompat.MediaItem> = mutableListOf()

    // Check if this is the root menu:
    if (MY_MEDIA_ROOT_ID == parentMediaId) {

        // build the MediaItem objects for the top level,
        // and put them in the mediaItems list
    } else {

        // examine the passed parentMediaId to see which submenu we're at,
        // and put the children of that menu in the mediaItems list
    }
    result.sendResult(mediaItems)
}

Java

@Override
public void onLoadChildren(final String parentMediaId,
    final Result<List<MediaBrowserCompat.MediaItem>> result) {

    // Assume for example that the music catalog is already loaded/cached.

    List<MediaBrowserCompat.MediaItem> mediaItems = new ArrayList<>();

    // Check if this is the root menu:
    if (MY_MEDIA_ROOT_ID.equals(parentMediaId)) {

        // build the MediaItem objects for the top level,
        // and put them in the mediaItems list
    } else {

        // examine the passed parentMediaId to see which submenu we're at,
        // and put the children of that menu in the mediaItems list
    }
    result.sendResult(mediaItems);
}

For examples of how to implement onLoadChildren(), see the MediaBrowserService and Universal Android Music Player sample apps.
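
As an illustration of the elided "build the MediaItem objects" step in the samples above, a playable item might be constructed as follows. This is a sketch; buildTrackItem() and its parameters are hypothetical names rather than framework APIs.

// Builds a playable MediaItem for one track. The media ID string is the
// opaque token that clients later pass back to onPlayFromMediaId().
private MediaBrowserCompat.MediaItem buildTrackItem(String mediaId, String title,
        String artist, Uri albumArtUri) {
    MediaDescriptionCompat description = new MediaDescriptionCompat.Builder()
            .setMediaId(mediaId)
            .setTitle(title)
            .setSubtitle(artist)
            .setIconUri(albumArtUri)
            .build();
    return new MediaBrowserCompat.MediaItem(description,
            MediaBrowserCompat.MediaItem.FLAG_PLAYABLE);
}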

Apply content style

As described in previous sections, Android Auto supports connecting to your media app’s MediaBrowserService and populating the drawer with that app’s browse tree. The platform also provides additional ways for you to change how your content is organized and displayed on the device screen.

For example, you can mark MediaItem objects as either browsable or playable. An item that is playable opens the playback view and begins playing content when selected. A radio station is a good example of a playable item. An item that is browsable causes the platform to fetch that item's children and present them in the system UI after they are loaded. A folder called 'My Radio Stations' is a good example of a browsable item.

Note: A MediaItem that is both browsable and playable is treated as playable.

Additionally, you can decide how these different types of items are presented. There are multiple possible presentations available, as follows:

  • List items prioritize titles and metadata over images, as shown below.
  • Grid items prioritize images over titles and metadata, as shown below.
  • Title items act as subgroup headings for organizing your content. They are generated based on extra media metadata provided in each media item. The figure below shows list items displayed under title items.

Set global style

You can set global defaults for how your media items are displayed by including certain constants in the BrowserRoot extras bundle returned by the onGetRoot() function. Android Auto reads these extras, looks for the specific constants described below, and uses the value of each key to determine the appropriate presentation.

The constants available are shown below. Note that the Android media framework and support libraries do not yet natively support these constants, so there are no library definitions for them. Include the contents of the code block below in your app and use the constants in your BrowserRoot extras.

/** Declares that ContentStyle is supported */
public static final String CONTENT_STYLE_SUPPORTED = "android.media.browse.CONTENT_STYLE_SUPPORTED";

/**
* Bundle extra indicating the presentation hint for playable media items.
*/
public static final String CONTENT_STYLE_PLAYABLE_HINT =
   "android.media.browse.CONTENT_STYLE_PLAYABLE_HINT";

/**
* Bundle extra indicating the presentation hint for browsable media items.
*/
public static final String CONTENT_STYLE_BROWSABLE_HINT =
   "android.media.browse.CONTENT_STYLE_BROWSABLE_HINT";

/**
* Specifies the corresponding items should be presented as lists.
*/
public static final int CONTENT_STYLE_LIST_ITEM_HINT_VALUE = 1;

/**
* Specifies that the corresponding items should be presented as grids.
*/
public static final int CONTENT_STYLE_GRID_ITEM_HINT_VALUE = 2;

The following code sample sets the base defaults to have browsable items presented as grids and playable items presented as lists.

@Nullable
@Override
public BrowserRoot onGetRoot(@NonNull String clientPackageName, int clientUid,
   @Nullable Bundle rootHints) {
   Bundle extras = new Bundle();
   extras.putBoolean(CONTENT_STYLE_SUPPORTED, true);
   extras.putInt(CONTENT_STYLE_BROWSABLE_HINT, CONTENT_STYLE_GRID_ITEM_HINT_VALUE);
   extras.putInt(CONTENT_STYLE_PLAYABLE_HINT, CONTENT_STYLE_LIST_ITEM_HINT_VALUE);
   return new BrowserRoot(ROOT_ID, extras);
}

Set per-node style

The Content Style API supports overriding the default global hint for any browsable node’s children. The same extras as above can be supplied as extras in the MediaDescription. If these extras are present, then the children of that browsable node will have the new Content Style hint.

Note: The hint applies only to children of that browsable node, not grandchildren. Also, (CONTENT_STYLE_SUPPORTED, true) is not required.

The code sample below creates a browsable MediaItem that overrides the base defaults:

private MediaBrowser.MediaItem createBrowsableMediaItem(String mediaId, String folderName, Uri iconUri) {
   MediaDescription.Builder mediaDescriptionBuilder = new  MediaDescription.Builder();
   mediaDescriptionBuilder.setMediaId(mediaId);
   mediaDescriptionBuilder.setTitle(folderName);
   mediaDescriptionBuilder.setIconUri(iconUri);
   Bundle extras = new Bundle();
   extras.putInt(CONTENT_STYLE_BROWSABLE_HINT, CONTENT_STYLE_LIST_ITEM_HINT_VALUE);
   extras.putInt(CONTENT_STYLE_PLAYABLE_HINT, CONTENT_STYLE_GRID_ITEM_HINT_VALUE);
   return new MediaBrowser.MediaItem(
       mediaDescriptionBuilder.build(), MediaBrowser.MediaItem.FLAG_BROWSABLE);
}

Add title items

You can organize content using title items to group media in a list. To do this, every media item in the group needs to declare an extra in its media description with the same string value, which you can localize. This value is used as the group title.

You also need to pass the media items together and in the order you want them displayed; Android Auto does not sort or reshuffle media items based on their group declaration. For example, suppose you pass three media items in the following order: media item 1 with group title "Songs", media item 2 with group title "Albums", and media item 3 with group title "Songs". Android Auto does not merge media items 1 and 3 into the same "Songs" group; they are displayed as two separate groups.
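
The sample below uses a CONTENT_STYLE_GROUP_TITLE_HINT key for the group title extra. As with the other content style constants, the media support libraries do not define it, so declare it in your app. The string value shown here is an assumption based on the naming pattern of the other content style keys; confirm it against the current documentation.

/**
 * Bundle extra indicating the group title hint of a media item.
 * NOTE: the string value below is assumed from the pattern of the other
 * content style keys.
 */
public static final String CONTENT_STYLE_GROUP_TITLE_HINT =
   "android.media.browse.CONTENT_STYLE_GROUP_TITLE_HINT";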

The following code sample creates a MediaItem with a subgroup heading of "Songs":

private MediaBrowser.MediaItem createMediaItem(String mediaId, String folderName, Uri iconUri) {
   MediaDescription.Builder mediaDescriptionBuilder = new  MediaDescription.Builder();
   mediaDescriptionBuilder.setMediaId(mediaId);
   mediaDescriptionBuilder.setTitle(folderName);
   mediaDescriptionBuilder.setIconUri(iconUri);
   Bundle extras = new Bundle();
   extras.putString(CONTENT_STYLE_GROUP_TITLE_HINT, "Songs");
   return new MediaBrowser.MediaItem(
       mediaDescriptionBuilder.build(), /* playable or browsable flag*/);
}

Include additional indicators

You can include additional metadata indicators to provide at-a-glance information about content in the browse tree and the media playback view. For example, the screenshot below includes icons that indicate that a song is explicit and downloaded for offline playback.

Android Auto inspects the extras for each item in the browse tree, looks for the specific keys for the indicators, and then uses the presence or value of each key to add the appropriate indicator. The keys for each supported indicator are shown below.

/** Bundle extra indicating that a song is explicit. */
String EXTRA_IS_EXPLICIT = "android.media.IS_EXPLICIT";

/**
 * Bundle extra indicating that a media item is available offline.
 * Same as MediaDescriptionCompat.EXTRA_DOWNLOAD_STATUS.
 */
String EXTRA_IS_DOWNLOADED = "android.media.extra.DOWNLOAD_STATUS";

/**
 * Bundle extra value indicating that an item should show the corresponding
 * metadata.
 */
long EXTRA_METADATA_ENABLED_VALUE = 1;

/**
 * Bundle extra indicating the played state of long-form content (such as podcast
 * episodes or audiobooks).
 */
String EXTRA_PLAY_COMPLETION_STATE = "android.media.extra.PLAYBACK_STATUS";

/**
 * Value for EXTRA_PLAY_COMPLETION_STATE that indicates the media item has
 * not been played at all.
 */
int STATUS_NOT_PLAYED = 0;

/**
 * Value for EXTRA_PLAY_COMPLETION_STATE that indicates the media item has
 * been partially played (i.e. the current position is somewhere in the middle).
 */
int STATUS_PARTIALLY_PLAYED = 1;

/**
 * Value for EXTRA_PLAY_COMPLETION_STATE that indicates the media item has
 * been completed.
 */
int STATUS_FULLY_PLAYED = 2;

Note that the Android media framework and support libraries do not yet natively support these indicators, so there are no library definitions for the constants above. Include the contents of the code block above in your app and use the constants where appropriate.

You should add these extras to content returned by your MediaBrowserService. "Explicit" and "Downloaded" are flag extras: set them to EXTRA_METADATA_ENABLED_VALUE (1) to show the indicator, as in the sample below, while "Completion State" is an integer extra set to the appropriate status value. Apps should create an extras bundle that includes one or more of these keys and pass it to MediaDescriptionCompat.Builder.setExtras(). For example, the following sample sets up an explicit, partially played media item:

Bundle extras = new Bundle();
extras.putLong(EXTRA_IS_EXPLICIT, 1);
extras.putInt(EXTRA_PLAY_COMPLETION_STATE, STATUS_PARTIALLY_PLAYED);

MediaDescriptionCompat description = new MediaDescriptionCompat.Builder()
  .setMediaId(/*...*/)
  .setTitle(resources.getString(/*...*/))
  .setExtras(extras)
  .build();
return new MediaBrowserCompat.MediaItem(description, /* flags */);

The playback view also supports indicators for the currently playing song, set through the media session metadata. However, only "Explicit" and "Downloaded" are currently supported there. The following sample indicates that the current song in the playback view is explicit and downloaded.

mediaSession.setMetadata(
    new MediaMetadata.Builder()
        .putString(
            MediaMetadata.METADATA_KEY_DISPLAY_TITLE, "Song Name")
        .putString(
            MediaMetadata.METADATA_KEY_DISPLAY_SUBTITLE, "Artist name")
        .putString(MediaMetadata.METADATA_KEY_ALBUM_ART_URI, albumArtUri.toString())
        .putLong(
            EXTRA_IS_EXPLICIT, EXTRA_METADATA_ENABLED_VALUE)
        .putLong(
            EXTRA_IS_DOWNLOADED, EXTRA_METADATA_ENABLED_VALUE)
        .build());

Display search results

After a user performs a voice search for media in your app, you can allow the user to browse additional selectable content related to that query. This appears as a "Show more results" bar, as shown below.

First, you need to declare support for onSearch() in your MediaBrowserServiceCompat implementation. To do this, include the MEDIA_SEARCH_SUPPORTED value as a boolean key set to true when creating a browser root, anytime Android Auto connects to the browser service. The following code sample enables support in the onGetRoot() method.

@Nullable
@Override
public BrowserRoot onGetRoot(@NonNull String clientPackageName, int clientUid,
   @Nullable Bundle rootHints) {
   Bundle extras = new Bundle();
   extras.putBoolean(MEDIA_SEARCH_SUPPORTED, true);
   return new BrowserRoot(ROOT_ID, extras);
}

Note: The Android media framework and support libraries do not yet natively support this capability, so there is no library definition for the MEDIA_SEARCH_SUPPORTED constant used above. Define the constant in your app and use it when creating the browser root.
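
A minimal definition might look like the following. The constant name matches the key used in the sample above; the string value is an assumption and should be confirmed against the current documentation.

/**
 * Declares that your MediaBrowserService supports onSearch().
 * NOTE: the string value below is an assumption.
 */
public static final String MEDIA_SEARCH_SUPPORTED =
   "android.media.browse.SEARCH_SUPPORTED";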

To start providing search results to Android Auto, override the onSearch() method in the MediaBrowserServiceCompat implementation. Android Auto forwards a user’s search terms to this method whenever a user invokes the “Show more results” affordance, as shown in the code sample below.

@Override
public void onSearch(final String query, final Bundle extras,
                        final Result<List<MediaBrowserCompat.MediaItem>> result) {

  // Detach from results to unblock the caller (if a search is expensive)
  result.detach();

  new AsyncTask<Void, Void, Void>() {
    ArrayList<MediaBrowserCompat.MediaItem> searchResponse;
    boolean succeeded = false;
    @Override
    protected Void doInBackground(Void... params) {
      searchResponse = new ArrayList<>();
      if (doSearch(query, extras, searchResponse)) {
        succeeded = true;
      }
      return null;
    }

    @Override
    protected void onPostExecute(Void param) {
      if (succeeded) {
        // Sending an empty List informs the caller that there were no results.
        result.sendResult(searchResponse);
      } else {
        // This invokes onError() on the search callback
        result.sendResult(null);
      }
    }
  }.execute();
}

/** Populates resultsToFill with search results. Returns true on success or false on error. */
private boolean doSearch(String query, Bundle extras,
    ArrayList<MediaBrowserCompat.MediaItem> resultsToFill) {
    // Implement this method
}

Android Auto calls onSearch() with the same bundle of extras as the one it passes to playFromSearch() calls. Unlike playFromSearch(), onSearch() includes a Result<List<MediaBrowserCompat.MediaItem>> that can be used to return multiple media items back to Android Auto for display.

You can then categorize search results using title items. For example, music apps may include categories such as “Album”, “Artist” and “Songs”.
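
For example, inside onSearch() you might wrap each result with a group title extra before returning it. The following sketch reuses the CONTENT_STYLE_GROUP_TITLE_HINT constant discussed in the title items section; the helper name is illustrative.

// Groups a search result under a heading such as "Albums" or "Songs".
private MediaBrowserCompat.MediaItem toCategorizedResult(
        MediaDescriptionCompat source, String category) {
    Bundle extras = new Bundle();
    extras.putString(CONTENT_STYLE_GROUP_TITLE_HINT, category);
    MediaDescriptionCompat categorized = new MediaDescriptionCompat.Builder()
            .setMediaId(source.getMediaId())
            .setTitle(source.getTitle())
            .setSubtitle(source.getSubtitle())
            .setIconUri(source.getIconUri())
            .setExtras(extras)
            .build();
    return new MediaBrowserCompat.MediaItem(categorized,
            MediaBrowserCompat.MediaItem.FLAG_PLAYABLE);
}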

Enable playback control

Android Auto sends playback control commands through your MediaSessionCompat. You must register a session and implement the associated callback methods.

Note: You can inspect the media session at any time with the command adb shell dumpsys media_session.

Register a media session

In your browser service's onCreate() method, create a MediaSessionCompat. Register the media session by calling setSessionToken().

Kotlin

override fun onCreate() {
    super.onCreate()

    ...
    // Start a new MediaSession
    val session = MediaSessionCompat(this, "session tag").apply {
        // Set a callback object to handle play control requests, which
        // implements MediaSession.Callback
        setCallback(MyMediaSessionCallback())
    }
    sessionToken = session.sessionToken

    ...
}

Java

@Override
public void onCreate() {
    super.onCreate();

    ...
    // Start a new MediaSession
    MediaSessionCompat session = new MediaSessionCompat(this, "session tag");
    setSessionToken(session.getSessionToken());

    // Set a callback object to handle play control requests, which
    // implements MediaSession.Callback
    session.setCallback(new MyMediaSessionCallback());

    ...
}

When you create the media session object, you set a callback object that is used to handle playback control requests. You create this callback object by providing an implementation of the MediaSessionCompat.Callback class for your app. The next section discusses how to implement this object.

Implement play commands

When an Android Auto user requests playback of an audio track from your app, Android Auto uses the MediaSessionCompat.Callback class from your app's MediaSessionCompat object, which it obtained from your app's media browser service. When an Auto user wants to play content or control content playback, such as pausing playback or skipping to the next track, Auto invokes one of the callback object's methods.

To handle content playback, your app must extend the abstract MediaSessionCompat.Callback class and implement the methods that your app supports. The most important callback methods are as follows:

onPlay()
Invoked if the user chooses play without choosing a specific item. Your app should play its default content. If playback was paused with onPause(), your app should resume playback.

Note: Google Play requires your app not to play music immediately when it launches. For more information on this and other requirements, see Auto app quality.

onPlayFromMediaId()
Invoked when the user chooses to play a specific item. The method is passed the item's media ID, which you assigned to the item in the content hierarchy.
onPlayFromSearch()
Invoked when the user chooses to play from a search query. The app should make an appropriate choice based on the passed search string.
onPause()
Pause playback.
onSkipToNext()
Skip to the next item.
onSkipToPrevious()
Skip to the previous item.
onStop()
Stop playback.

Your app should override these methods to provide any desired functionality. In some cases you might not implement a method if it is not supported by your app. For example, if your app plays a live stream (such as a sports broadcast), the skip to next function might not make sense. In that case, you could simply use the default implementation of onSkipToNext().
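
A minimal callback sketch, with the actual playback logic left as comments, might look like this:

public class MyMediaSessionCallback extends MediaSessionCompat.Callback {

    @Override
    public void onPlay() {
        // Start or resume playback of your default content, and update
        // the session's PlaybackStateCompat to STATE_PLAYING.
    }

    @Override
    public void onPlayFromMediaId(String mediaId, Bundle extras) {
        // mediaId is the opaque token from your content hierarchy; look up
        // the track it identifies and start playing it.
    }

    @Override
    public void onPause() {
        // Pause playback and update the state to STATE_PAUSED.
    }

    @Override
    public void onSkipToNext() {
        // Advance to the next item in your queue.
    }

    @Override
    public void onSkipToPrevious() {
        // Return to the previous item in your queue.
    }

    @Override
    public void onStop() {
        // Stop playback, release audio focus, and update the state to STATE_STOPPED.
    }
}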

When your app receives a request to play content, it should play audio the same way it would in a non-Auto situation (as if the user were listening through a device speaker or through connected headphones). If the app is running on an Android Auto supported car screen, the audio content is automatically sent to the dashboard system to be played over the car's speakers.

For more information about playing audio content, see Media playback, Managing audio playback, and ExoPlayer.

Setting standard playback actions

The Android Auto UI displays playback controls based on the actions that are enabled in the PlaybackState. Audio apps usually enable the standard actions ACTION_PLAY, ACTION_PAUSE, ACTION_STOP, ACTION_SKIP_TO_PREVIOUS, and ACTION_SKIP_TO_NEXT.

Android Auto apps must also support ACTION_PLAY_FROM_MEDIA_ID and ACTION_PLAY_FROM_SEARCH.
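
For example, a sketch of enabling these actions when playback starts might look like the following; the position and playback speed values are placeholders, and mediaSession refers to your MediaSessionCompat:

// Enable the standard transport controls plus the actions that
// Android Auto apps must support.
PlaybackStateCompat state = new PlaybackStateCompat.Builder()
        .setActions(PlaybackStateCompat.ACTION_PLAY
                | PlaybackStateCompat.ACTION_PAUSE
                | PlaybackStateCompat.ACTION_STOP
                | PlaybackStateCompat.ACTION_SKIP_TO_PREVIOUS
                | PlaybackStateCompat.ACTION_SKIP_TO_NEXT
                | PlaybackStateCompat.ACTION_PLAY_FROM_MEDIA_ID
                | PlaybackStateCompat.ACTION_PLAY_FROM_SEARCH)
        .setState(PlaybackStateCompat.STATE_PLAYING, 0, 1.0f)
        .build();
mediaSession.setPlaybackState(state);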

In addition, you might want to create a play queue, which also appears in the UI. To do this, call setQueue() and setQueueTitle(), enable ACTION_SKIP_TO_QUEUE_ITEM, and define the onSkipToQueueItem() callback.

Android Auto displays buttons in the UI for each enabled action, and the playback queue if you create one.

Android Auto reserves space in its UI for the playback queue and for the ACTION_SKIP_TO_PREVIOUS and ACTION_SKIP_TO_NEXT actions. If your app does not support one of these functions, the UI does not display controls for it; instead, the unused space is assigned to any custom actions you create. If you do not want to fill those spaces with custom actions, you can reserve them so that Android Auto displays the corresponding button when an action is enabled and leaves the space blank when the action is not enabled or the play queue does not exist.

To reserve space, call setExtras() with a bundle that contains the keys defined below. Set each key to the boolean value true:

Kotlin

// Use these extras to show the transport control buttons for the corresponding actions,
// even when they are not enabled in the PlaybackState.
private const val SLOT_RESERVATION_SKIP_TO_NEXT =
        "com.google.android.gms.car.media.ALWAYS_RESERVE_SPACE_FOR.ACTION_SKIP_TO_NEXT"
private const val SLOT_RESERVATION_SKIP_TO_PREV =
        "com.google.android.gms.car.media.ALWAYS_RESERVE_SPACE_FOR.ACTION_SKIP_TO_PREVIOUS"
private const val SLOT_RESERVATION_QUEUE =
        "com.google.android.gms.car.media.ALWAYS_RESERVE_SPACE_FOR.ACTION_QUEUE"

Java

// Use these extras to show the transport control buttons for the corresponding actions,
// even when they are not enabled in the PlaybackState.
private static final String SLOT_RESERVATION_SKIP_TO_NEXT =
    "com.google.android.gms.car.media.ALWAYS_RESERVE_SPACE_FOR.ACTION_SKIP_TO_NEXT";
private static final String SLOT_RESERVATION_SKIP_TO_PREV =
    "com.google.android.gms.car.media.ALWAYS_RESERVE_SPACE_FOR.ACTION_SKIP_TO_PREVIOUS";
private static final String SLOT_RESERVATION_QUEUE =
    "com.google.android.gms.car.media.ALWAYS_RESERVE_SPACE_FOR.ACTION_QUEUE";

Providing custom actions

You can add additional custom playback actions to the transport UI. Use the PlaybackStateCompat.Builder to add these actions. See an example of adding actions and the required callback in Add a custom action. If space permits, Android adds the custom actions to the transport controls, otherwise, they are displayed in the overflow menu. Custom actions appear in the order in which they are added to the PlaybackState.
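
As a sketch only, a custom action might be added as follows; the action ID, label, icon resource, and stateBuilder are illustrative names rather than values defined by the framework. Your MediaSessionCompat.Callback then receives the action ID in onCustomAction().

// A "Thumbs up" custom action; the action ID and icon are placeholders.
PlaybackStateCompat.CustomAction thumbsUp =
        new PlaybackStateCompat.CustomAction.Builder(
                "com.example.myapp.THUMBS_UP", // ID your onCustomAction() recognizes
                "Thumbs up",                   // label shown to the user
                R.drawable.ic_thumbs_up)       // icon resource
                .build();

mediaSession.setPlaybackState(
        stateBuilder                           // your PlaybackStateCompat.Builder
                .addCustomAction(thumbsUp)
                .build());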

Icons for custom actions

Each custom action requires an icon resource. Since apps that work with Auto are designed to run in cars with different screen sizes and densities, it is important that you provide your app’s custom icons for different screen densities. This will help avoid blurring or other scaling artifacts. Here are some tips that you might find useful as you develop custom icons for your application.

Use vector format where possible

Use the vector format for custom icons whenever possible. A vector drawable allows you to scale assets without losing the detail. A vector drawable also makes it easy to align edges and corners to pixel boundaries at smaller resolutions.

Provide drawables in multiple densities

If you must provide icons as bitmap drawables (.png, .jpg, and .gif files) and Nine-Patch drawables (.9.png files), then as a minimum, supply a version of each icon that's optimized for the following common car screen densities:

  • mdpi (medium) ~160dpi
  • hdpi (high) ~240dpi
  • xhdpi (extra-high) ~320dpi

It is preferred to have your custom icons in the following densities as well:

  • xxhdpi (extra-extra-high) ~480dpi
  • xxxhdpi (extra-extra-extra-high) ~640dpi (optional)

For more information about designing for different screens, see the Supporting multiple screens developer guide.

Provide off icon style for disabled actions

For cases when a custom action is unavailable for the current context, swap the custom action icon with a corresponding off icon style resource.

Sample off style custom action icons.

Support voice actions

To reduce driver distractions, you must add voice actions in your audio playback app. With voice action support, users can launch your app and play audio by providing voice input on Auto screens. If your audio playback app is already active and the user says “Play a song”, the system starts playing music without requiring the user to look at or touch the screen.

Enable your app to handle audio playback requests

Enable your audio app to launch with a voice command such as "Play [search query] on [your app name]" by adding the following entry in your manifest:

<activity>
    <intent-filter>
        <action android:name=
             "android.media.action.MEDIA_PLAY_FROM_SEARCH" />
        <category android:name=
             "android.intent.category.DEFAULT" />
    </intent-filter>
</activity>

When the user says “Play music on [your app name]” on an Auto screen, Auto attempts to launch your app and play audio by calling your app’s MediaSession.Callback.onPlayFromSearch() method. If the user has not specified criteria such as a track name or music genre, the MediaSession.Callback.onPlayFromSearch() method receives an empty query parameter. Your app should respond by immediately playing audio, such as a song from a random queue or the most recent playlist.

Parse the voice query to build the playback queue

When a user searches for a specific criteria, such as “Play jazz on [your app name]” or “Listen to [song title]”, the onPlayFromSearch() callback method receives the voice search results in the query parameter and an extras bundle. For more information on how to handle search queries to play audio content, see Play music based on a search query.

To parse the voice search query to play back audio content in your app, follow these steps:

  1. Use the extras bundle and search query string returned from the voice search to filter results.
  2. Build the audio content queue based on these results.
  3. Play the audio content.

The onPlayFromSearch() method takes an extras parameter with more detailed information from the voice search. These extras help you find the audio content in your app for playback. If the search results are unable to provide this data, you can implement logic to parse the raw search query and play the appropriate tracks based on the query.

Extras supported in Android Auto include MediaStore.EXTRA_MEDIA_FOCUS, MediaStore.EXTRA_MEDIA_ARTIST, and MediaStore.EXTRA_MEDIA_ALBUM, as used in the snippet below.

The following snippet shows how to override the onPlayFromSearch() method in your MediaSession.Callback implementation to handle the search query and extras for playing audio content in your app:

Kotlin

override fun onPlayFromSearch(query: String?, extras: Bundle?) {
    // Track the focus of the voice query (artist, album, or none).
    var isArtistFocus = false
    var isAlbumFocus = false
    var artist: String? = null
    var album: String? = null

    if (query.isNullOrEmpty()) {
        // The user provided generic string e.g. 'Play music'
        // Build appropriate playlist queue
    } else {
        // Build a queue based on songs that match "query" or "extras" param
        val mediaFocus: String? = extras?.getString(MediaStore.EXTRA_MEDIA_FOCUS)
        if (mediaFocus == MediaStore.Audio.Artists.ENTRY_CONTENT_TYPE) {
            isArtistFocus = true
            artist = extras.getString(MediaStore.EXTRA_MEDIA_ARTIST)
        } else if (mediaFocus == MediaStore.Audio.Albums.ENTRY_CONTENT_TYPE) {
            isAlbumFocus = true
            album = extras.getString(MediaStore.EXTRA_MEDIA_ALBUM)
        }

        // Implement additional "extras" param filtering
    }

    // Implement your logic to retrieve the queue
    var result: String? = when {
        isArtistFocus -> artist?.also {
            searchMusicByArtist(it)
        }
        isAlbumFocus -> album?.also {
            searchMusicByAlbum(it)
        }
        else -> null
    }
    result = result ?: run {
        // No focus found, search by query for song title
        query?.also {
            searchMusicBySongTitle(it)
        }
    }

    if (result?.isNotEmpty() == true) {
        // Immediately start playing from the beginning of the search results
        // Implement your logic to start playing music
        playMusic(result)
    } else {
        // Handle no queue found. Stop playing if the app
        // is currently playing a song
    }
}

Java

@Override
public void onPlayFromSearch(String query, Bundle extras) {
    // Track the focus of the voice query (artist, album, or none).
    boolean isArtistFocus = false;
    boolean isAlbumFocus = false;
    String artist = null;
    String album = null;
    String result = null;

    if (TextUtils.isEmpty(query)) {
        // The user provided generic string e.g. 'Play music'
        // Build appropriate playlist queue
    } else {
        // Build a queue based on songs that match "query" or "extras" param
        String mediaFocus = extras.getString(MediaStore.EXTRA_MEDIA_FOCUS);
        if (TextUtils.equals(mediaFocus,
                MediaStore.Audio.Artists.ENTRY_CONTENT_TYPE)) {
            isArtistFocus = true;
            artist = extras.getString(MediaStore.EXTRA_MEDIA_ARTIST);
        } else if (TextUtils.equals(mediaFocus,
                MediaStore.Audio.Albums.ENTRY_CONTENT_TYPE)) {
            isAlbumFocus = true;
            album = extras.getString(MediaStore.EXTRA_MEDIA_ALBUM);
        }

        // Implement additional "extras" param filtering
    }

    // Implement your logic to retrieve the queue
    if (isArtistFocus) {
        result = searchMusicByArtist(artist);
    } else if (isAlbumFocus) {
        result = searchMusicByAlbum(album);
    }

    if (result == null) {
        // No focus found, search by query for song title
        result = searchMusicBySongTitle(query);
    }

    if (result != null && !result.isEmpty()) {
        // Immediately start playing from the beginning of the search results
        // Implement your logic to start playing music
        playMusic(result);
    } else {
        // Handle no queue found. Stop playing if the app
        // is currently playing a song
    }
}

Note: To minimize driver distractions, immediately initiate audio content playback in the onPlayFromSearch() method when you have generated the audio content queue based on the user's request.

For a more detailed example on how to implement voice search to play audio content in your app, see the Universal Media Player sample.

Implement playback control actions

To provide a hands-free experience while users drive and listen to audio content in Android Auto, your app should allow users to control audio content playback with voice actions. When users speak commands such as “Next song”, “Pause music”, or “Resume music”, the system triggers the corresponding callback method where you implement the playback control action.

To provide voice-enabled playback controls, first enable the hardware controls by setting these flags in your app’s MediaSession object:

Kotlin

session.setFlags(MediaSessionCompat.FLAG_HANDLES_MEDIA_BUTTONS
        or MediaSessionCompat.FLAG_HANDLES_TRANSPORT_CONTROLS
)

Java

session.setFlags(MediaSessionCompat.FLAG_HANDLES_MEDIA_BUTTONS |
    MediaSessionCompat.FLAG_HANDLES_TRANSPORT_CONTROLS);

Then, implement the callback methods with the playback controls that you support in your app. Here’s a list of voice-enabled playback controls supported by Android Auto:

  Example phrase      Callback method
  "Next song"         onSkipToNext()
  "Previous song"     onSkipToPrevious()
  "Pause music"       onPause()
  "Stop music"        onStop()
  "Resume music"      onPlay()

For a more detailed example on how to implement voice-enabled playback actions in your app, see the Universal Media Player sample.

Handle errors

When the app experiences an error, set the playback state to STATE_ERROR and provide an error message using setErrorMessage(). Android Auto displays the error message to the user.

For more information about error states, see Working with a media session: States and errors.

You should provide a message that tells the user exactly what to do in the media app (for example, "Sign in to Universal Music Player" rather than "Please sign in"). If you need to show a different message in cars, detect whether the device is in car mode (see Detect car mode for an example code snippet).
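
A sketch of reporting such an error might look like the following; the error code and message are illustrative, and mediaSession refers to your MediaSessionCompat:

// Report an actionable error; Android Auto displays the message to the user.
PlaybackStateCompat errorState = new PlaybackStateCompat.Builder()
        .setState(PlaybackStateCompat.STATE_ERROR, 0, 0f)
        .setErrorMessage(PlaybackStateCompat.ERROR_CODE_AUTHENTICATION_EXPIRED,
                "Sign in to Universal Music Player")
        .build();
mediaSession.setPlaybackState(errorState);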
