API Overview

The L Developer Preview gives you an advance look at the upcoming release for the Android platform, which offers new features for users and app developers. This document provides an introduction to the most notable APIs.

The L Developer Preview is intended for developer early adopters and testers. If you are interested in influencing the direction of the Android framework, give the L Developer Preview a try and send us your feedback!

Caution: Do not publish apps that use the L Developer Preview to the Google Play store.

Note: This document often refers to classes and methods that do not yet have reference material available on developer.android.com. These API elements are formatted in code style in this document (without hyperlinks). For the preliminary API documentation for these elements, download the preview reference.

Important Behavior Changes

If you have previously published an app for Android, be aware that your app might be affected by changes in the upcoming release.

New Android Runtime (ART)

The 4.4 release introduced a new, experimental Android runtime, ART. Under 4.4, ART was optional, and the default runtime remained Dalvik. With the L Developer Preview, ART is now the default runtime.

For an overview of ART's new features, see Introducing ART. Some of the major new features are:

  • Ahead-of-Time (AOT) compilation
  • Improved garbage collection (GC)
  • Improved debugging support

Most Android apps should just work without change under ART. However, some techniques that work on Dalvik do not work on ART. For information about the most important issues, see Verifying App Behavior on the Android Runtime (ART). Pay particular attention if:

  • Your app uses Java Native Interface (JNI) to run C/C++ code.
  • You use development tools that generate non-standard code (such as some obfuscators).
  • You use techniques that are incompatible with compacting garbage collection. (ART does not currently implement compacting GC, but compacting GC is under development in the Android Open-Source Project.)

If your app implements notifications...

Notifications are drawn with dark text atop white (or very light) backgrounds to match the new material design widgets. Make sure that all your notifications look right with the new color scheme:

Figure 1. Fullscreen activity showing a heads-up notification

  • Update or remove assets that involve color.
  • The system automatically inverts action icons in notifications. Use android.app.Notification.Builder.setColor() to set an accent color in a circle behind your icon image.
  • The system ignores all non-alpha channels in action icons and the main notification icon. You should assume that these icons are alpha-only.

If you are currently adding sounds and vibrations to your notifications by using the Ringtone, MediaPlayer, or Vibrator classes, remove that code so that the system can present notifications correctly in Do Not Disturb mode. Instead, use the Notification.Builder methods to add sounds and vibration.
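For example, a minimal sketch of delegating sound and vibration to the builder (the icon resource, sound URI, and notification ID here are illustrative placeholders):

```java
// Let the system play the sound and vibration, rather than using
// Ringtone, MediaPlayer, or Vibrator directly.
Notification notification = new Notification.Builder(context)
        .setSmallIcon(R.drawable.ic_stat_message)      // illustrative resource
        .setContentTitle("New message")
        .setContentText("You have a new message.")
        .setSound(soundUri)                            // replaces Ringtone/MediaPlayer code
        .setVibrate(new long[] {0, 250, 250, 250})     // replaces Vibrator code
        .build();

NotificationManager manager = (NotificationManager)
        context.getSystemService(Context.NOTIFICATION_SERVICE);
manager.notify(NOTIFICATION_ID, notification);
```

Because the builder owns the sound and vibration, the system can suppress them appropriately in Do Not Disturb mode.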

Notifications now appear in a small floating window (also called a heads-up notification) when the device is active (that is, the device is unlocked and its screen is on). These notifications appear similar to the compact form of your notification, except that the heads-up notification also shows action buttons. Users can act on, or dismiss, a heads-up notification without leaving the current app.

Examples of conditions that may trigger heads-up notifications include:

  • The user's activity is in fullscreen mode (the app uses fullScreenIntent), or
  • The notification has high priority and uses ringtones or vibrations

If your app implements notifications under those scenarios, make sure that heads-up notifications are presented correctly.

If your app uses RemoteControlClient...

Lockscreens in the L Developer Preview do not show transport controls for your RemoteControlClient. Instead, your app can provide media playback control from the lockscreen through a notification. This gives your app more control over the presentation of media buttons, while providing a consistent experience for users across the lockscreen and unlocked device.

The L Developer Preview introduces a new android.app.Notification.MediaStyle template which is recommended for this purpose. MediaStyle converts notification actions that you added with Notification.Builder.addAction() into compact buttons embedded in your app's media playback notifications.

If you are using the new android.media.session.MediaSession class (see Media Playback Control below), attach your session token with Notification.MediaStyle.setMediaToken() to inform the system that this notification controls an ongoing media session.

Call Notification.Builder.setVisibility(Notification.VISIBILITY_PUBLIC) to mark a notification as safe to show atop any lockscreen (secure or otherwise). For more information, see Lockscreen Notifications.
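Putting the pieces above together, a hedged sketch of a lockscreen-visible media playback notification (the icons, PendingIntents, and session object are illustrative, and these preview APIs may change in the final release):

```java
Notification notification = new Notification.Builder(context)
        .setSmallIcon(R.drawable.ic_stat_play)          // illustrative resource
        .setContentTitle("Track title")
        .setContentText("Artist name")
        // Actions added here become compact buttons under MediaStyle.
        .addAction(R.drawable.ic_prev, "Previous", prevPendingIntent)
        .addAction(R.drawable.ic_pause, "Pause", pausePendingIntent)
        .addAction(R.drawable.ic_next, "Next", nextPendingIntent)
        .setStyle(new Notification.MediaStyle()
                .setMediaToken(mediaSession.getSessionToken()))
        // Safe to show in full atop any lockscreen, secure or not.
        .setVisibility(Notification.VISIBILITY_PUBLIC)
        .build();
```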

If your app uses ActivityManager.getRecentTasks()...

With the introduction of the new concurrent documents and activities tasks feature in the upcoming release (see Concurrent documents and activities in Recents screen below), the ActivityManager.getRecentTasks() method is now deprecated to improve user privacy. For backward compatibility, this method still returns a small subset of its data, including the calling application’s own tasks and possibly some other non-sensitive tasks (such as Home). If your app is using this method to retrieve its own tasks, use android.app.ActivityManager.getAppTasks() instead to retrieve that information.
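A sketch of retrieving your own tasks with the replacement API (the loop body is illustrative):

```java
ActivityManager am = (ActivityManager)
        context.getSystemService(Context.ACTIVITY_SERVICE);

// getAppTasks() returns only this app's own tasks, unlike the
// deprecated getRecentTasks().
for (ActivityManager.AppTask task : am.getAppTasks()) {
    // Inspect or manage each task here, for example by finishing it
    // or moving it to the foreground.
}
```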

User Interface

Material design support

The upcoming release adds support for Android's new material design style. You can create apps with material design that are visually dynamic and have UI element transitions that feel natural to users. This support includes:

  • The material theme
  • View shadows
  • The RecyclerView widget
  • Drawable animation and styling effects
  • Material design animation and activity transition effects
  • Animators for view properties based on the state of a view
  • Customizable UI widgets and app bars with color palettes that you control

To learn more about adding material design functionality to your app, see Material Design.

Lockscreen notifications

Lockscreens in the L Developer Preview have the ability to present notifications. Users can choose via Settings whether to allow sensitive notification content to be shown over a secure lockscreen.

Your app can control the level of detail visible when its notifications are displayed over the secure lockscreen. To control the visibility level, call android.app.Notification.Builder.setVisibility() and specify one of these values:

  • VISIBILITY_PRIVATE. Shows basic information, such as the notification’s icon, but hides the notification’s full content.
  • VISIBILITY_PUBLIC. Shows the notification’s full content.
  • VISIBILITY_SECRET. Shows nothing, hiding even the notification’s icon.

When VISIBILITY_PRIVATE is set, you can also provide a redacted version of the notification content that hides personal details. For example, an SMS app might display a notification that shows "You have 3 new text messages." but hides the message content and senders. To provide this alternative notification, first create the replacement notification using Notification.Builder. When you create the private notification object, attach the replacement notification to it through the Notification.Builder.setPublicVersion() method.
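For example, a sketch of the SMS scenario above (the resources and text are illustrative):

```java
// The redacted version, shown over a secure lockscreen.
Notification publicVersion = new Notification.Builder(context)
        .setSmallIcon(R.drawable.ic_stat_sms)          // illustrative resource
        .setContentTitle("You have 3 new text messages.")
        .build();

// The full version, visible only when the device is unlocked.
Notification notification = new Notification.Builder(context)
        .setSmallIcon(R.drawable.ic_stat_sms)
        .setContentTitle("3 new messages from Alice")
        .setContentText("Dinner at 8?")                // sensitive content
        .setVisibility(Notification.VISIBILITY_PRIVATE)
        .setPublicVersion(publicVersion)
        .build();
```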

Notifications metadata

The L Developer Preview uses metadata associated with your app notifications to sort the notifications more intelligently. To set the metadata, call the following methods in android.app.Notification.Builder when you construct the notification:

  • setCategory(). Depending on the message category, this tells the system how to handle your app notifications when the device is in Do Not Disturb mode (for example, if your notification represents an incoming call, instant message, or alarm).
  • setPriority(). Notifications with the priority field set to PRIORITY_MAX or PRIORITY_HIGH will appear in a small floating window if the notification also has sound or vibration.
  • addPerson(). Allows you to add a list of people to a notification. Your app can use this to signal to the system that it should group together notifications from the specified people, or rank notifications from these people as being more important.
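A sketch combining the three metadata calls (the category constant and person URI are illustrative):

```java
Notification notification = new Notification.Builder(context)
        .setSmallIcon(R.drawable.ic_stat_message)      // illustrative resource
        .setContentTitle("Incoming message")
        .setCategory(Notification.CATEGORY_MESSAGE)    // guides Do Not Disturb handling
        .setPriority(Notification.PRIORITY_HIGH)       // eligible for heads-up display
        .addPerson("tel:+15551234567")                 // helps ranking and grouping
        .build();
```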

Concurrent documents and activities in the Recents screen

In previous releases, the Recents screen could only display a single task for each app that the user interacted with most recently. Now your app can open more tasks as needed for additional concurrent activities for documents. This feature facilitates multitasking by letting users quickly switch between individual activities and documents from the Recents screen, with a consistent switching experience across all apps. Examples of such concurrent tasks might include open tabs in a web browser app, documents in a productivity app, concurrent matches in a game, or chats in a messaging app. Your app can manage its tasks through the android.app.ActivityManager.AppTask class.

To insert a logical break so that the system treats your activity as a new task, use android.content.Intent.FLAG_ACTIVITY_NEW_DOCUMENT when launching the activity with startActivity(). You can also get this behavior by declaring the <activity> attribute documentLaunchMode="intoExisting" or documentLaunchMode="always" in your manifest.

You can also mark that a task should be removed from the Recents screen when all its activities are closed. To do this, use android.content.Intent.FLAG_ACTIVITY_AUTO_REMOVE_FROM_RECENTS when starting the root activity for the task. You can also set this behavior for an activity by declaring the <activity> attribute autoRemoveFromRecents="true" in your manifest.
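A sketch combining both flags when launching a document activity (DocumentActivity is an illustrative class name):

```java
Intent intent = new Intent(context, DocumentActivity.class);
// Open the activity as its own task in the Recents screen, and remove
// that task automatically once all of its activities are closed.
intent.addFlags(Intent.FLAG_ACTIVITY_NEW_DOCUMENT
        | Intent.FLAG_ACTIVITY_AUTO_REMOVE_FROM_RECENTS);
startActivity(intent);
```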

To avoid cluttering the Recents screen, you can set the maximum number of tasks from your app that can appear in that screen. To do this, set the <application> attribute android:maxRecents. The current maximum that can be specified is 100 tasks per user.

WebView updates

The L Developer Preview updates the WebView implementation to Chromium M36, bringing security and stability enhancements, as well as bug fixes. The default user-agent string for a WebView running on the L Developer Preview has been updated to incorporate 36.0.0.0 as the version number.

Additionally, this release brings support for the WebAudio, WebGL, and WebRTC open standards. To learn more about the new features included in this release, see WebView for Android.

Graphics

Support for OpenGL ES 3.1

The L Developer Preview adds Java interfaces and native support for OpenGL ES 3.1. Key new functionality provided in OpenGL ES 3.1 includes:

  • Compute shaders
  • Separate shader objects
  • Indirect draw commands
  • Multisample and stencil textures
  • Shading language improvements
  • Extensions for advanced blend modes and debugging
  • Backward compatibility with OpenGL ES 2.0 and 3.0

The Java interface for OpenGL ES 3.1 on Android is provided with GLES31. When using OpenGL ES 3.1, be sure that you declare it in your manifest file with the <uses-feature> tag and the android:glEsVersion attribute. For example:

<manifest>
    <uses-feature android:glEsVersion="0x00030001" />
    ...
</manifest>

For more information about using OpenGL ES, including how to check the device’s supported OpenGL ES version at runtime, see the OpenGL ES API guide.

Android Extension Pack

In addition to OpenGL ES 3.1, this release provides an extension pack with Java interfaces and native support for advanced graphics functionality. These extensions are treated as a single package by Android. If the ANDROID_extension_pack_es31 extension is present, your app can assume all extensions in the package are present and can enable the shading language features with a single #extension statement.

The extension pack supports:

  • Guaranteed fragment shader support for shader storage buffers, images, and atomics (fragment shader support is optional in OpenGL ES 3.1.)
  • Tessellation and geometry shaders
  • ASTC (LDR) texture compression format
  • Per-sample interpolation and shading
  • Different blend modes for each color attachment in a frame buffer

The Java interface for the extension pack is provided with GLES31Ext. In your app manifest, you can declare that support for the extension pack is required, with the <uses-feature> tag, but the precise syntax is not finalized in the L Developer Preview.

Multimedia

Camera API for advanced camera capabilities

The L Developer Preview introduces the new android.hardware.camera2 API to facilitate fine-grained photo capture and image processing. You can now programmatically access the camera devices available to the system with CameraManager.getCameraIdList() and connect to a specific device with CameraManager.openCamera(). To start capturing images, create a CameraCaptureSession and specify the Surface objects for the captured images. The CameraCaptureSession can be configured to take single shots or multiple images in a burst.

To be notified when new images are captured, implement the CameraCaptureSession.CaptureListener interface and set it in your capture request. When the system completes the image capture request, your CameraCaptureSession.CaptureListener receives a call to onCaptureCompleted(), providing you with the image capture metadata in a CaptureResult.
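A hedged sketch of this flow, using the preliminary class names from this document (signatures and names may change in the final release; error handling is omitted):

```java
CameraManager manager = (CameraManager)
        context.getSystemService(Context.CAMERA_SERVICE);
String[] cameraIds = manager.getCameraIdList();

// Open the first reported camera device.
manager.openCamera(cameraIds[0], new CameraDevice.StateListener() {
    @Override
    public void onOpened(CameraDevice camera) {
        // Create a CameraCaptureSession here targeting your Surface(s),
        // then submit a CaptureRequest with a CaptureListener attached;
        // onCaptureCompleted() delivers the CaptureResult metadata.
    }
    @Override
    public void onDisconnected(CameraDevice camera) { camera.close(); }
    @Override
    public void onError(CameraDevice camera, int error) { camera.close(); }
}, null);
```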

To see an example of how to use the updated Camera API, refer to the Camera2Basic and Camera2Video implementation samples in this release.

Audio playback

This release includes the following changes to AudioTrack:

  • Your app can now supply audio data in floating-point format (android.media.AudioFormat.ENCODING_PCM_FLOAT). This permits greater dynamic range, more consistent precision, and greater headroom. Floating-point arithmetic is especially useful during intermediate calculations. Playback endpoints use integer format for audio data, at lower bit depth. (In the L Developer Preview, portions of the internal pipeline are not yet floating-point.)
  • Your app can now supply audio data as a ByteBuffer, in the same format as provided by MediaCodec.
  • The WRITE_NON_BLOCKING option can simplify buffering and multithreading for some apps.

Media playback control

You can now build your own media controller app with the new android.media.session.MediaController class, which provides simplified transport controls APIs that replace those in RemoteControlClient. The MediaController class allows thread-safe control of playback from a non-UI process, making it easier to control your media playback service from your app’s user interface.

You can also create multiple controllers to send playback commands, media keys, and other events to the same ongoing android.media.session.MediaSession. When you add a controller, you must call MediaSession.getSessionToken() to request an access token in order for your app to interact with the session.

You can now send transport commands such as "play", "stop", "skip", and "set rating" by using MediaController.TransportControls. To handle in-bound media transport commands from controllers attached to the session, override the callback methods in MediaSession.TransportControlsCallback.
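A sketch of sending and receiving transport commands (the fromToken() construction and the way the callback attaches to the session are assumptions based on the preliminary API and may differ in the final release):

```java
// Controller side: obtain a controller from the session's access token
// and send a transport command.
MediaSession.Token token = mediaSession.getSessionToken();
MediaController controller = MediaController.fromToken(token);  // preliminary API
controller.getTransportControls().play();

// Session side: handle in-bound transport commands by overriding the
// callback methods.
MediaSession.TransportControlsCallback callback =
        new MediaSession.TransportControlsCallback() {
    @Override
    public void onPlay() {
        // Start playback in your media service here.
    }
};
```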

You can also create rich notifications that allow playback control tied to a media session with the new android.app.Notification.MediaStyle class. By using the new notification and media APIs, you will ensure that the System UI knows about your playback and can extract and show album art.

Storage

Directory selection

The L Developer Preview extends the Storage Access Framework to let users select an entire directory subtree, giving apps read/write access to all contained documents without requiring user confirmation for each item.

To select a directory subtree, build and send an android.intent.action.OPEN_DOCUMENT_TREE Intent. The system displays all DocumentsProvider instances that support subtree selection, letting the user browse and select a directory. The returned URI represents access to the selected subtree. You can then use DocumentsContract.buildChildDocumentsUriUsingTree() and DocumentsContract.buildDocumentUriUsingTree() along with ContentResolver.query() to explore the subtree.
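A sketch of the selection and traversal steps (the request code is illustrative, and the action string is written literally because the preview constant name may change):

```java
// 1. Ask the user to pick a directory subtree.
private static final int PICK_TREE_REQUEST = 42;   // illustrative request code

void pickDirectory() {
    Intent intent = new Intent("android.intent.action.OPEN_DOCUMENT_TREE");
    startActivityForResult(intent, PICK_TREE_REQUEST);
}

// 2. Explore the returned subtree (called from onActivityResult()).
void listChildren(Uri treeUri) {
    Uri childrenUri = DocumentsContract.buildChildDocumentsUriUsingTree(
            treeUri, DocumentsContract.getTreeDocumentId(treeUri));
    Cursor children = getContentResolver().query(
            childrenUri, null, null, null, null);
    // Iterate over the cursor to enumerate the documents in the subtree.
}
```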

The new DocumentsContract.createDocument() method lets you create new documents or directories anywhere under the subtree. To manage existing documents, use DocumentsContract.renameDocument() and DocumentsContract.deleteDocument(). Check DocumentsContract.Document.COLUMN_FLAGS to verify provider support for these calls before issuing them.

If you're implementing a DocumentsProvider and want to support subtree selection, implement DocumentsProvider.isChildDocument() and include DocumentsContract.Root.FLAG_SUPPORTS_IS_CHILD in your Root.COLUMN_FLAGS.

The L Developer Preview also introduces new package-specific directories on shared storage where your app can place media files for inclusion in MediaStore. The new android.content.Context.getExternalMediaDirs() returns paths to these directories on all shared storage devices. Similarly to Context.getExternalFilesDir(), no additional permissions are needed by your app to access the returned paths. The platform periodically scans for new media in these directories, but you can also use MediaScannerConnection to explicitly scan for new content.

Wireless & Connectivity

Multiple network connections

The L Developer Preview provides new multi-networking APIs. These let your app dynamically scan for available networks with specific capabilities, and establish a connection to them. This is useful when your app requires a specialized network, such as a SUPL, MMS, or carrier-billing network, or if you want to send data using a particular type of transport protocol.

To select and connect to a network dynamically from your app, follow these steps:

  1. Get an instance of ConnectivityManager.
  2. Create an android.net.NetworkRequest to specify the network features and transport type your app is interested in.
  3. To scan for suitable networks, call ConnectivityManager.requestNetwork() or ConnectivityManager.registerNetworkCallback(), and pass in the NetworkRequest object and an implementation of ConnectivityManager.NetworkCallbackListener.

When the system detects a suitable network, it connects to the network and invokes the NetworkCallbackListener.onAvailable() callback. You can use the android.net.Network object from the callback to get additional information about the network, or to direct traffic to use the selected network.
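A sketch of the three steps, using the preliminary callback name from this document (the capability constants shown are illustrative):

```java
ConnectivityManager cm = (ConnectivityManager)
        context.getSystemService(Context.CONNECTIVITY_SERVICE);

// Describe the kind of network you need, for example an MMS-capable
// cellular network.
NetworkRequest request = new NetworkRequest.Builder()
        .addTransportType(NetworkCapabilities.TRANSPORT_CELLULAR)
        .addCapability(NetworkCapabilities.NET_CAPABILITY_MMS)
        .build();

cm.requestNetwork(request, new ConnectivityManager.NetworkCallbackListener() {
    @Override
    public void onAvailable(Network network) {
        // The system found and connected to a suitable network; direct
        // your traffic to it here.
    }
});
```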

Bluetooth broadcasting

Android 4.3 introduced platform support for Bluetooth Low Energy (BLE) in the central role. In the L Developer Preview, an Android device can now act as a Bluetooth LE peripheral device. Apps can use this capability to make their presence known to nearby devices. For instance, you can build apps that allow a device to function as a pedometer or health monitor and communicate its data with another BLE device.

The new android.bluetooth.le APIs enable your apps to broadcast advertisements, scan for responses, and form connections with nearby BLE devices. You must add the android.permission.BLUETOOTH_ADMIN permission in your manifest in order for your app to use the new advertising and scanning features.

To begin Bluetooth LE advertising so that other devices can discover your app, call android.bluetooth.le.BluetoothAdvertiser.startAdvertising() and pass in an implementation of the android.bluetooth.le.AdvertiseCallback class. The callback object receives a report of the success or failure of the advertising operation.

The L Developer Preview introduces the android.bluetooth.le.ScanFilter class so that your app can scan for only the specific types of devices it is interested in. To begin scanning for Bluetooth LE devices, call android.bluetooth.le.BluetoothLeScanner.startScan() and pass in a list of filters. In the method call, you must also provide an implementation of android.bluetooth.le.ScanCallback to report if a Bluetooth LE advertisement is found.
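A sketch of advertising and scanning (the settings, data, and filter objects are assumed to be built elsewhere with the preview builder classes, and names may change in the final release):

```java
BluetoothAdapter adapter = BluetoothAdapter.getDefaultAdapter();

// Advertise so nearby BLE devices can discover this one; the callback
// reports whether advertising started successfully.
adapter.getBluetoothLeAdvertiser().startAdvertising(
        advertiseSettings, advertiseData, new AdvertiseCallback() {
            // Override the success/failure methods as needed.
        });

// Scan only for the device types you care about, using filters.
adapter.getBluetoothLeScanner().startScan(
        scanFilters, scanSettings, new ScanCallback() {
            // Override the result-reporting methods as needed.
        });
```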

NFC enhancements

The L Developer Preview adds these enhancements to enable wider and more flexible use of NFC:

  • Android Beam is now available in the share menu.
  • Your app can invoke Android Beam on the user’s device to share data by calling android.nfc.NfcAdapter.invokeBeam(). This avoids the need for the user to manually tap the device against another NFC-capable device to complete the data transfer.
  • You can use the new android.nfc.NdefRecord.createTextRecord() method to create an NDEF record containing UTF-8 text data.
  • If you are developing a payment app, you now have the ability to register an NFC application ID (AID) dynamically by calling android.nfc.cardemulation.CardEmulation.registerAidsForService(). You can also use android.nfc.cardemulation.CardEmulation.setPreferredService() to set the preferred card emulation service that should be used when a specific activity is in the foreground.
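For example, creating a text record is a one-liner with the new helper (the language-code parameter shown is an assumption about the preliminary signature):

```java
// Build an NDEF message containing a UTF-8 text record.
NdefRecord record = NdefRecord.createTextRecord("en", "Hello, NFC!");
NdefMessage message = new NdefMessage(record);
```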

Power Efficiency

Scheduling jobs

The L Developer Preview provides a new android.app.job.JobScheduler API that lets you optimize battery life by defining jobs for the system to run asynchronously at a later time or under specified conditions (such as when the device is charging). This is useful in such situations as:

  • The app has non-user-facing work that you can defer.
  • The app has work you'd prefer to do when the unit is plugged in.
  • The app has a task that requires network access (or requires a Wi-Fi connection).
  • The app has a number of tasks that you want to run as a batch on a regular schedule.

A unit of work is encapsulated by an android.app.job.JobInfo object. This object provides an exact description of the criteria to be used for scheduling.

Use the android.app.job.JobInfo.Builder to configure how the scheduled task should run. You can schedule the task to run under specific conditions, such as:

  • The device is charging
  • The device is connected to an unmetered network
  • The system deems the device to be idle
  • Completion with a minimum delay or within a specific deadline.

For example, you can add code like this to run your task on an unmetered network:

JobInfo uploadTask = new JobInfo.Builder(mJobId, mServiceComponent)
        .setRequiredNetworkCapabilities(JobInfo.NetworkType.UNMETERED)
        .build();

JobScheduler jobScheduler =
        (JobScheduler) context.getSystemService(Context.JOB_SCHEDULER_SERVICE);
jobScheduler.schedule(uploadTask);

To see an example of how to use the JobScheduler API, refer to the JobSchedulerSample implementation sample in this release.

Developer tools for power measurement

The L Developer Preview provides several new developer tools and APIs to help you better measure and understand your app's power usage.

batterystats

The dumpsys batterystats command allows you to generate interesting statistical data about battery usage on a device, organized by unique user ID (UID). The statistics generated by the tool include:

  • History of battery related events
  • Global statistics for the device
  • Approximated power use per UID and system component
  • Per-app mobile radio milliseconds per packet
  • System UID aggregated statistics
  • App UID aggregated statistics

Use the --help option to learn about the various options for tailoring the output. For example, to print battery usage statistics for a given app package since the device was last charged, run this command:

$ adb shell dumpsys batterystats --charged <package-name>

Battery Historian

The Battery Historian tool (historian.par) analyzes Android bug reports from the L Developer Preview and creates an HTML visualization of power-related events. It can also visualize power consumption data from a power monitor, and attempts to map power usage to the wake locks seen. You can find the Battery Historian tool in <sdk>/tools.

Figure 2. HTML visualization generated by the Battery Historian tool.

For best results, you should first enable full wake lock reporting, to allow the Battery Historian tool to monitor uninterrupted over an extended period of time:

$ adb shell dumpsys batterystats --enable full-wake-history

You should also reset battery statistics at the beginning of a measurement:

$ adb shell dumpsys batterystats --reset

To generate an HTML visualization:

$ historian.par [-p powerfile] bugreport.txt > out.html

Enterprise

Managed provisioning

Figure 3. Launcher screen showing managed apps (marked with a lock badge)

The L Developer Preview provides new functionality for running apps within an enterprise environment. A device administrator can initiate a managed provisioning process to add a co-present but separate managed profile to a device, if the user has an existing personal account. Apps that are associated with managed profiles will appear alongside non-managed apps in the user’s Launcher, Recent apps screen, and notifications.

To start the managed provisioning process, send ACTION_PROVISION_MANAGED_PROFILE in an Intent. If the call is successful, the system triggers the android.app.admin.DeviceAdminReceiver.onProfileProvisioningComplete() callback. You can then call android.app.admin.DevicePolicyManager.setProfileEnabled() to enable this managed profile.

If you are developing a Launcher app, you can use the new android.content.pm.LauncherApps class to get a list of launchable activities for the current user and any associated managed profiles. Your Launcher can make the managed apps visually prominent by appending a “work” badge to the icon drawable with android.os.UserManager.getBadgeDrawableForUser().

To see an example of how to use the new functionality, refer to the BasicManagedProfile implementation sample in this release.

Task locking

The L Developer Preview introduces a new task locking API that lets you temporarily restrict users from leaving your app or being interrupted by notifications. This could be used, for example, if you are developing an education app to support high stakes assessment requirements on Android. Once your app activates this mode, users will not be able to see notifications, access other apps, or return to the Home screen, until your app exits the mode.

To prevent unauthorized usage, only authorized apps can activate task locking. Furthermore, task locking authorization must be granted by a specially-configured device owner app, through the android.app.admin.DevicePolicyManager.setLockTaskComponents() method.

To set up a device owner, follow these steps:

  1. Attach a device running an Android userdebug build to your development machine.
  2. Install your device owner app.
  3. Create a device_owner.xml file and save it to the /data/system directory on the device.
    $ adb root
    $ adb shell stop
    $ rm /tmp/device_owner.xml
    $ echo "<?xml version='1.0' encoding='utf-8' standalone='yes' ?>" \
        >> /tmp/device_owner.xml
    $ echo "<device-owner package=\"<your_device_owner_package>\" \
        name=\"<your_organization_name>\" />" >> /tmp/device_owner.xml
    $ adb push /tmp/device_owner.xml /data/system/device_owner.xml
    $ adb reboot
    

Before using the task locking API in your app, verify that your activity is authorized by calling DevicePolicyManager.isLockTaskPermitted().

To activate task locking, call android.app.Activity.startLockTask() from your authorized activity.
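A sketch of the activation sequence from an authorized activity (the argument to isLockTaskPermitted() is preliminary and may differ in the final release):

```java
DevicePolicyManager dpm = (DevicePolicyManager)
        getSystemService(Context.DEVICE_POLICY_SERVICE);

// Only enter lock task mode if the device owner has authorized this app.
if (dpm.isLockTaskPermitted(getPackageName())) {   // preliminary argument form
    startLockTask();   // pin the user to this task
}

// Later, when the high-stakes activity (for example, an exam) is done:
stopLockTask();        // release the user
```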

When task locking is active, the following behavior takes effect:

  • The status bar is blank, and user notifications and status information are hidden.
  • The Home and Recent Apps buttons are hidden.
  • Other apps may not launch new activities.
  • The current app may start new activities, as long as doing so does not create new tasks.
  • The user remains locked on your app until an authorized activity calls Activity.stopLockTask().

Printing Framework

Render PDF as bitmap

You can now render PDF document pages into bitmap images for printing by using the new android.graphics.pdf.PdfRenderer class. You must specify a ParcelFileDescriptor that is seekable (that is, the content can be randomly accessed) on which the system writes the printable content. Your app can obtain a page for rendering with openPage(), then call render() to turn the opened PdfRenderer.Page into a bitmap. You can also set additional parameters if you only want to convert a portion of the document into a bitmap image (for example, to implement tiled rendering in order to zoom in on the document).
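A sketch of rendering the first page (the file path variable and render mode are illustrative; exception handling is omitted):

```java
ParcelFileDescriptor fd = ParcelFileDescriptor.open(
        new File(pdfPath), ParcelFileDescriptor.MODE_READ_ONLY);  // seekable source
PdfRenderer renderer = new PdfRenderer(fd);

PdfRenderer.Page page = renderer.openPage(0);
Bitmap bitmap = Bitmap.createBitmap(page.getWidth(), page.getHeight(),
        Bitmap.Config.ARGB_8888);

// Passing null for the clip and transform renders the whole page; pass a
// Rect and Matrix instead to render only a portion (for example, tiles).
page.render(bitmap, null, null, PdfRenderer.Page.RENDER_MODE_FOR_DISPLAY);

page.close();
renderer.close();
```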

Testing & Accessibility

Testing and accessibility improvements

The L Developer Preview adds the following support for testing and accessibility:

  • You can use the new android.app.UiAutomation.getWindowAnimationFrameStats() and android.app.UiAutomation.getWindowContentFrameStats() methods to capture frame statistics for window animations and content. This lets you write instrumentation tests to evaluate if the app under test is rendering frames at a sufficient refresh frequency to provide a smooth user experience.
  • You can execute shell commands from your instrumentation test with the new android.app.UiAutomation.executeShellCommand(). The command execution is similar to running adb shell from a host connected to the device. This allows you to use shell based tools such as dumpsys, am, content, and pm.
  • Accessibility services and test tools that use the accessibility APIs (such as uiautomator) can now retrieve detailed information about the properties of windows on the screen that sighted users can interact with. To retrieve a list of android.view.accessibility.AccessibilityWindowInfo objects representing the windows information, call the new android.accessibilityservice.AccessibilityService.getWindows() method.
  • You can use the new android.view.accessibility.AccessibilityNodeInfo.AccessibilityAction to define standard or customized actions to perform on an AccessibilityNodeInfo. The new AccessibilityAction class replaces the actions-related APIs previously found in AccessibilityNodeInfo.

IME

Easier switching between input languages

Beginning in the L Developer Preview, users can more easily switch between all input method editors (IMEs) supported by the platform. Performing the designated switching action (usually touching a Globe icon on the soft keyboard) will cycle among all such IMEs. This change takes place in InputMethodManager.shouldOfferSwitchingToNextInputMethod().

In addition, the framework now checks whether the next IME includes a switching mechanism at all (and, thus, whether that IME supports switching to the IME after it). An IME with a switching mechanism will not cycle to an IME without one. This change takes place in InputMethodManager.switchToNextInputMethod.

To see an example of how to use the updated IME-switching APIs, refer to the updated soft-keyboard implementation sample in this release.

Manifest Declarations

Declarable required features

The following values are now supported in the <uses-feature> element, so you can ensure that your app is installed only on devices that provide the features your app needs.

  • FEATURE_LEANBACK. Declares that your app must be installed only on devices that support the Android TV user interface. Example:
    <uses-feature android:name="android.software.leanback"
                  android:required="true" />
    
  • FEATURE_WEBVIEW. Declares that your app must only be installed on devices that fully implement the android.webkit.* APIs. Example:
    <uses-feature android:name="android.software.webview"
                  android:required="true" />
    

For a detailed view of all API changes in the L Developer Preview, see the API Differences Report.