Metrics are the main type of information extracted from your benchmarks. They are passed to the measureRepeated function as a List, which lets you specify multiple measured metrics at once. At least one metric is required for the benchmark to run.
The following snippet captures frame timing and custom trace section metrics.
Kotlin

```kotlin
benchmarkRule.measureRepeated(
    packageName = TARGET_PACKAGE,
    metrics = listOf(
        FrameTimingMetric(),
        TraceSectionMetric("RV CreateView"),
        TraceSectionMetric("RV OnBindView"),
    ),
    // ...
)
```
Java

```java
benchmarkRule.measureRepeated(
    /* packageName */ TARGET_PACKAGE,
    /* metrics */ Arrays.asList(
        new FrameTimingMetric(),
        new TraceSectionMetric("RV CreateView"),
        new TraceSectionMetric("RV OnBindView")
    ),
    /* iterations */ 5,
    // ...
);
```
Benchmark results are output to Android Studio, as shown in the following image. If multiple metrics are defined, all of them are combined in the output.
StartupTimingMetric
StartupTimingMetric captures app startup timing metrics with these values:
- timeToInitialDisplayMs – Time from the system receiving a launch intent until rendering the first frame of the destination Activity.
- timeToFullDisplayMs – Time from the system receiving a launch intent until the application reports itself fully drawn via the reportFullyDrawn() method. The measurement stops at the completion of rendering the first frame after (or containing) the reportFullyDrawn() call. This measurement may not be available on Android 10 (API level 29) and lower.
For more information about what contributes to application startup time, check the app startup time page.
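As a sketch of how these values are captured, a cold-startup benchmark using StartupTimingMetric might look like the following. TARGET_PACKAGE stands in for your app's package name, and the iteration count is illustrative:

```kotlin
@Test
fun coldStartup() = benchmarkRule.measureRepeated(
    packageName = TARGET_PACKAGE,
    metrics = listOf(StartupTimingMetric()),
    iterations = 5,
    startupMode = StartupMode.COLD,
) {
    // Return to the launcher so each iteration measures a fresh launch.
    pressHome()
    // Launch the default Activity and wait for its first frame,
    // which is what timeToInitialDisplayMs measures.
    startActivityAndWait()
}
```

If the app also calls reportFullyDrawn() once its content is ready, the same run reports timeToFullDisplayMs.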
FrameTimingMetric
FrameTimingMetric captures timing information from frames produced by a benchmark, such as scrolling or an animation, and outputs the following values:
- frameOverrunMs – How much time a given frame missed its deadline by. Positive numbers indicate a dropped frame and visible jank or stutter; negative numbers indicate how much faster than the deadline the frame was. Available only on Android 12 (API level 31) and higher.
- frameDurationCpuMs – How much time the frame took to be produced on the CPU, on both the UI thread and the RenderThread.
These measurements are collected as a distribution: the 50th, 90th, 95th, and 99th percentiles.
For more information on how to identify and improve slow frames, see Slow rendering.
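A minimal sketch of a benchmark that produces frames for FrameTimingMetric to measure by flinging a list. The "recycler" resource id is a hypothetical name for the app's scrolling list:

```kotlin
@Test
fun scrollFeed() = benchmarkRule.measureRepeated(
    packageName = TARGET_PACKAGE,
    metrics = listOf(FrameTimingMetric()),
    iterations = 5,
) {
    startActivityAndWait()
    // Find the scrolling list; "recycler" is an assumed resource id.
    val list = device.findObject(By.res(packageName, "recycler"))
    // Leave a margin so the gesture doesn't trigger system navigation.
    list.setGestureMargin(device.displayWidth / 5)
    // Fling to produce a burst of frames for the metric to record.
    list.fling(Direction.DOWN)
    device.waitForIdle()
}
```

The frame percentiles reported come only from frames rendered inside the measureRepeated block, so keep the gesture focused on the interaction you care about.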
TraceSectionMetric (experimental)
TraceSectionMetric captures the time taken by a trace section matching the provided sectionName and outputs the minimum, median, and maximum times in milliseconds. The trace section is defined either by the function call trace(sectionName) {} or by the code between Trace.beginSection(sectionName) and Trace.endSection() (or their async variants). It always selects the first instance of a trace section captured during a measurement.
For more information on tracing, see Overview of system tracing and Define custom events.
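For the section names in the earlier snippet to be captured, the app itself must emit matching trace sections. A minimal sketch using the androidx.tracing trace() function; the "RV OnBindView" label matches the snippet above, while the adapter and bind logic are illustrative:

```kotlin
import androidx.tracing.trace

override fun onBindViewHolder(holder: RowViewHolder, position: Int) {
    // Everything inside this block is attributed to the named
    // trace section, which TraceSectionMetric("RV OnBindView") reads.
    trace("RV OnBindView") {
        holder.bind(items[position])
    }
}
```

Because the metric selects the first instance of the section per measurement, name sections so that the occurrence you want to measure comes first, or give distinct names to distinct code paths.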
PowerMetric (experimental)
PowerMetric captures the change in power or energy over the duration of your test for the provided categories. Each selected category is broken down into its measurable subcomponents, whereas unselected categories are added to the "unselected" metric. Note that these metrics measure system-wide consumption, not per-app consumption, and are currently limited to Pixel 6 and Pixel 6 Pro devices.
- power<category>Uw – The amount of power consumed over the duration of your test in this category.
- energy<category>Uws – The amount of energy transferred per unit of time for the duration of your test in this category.
Categories include: CPU, DISPLAY, GPU, GPS, MEMORY, MACHINE_LEARNING, NETWORK, and UNCATEGORIZED.
With some categories, such as CPU, it can be hard to separate work done by other processes from work done by your own app. To minimize interference, remove or restrict unnecessary apps and accounts.
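As a sketch of selecting categories, assuming the experimental PowerMetric.Type.Energy API (which may change between library versions), a benchmark could request energy totals for CPU and DISPLAY while letting everything else fold into the "unselected" metric:

```kotlin
benchmarkRule.measureRepeated(
    packageName = TARGET_PACKAGE,
    metrics = listOf(
        PowerMetric(
            PowerMetric.Type.Energy(
                mapOf(
                    // Report these categories individually...
                    PowerCategory.CPU to PowerCategoryDisplayLevel.TOTAL,
                    PowerCategory.DISPLAY to PowerCategoryDisplayLevel.TOTAL,
                    // ...unselected categories are summed into "unselected".
                )
            )
        )
    ),
    iterations = 5,
) {
    // Exercise the behavior whose energy cost you want to measure.
}
```

Using PowerMetric.Type.Power() instead would report the power variants of the same categories.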