Microbenchmark Instrumentation Arguments


Configure the behavior of your benchmarks by specifying different arguments to the instrumentation runner. These can either be applied to your Gradle configuration or added directly when running instrumentation from the command line.
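As a sketch of the general pattern, instrumentation arguments can be passed through a Gradle property or with `-e` flags when invoking `am instrument` directly (the `:benchmark` module name, test package, and runner class below are placeholders for your own):

```shell
# General pattern: pass any of the arguments below through Gradle
# (the ":benchmark" module name is an assumption; substitute your own).
./gradlew :benchmark:connectedCheck \
    -P "android.testInstrumentationRunnerArguments.<argument>=<value>"

# Or invoke the instrumentation directly with adb
# (test package and runner class are placeholders):
adb shell am instrument -w \
    -e <argument> <value> \
    com.example.benchmark.test/androidx.test.runner.AndroidJUnitRunner
```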


additionalTestOutputDir

Configures where JSON benchmark reports and profiling results are saved on the device.

  • Argument type: file path string
  • Defaults to: test APK's external directory
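For example, reports can be redirected to a specific on-device directory when running from Gradle (the module name and path below are illustrative):

```shell
# Write benchmark reports and profiling results to a custom on-device
# directory (":benchmark" module name and path are illustrative).
./gradlew :benchmark:connectedCheck \
    -P android.testInstrumentationRunnerArguments.additionalTestOutputDir=/sdcard/Download/benchmark-reports
```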


androidx.benchmark.dryRunMode.enable

Runs each benchmark in a single loop to verify that it works correctly. Dry runs can be used alongside regular tests as part of verification.

  • Argument type: boolean
  • Defaults to: false
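A quick validity pass over all benchmarks might look like the following (the `:benchmark` module name is an assumption):

```shell
# Run every benchmark once as a quick correctness check, skipping
# warmup and repeated measurement.
./gradlew :benchmark:connectedCheck \
    -P android.testInstrumentationRunnerArguments.androidx.benchmark.dryRunMode.enable=true
```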


androidx.benchmark.iterations

Overrides time-driven target iteration counts to ensure a consistent amount of work. This is typically only useful with profiling enabled (see Profiling), to guarantee that a consistent quantity of work is performed within a profiling trace when comparing different implementations or runs. In other scenarios, it will likely reduce the accuracy and stability of measurements.

  • Argument type: integer
  • Defaults to: not specified
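One way to combine a fixed iteration count with a profiling trace, so that compared runs perform the same amount of work (module name and iteration count are illustrative):

```shell
# Pin each benchmark to 50 iterations while capturing a method trace,
# so traces from different runs are directly comparable.
./gradlew :benchmark:connectedCheck \
    -P android.testInstrumentationRunnerArguments.androidx.benchmark.iterations=50 \
    -P android.testInstrumentationRunnerArguments.androidx.benchmark.profiling.mode=MethodTracing
```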


androidx.benchmark.output.enable

Enables writing the result JSON file to external storage.

  • Argument type: boolean
  • Defaults to: true
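For quick local iteration, the JSON output can be disabled (the `:benchmark` module name is an assumption):

```shell
# Skip writing the result JSON file to external storage.
./gradlew :benchmark:connectedCheck \
    -P android.testInstrumentationRunnerArguments.androidx.benchmark.output.enable=false
```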


androidx.benchmark.profiling.mode

Allows capturing trace files while running the benchmarks. See Profile the benchmarks for the available options.

  • Argument type: string
  • Available options:
    • MethodTracing
    • StackSampling
    • None
  • Defaults to: None
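For example, sampled stack traces can be captured alongside the measurements (the `:benchmark` module name is an assumption):

```shell
# Capture a sampled stack trace for each benchmark in addition to
# its timing measurements.
./gradlew :benchmark:connectedCheck \
    -P android.testInstrumentationRunnerArguments.androidx.benchmark.profiling.mode=StackSampling
```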


androidx.benchmark.suppressErrors

Accepts a comma-separated list of errors to turn into warnings.

  • Argument type: list of strings
  • Defaults to: an empty list
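For instance, configuration errors reported by the library can be downgraded to warnings when benchmarking in a known-imperfect environment (the error names and module name below are illustrative):

```shell
# Downgrade the EMULATOR and DEBUGGABLE configuration errors to warnings,
# e.g. for a rough run on an emulator (error names are examples of the
# library's diagnostics).
./gradlew :benchmark:connectedCheck \
    -P android.testInstrumentationRunnerArguments.androidx.benchmark.suppressErrors=EMULATOR,DEBUGGABLE
```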

androidx.benchmark.startupMode.enable (Deprecated)

Reconfigures looping behavior to support benchmarking code during startup. Benchmarks run without warmup, looping for 10 measurements. To minimize overhead in microbenchmarks, loop averaging is disabled.

  • Argument type: boolean
  • Defaults to: false