Added in API level 35

Builder


class Builder
kotlin.Any
   ↳ android.adservices.ondevicepersonalization.InferenceInput.Builder

A builder for InferenceInput

Summary

Public constructors
Builder(params: InferenceInput.Params, inputData: Array<Any!>, expectedOutputStructure: InferenceOutput)

Note: use android.adservices.ondevicepersonalization.InferenceInput.Builder#Builder(android.adservices.ondevicepersonalization.InferenceInput.Params,byte[]) instead.

Public methods
InferenceInput
build()

Builds the instance.

InferenceInput.Builder
setBatchSize(value: Int)

The number of input examples.

InferenceInput.Builder
setExpectedOutputStructure(value: InferenceOutput)

The empty InferenceOutput representing the expected output structure.

InferenceInput.Builder
setInputData(vararg value: Any!)

Note: use android.adservices.ondevicepersonalization.InferenceInput.Builder#setInputData(byte[]) instead.

InferenceInput.Builder
setParams(value: InferenceInput.Params)

The configuration that controls runtime interpreter behavior.

Public constructors

Builder

Added in API level 35
Builder(
    params: InferenceInput.Params,
    inputData: Array<Any!>,
    expectedOutputStructure: InferenceOutput)

Note: use android.adservices.ondevicepersonalization.InferenceInput.Builder#Builder(android.adservices.ondevicepersonalization.InferenceInput.Params,byte[]) instead.

Creates a new Builder for LiteRT model inference input. For LiteRT, the inputData field is mapped to the inputs of runForMultipleInputsOutputs: https://www.tensorflow.org/lite/api_docs/java/org/tensorflow/lite/InterpreterApi#parameters_9 The inputs should be in the same order as the inputs of the model.

For example, if a model takes multiple inputs:

<code>String[] input0 = {"foo", "bar"}; // string tensor shape is [2].
  int[] input1 = new int[]{3, 2, 1}; // int tensor shape is [3].
  Object[] inputData = {input0, input1, ...};
  </code>
For LiteRT, the inference code will verify that the expected output structure matches the model's output signature.

If a model produces string tensors:

<code>String[][] output = new String[3][2];  // Output tensor shape is [3, 2].
  HashMap&lt;Integer, Object&gt; outputs = new HashMap&lt;&gt;();
  outputs.put(0, output);
  expectedOutputStructure = new InferenceOutput.Builder().setDataOutputs(outputs).build();
  </code>
Parameters
params InferenceInput.Params: configuration that controls runtime interpreter behavior. This value cannot be null.
inputData Array<Any!>: an array of input data. This value cannot be null.
expectedOutputStructure InferenceOutput: an empty InferenceOutput representing the expected output structure. This value cannot be null.
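The shapes the constructor expects can be sketched with plain Java arrays. This is an illustrative sketch only: the Android builder calls appear as comments because InferenceInput is not available off-device, and names like input0 and params are hypothetical.

```java
import java.util.HashMap;

public class InferenceInputSketch {
    public static void main(String[] args) {
        // One entry per model input, in the model's input order.
        String[] input0 = {"foo", "bar"};   // string tensor, shape [2]
        int[] input1 = {3, 2, 1};           // int tensor, shape [3]
        Object[] inputData = {input0, input1};

        // Empty buffer mirroring the model's output signature: [3, 2] strings.
        String[][] output = new String[3][2];
        HashMap<Integer, Object> outputs = new HashMap<>();
        outputs.put(0, output);

        // On Android (sketch, not runnable here):
        // InferenceOutput expected = new InferenceOutput.Builder()
        //         .setDataOutputs(outputs).build();
        // InferenceInput input =
        //         new InferenceInput.Builder(params, inputData, expected).build();
        System.out.println(inputData.length); // one element per model input
    }
}
```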

Public methods

build

Added in API level 35
fun build(): InferenceInput

Builds the instance. This builder should not be reused after calling build().

Return
InferenceInput This value cannot be null.

setBatchSize

Added in API level 35
fun setBatchSize(value: Int): InferenceInput.Builder

The number of input examples. Callers can set this field to run batched inference. The batch size is 1 by default and must match the size of the input data.

Return
InferenceInput.Builder This value cannot be null.
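The relationship between the batch size and the input data can be sketched in plain Java (the setBatchSize call appears only as a comment, since the builder is an Android-only class; the feature dimension of 10 is an arbitrary illustration):

```java
public class BatchSizeSketch {
    public static void main(String[] args) {
        int batchSize = 4;
        // For batched inference, the leading dimension of each input tensor
        // must equal the value passed to setBatchSize(...).
        float[][] features = new float[batchSize][10]; // shape [4, 10]
        Object[] inputData = {features};
        // On Android (sketch): builder.setBatchSize(batchSize)
        System.out.println(((float[][]) inputData[0]).length); // leading dim = 4
    }
}
```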

setExpectedOutputStructure

Added in API level 35
fun setExpectedOutputStructure(value: InferenceOutput): InferenceInput.Builder

The empty InferenceOutput representing the expected output structure. It is only required for LiteRT models. For LiteRT, the inference code will verify that this expected output structure matches the model's output signature.

If a model produces string tensors:

<code>String[][] output = new String[3][2];  // Output tensor shape is [3, 2].
  HashMap&lt;Integer, Object&gt; outputs = new HashMap&lt;&gt;();
  outputs.put(0, output);
  expectedOutputStructure = new InferenceOutput.Builder().setDataOutputs(outputs).build();
  </code>
Parameters
value InferenceOutput: This value cannot be null.
Return
InferenceInput.Builder This value cannot be null.
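A rough idea of the structure check the runtime performs can be sketched in plain Java. The matchesShape helper below is hypothetical, not part of the API; it only illustrates that the empty buffers in the map must agree with the model's output shapes.

```java
import java.util.HashMap;
import java.util.Map;

public class OutputStructureSketch {
    // Hypothetical helper: does a buffer's shape match the expected [rows, cols]?
    static boolean matchesShape(Object buffer, int rows, int cols) {
        if (!(buffer instanceof String[][])) return false;
        String[][] b = (String[][]) buffer;
        return b.length == rows && b[0].length == cols;
    }

    public static void main(String[] args) {
        // Empty buffer sized to the model's output signature: [3, 2] strings.
        Map<Integer, Object> outputs = new HashMap<>();
        outputs.put(0, new String[3][2]);
        System.out.println(matchesShape(outputs.get(0), 3, 2)); // true
        System.out.println(matchesShape(outputs.get(0), 2, 3)); // false
    }
}
```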

setInputData

Added in API level 35
fun setInputData(vararg value: Any!): InferenceInput.Builder

Note: use android.adservices.ondevicepersonalization.InferenceInput.Builder#setInputData(byte[]) instead.

An array of input data. The inputs should be in the same order as the inputs of the model.

For example, if a model takes multiple inputs:

<code>String[] input0 = {"foo", "bar"}; // string tensor shape is [2].
  int[] input1 = new int[]{3, 2, 1}; // int tensor shape is [3].
  Object[] inputData = {input0, input1, ...};
  </code>
For LiteRT, this field is mapped to inputs of runForMultipleInputsOutputs: https://www.tensorflow.org/lite/api_docs/java/org/tensorflow/lite/InterpreterApi#parameters_9
Parameters
value Any!: This value cannot be null.
Return
InferenceInput.Builder This value cannot be null.

setParams

Added in API level 35
fun setParams(value: InferenceInput.Params): InferenceInput.Builder

The configuration that controls runtime interpreter behavior.

Parameters
value InferenceInput.Params: This value cannot be null.
Return
InferenceInput.Builder This value cannot be null.