Builder
class Builder

kotlin.Any
   ↳ android.adservices.ondevicepersonalization.InferenceInput.Builder
A builder for InferenceInput
Summary
Public constructors

| Constructor |
|---|
| Builder(params: InferenceInput.Params, inputData: Array<Any!>, expectedOutputStructure: InferenceOutput) Creates a new Builder. |
Public methods

| Return type | Method |
|---|---|
| InferenceInput | build() Builds the instance. |
| InferenceInput.Builder | setBatchSize(value: Int) The number of input examples. |
| InferenceInput.Builder | setExpectedOutputStructure(value: InferenceOutput) The empty InferenceOutput representing the expected output structure. |
| InferenceInput.Builder | setInputData(vararg value: Any!) An array of input data. |
| InferenceInput.Builder | setParams(value: InferenceInput.Params) The configuration that controls runtime interpreter behavior. |
Public constructors
Builder
Builder(
params: InferenceInput.Params,
inputData: Array<Any!>,
expectedOutputStructure: InferenceOutput)
Creates a new Builder.
Parameters

| Parameter | Description |
|---|---|
| params | InferenceInput.Params: The configuration that controls runtime interpreter behavior. This value cannot be null. |
| inputData | Array<Any!>: An array of input data. The inputs should be in the same order as the inputs of the model. For TFLite, this field is mapped to the inputs of runForMultipleInputsOutputs: https://www.tensorflow.org/lite/api_docs/java/org/tensorflow/lite/InterpreterApi#parameters_9. This value cannot be null. See the first example below. |
| expectedOutputStructure | InferenceOutput: The empty InferenceOutput representing the expected output structure. For TFLite, the inference code will verify whether this expected output structure matches the model output signature. This value cannot be null. See the second example below. |

For example, if a model takes multiple inputs (inputData):

```java
String[] input0 = {"foo", "bar"};   // String tensor, shape is [2].
int[] input1 = new int[]{3, 2, 1};  // Int tensor, shape is [3].
Object[] inputData = {input0, input1, ...};
```

If a model produces string tensors (expectedOutputStructure):

```java
String[][] output = new String[3][2]; // Output tensor shape is [3, 2].
HashMap<Integer, Object> outputs = new HashMap<>();
outputs.put(0, output);
expectedOutputStructure = new InferenceOutput.Builder().setDataOutputs(outputs).build();
```
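For the Kotlin API specifically, here is a minimal end-to-end sketch of wiring these pieces together. It assumes an InferenceInput.Params instance (`params`) has already been built elsewhere, and that the model takes the two inputs above and produces one string tensor of shape [3, 2]; it is an illustration, not a prescribed pattern.

```kotlin
import android.adservices.ondevicepersonalization.InferenceInput
import android.adservices.ondevicepersonalization.InferenceOutput

// Sketch only: `params` is assumed to be an InferenceInput.Params built elsewhere.
fun buildInferenceInput(params: InferenceInput.Params): InferenceInput {
    val input0 = arrayOf("foo", "bar")   // string tensor, shape [2]
    val input1 = intArrayOf(3, 2, 1)     // int tensor, shape [3]

    // Empty structure describing the expected output: one string tensor of shape [3, 2].
    val expectedOutputStructure = InferenceOutput.Builder()
        .setDataOutputs(mapOf(0 to Array(3) { arrayOfNulls<String>(2) }))
        .build()

    return InferenceInput.Builder(params, arrayOf<Any>(input0, input1), expectedOutputStructure)
        .build()
}
```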
Public methods
build
fun build(): InferenceInput
Builds the instance.
Return

| Type | Description |
|---|---|
| InferenceInput | This value cannot be null. |
setBatchSize
fun setBatchSize(value: Int): InferenceInput.Builder
The number of input examples. Adopters can set this field to run batching inference. The batch size is 1 by default, and it should match the size of the input data.
Return

| Type | Description |
|---|---|
| InferenceInput.Builder | This value cannot be null. |
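As an illustration only, a hedged sketch of batching two examples in a single call, assuming the model's first input is a float tensor whose leading dimension is the batch dimension; `params` and `expectedOutputStructure` are assumed to already exist.

```kotlin
// Sketch: two examples batched along the leading dimension, so setBatchSize(2)
// matches the size of the input data.
val batchedInput = arrayOf(
    floatArrayOf(1.0f, 2.0f),  // example 0
    floatArrayOf(3.0f, 4.0f)   // example 1
)

val inferenceInput = InferenceInput.Builder(params, arrayOf<Any>(batchedInput), expectedOutputStructure)
    .setBatchSize(2)
    .build()
```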
setExpectedOutputStructure
fun setExpectedOutputStructure(value: InferenceOutput): InferenceInput.Builder
The empty InferenceOutput representing the expected output structure. For TFLite, the inference code will verify whether this expected output structure matches the model output signature.

If a model produces string tensors:

```java
String[][] output = new String[3][2]; // Output tensor shape is [3, 2].
HashMap<Integer, Object> outputs = new HashMap<>();
outputs.put(0, output);
expectedOutputStructure = new InferenceOutput.Builder().setDataOutputs(outputs).build();
```
Parameters

| Parameter | Description |
|---|---|
| value | InferenceOutput: This value cannot be null. |

Return

| Type | Description |
|---|---|
| InferenceInput.Builder | This value cannot be null. |
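A Kotlin rendering of the Java snippet above, as a sketch: the map key 0 denotes the model's first output tensor, and `builder` stands in for an existing InferenceInput.Builder.

```kotlin
// Placeholder with the shape of the expected output; only the structure is used
// to check the model's output signature, the values are not read.
val output = Array(3) { arrayOfNulls<String>(2) }   // string tensor, shape [3, 2]
val expectedOutputStructure = InferenceOutput.Builder()
    .setDataOutputs(mapOf(0 to output))
    .build()

builder.setExpectedOutputStructure(expectedOutputStructure)
```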
setInputData
fun setInputData(vararg value: Any!): InferenceInput.Builder
An array of input data. The inputs should be in the same order as the inputs of the model.

For example, if a model takes multiple inputs:

```java
String[] input0 = {"foo", "bar"};   // String tensor, shape is [2].
int[] input1 = new int[]{3, 2, 1};  // Int tensor, shape is [3].
Object[] inputData = {input0, input1, ...};
```

For TFLite, this field is mapped to the inputs of runForMultipleInputsOutputs: https://www.tensorflow.org/lite/api_docs/java/org/tensorflow/lite/InterpreterApi#parameters_9
Parameters

| Parameter | Description |
|---|---|
| value | Any!: This value cannot be null. |

Return

| Type | Description |
|---|---|
| InferenceInput.Builder | This value cannot be null. |
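The same multi-input example in Kotlin, as a sketch; the vararg arguments are passed in the same order as the model's inputs, and `builder` stands in for an existing InferenceInput.Builder.

```kotlin
val input0 = arrayOf("foo", "bar")  // string tensor, shape [2] -> model input 0
val input1 = intArrayOf(3, 2, 1)    // int tensor, shape [3]    -> model input 1

// Inputs are supplied in model order.
builder.setInputData(input0, input1)
```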
setParams
fun setParams(value: InferenceInput.Params): InferenceInput.Builder
The configuration that controls runtime interpreter behavior.
Parameters

| Parameter | Description |
|---|---|
| value | InferenceInput.Params: This value cannot be null. |

Return

| Type | Description |
|---|---|
| InferenceInput.Builder | This value cannot be null. |
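As a hedged sketch, a Params instance is typically built from a KeyValueStore available to the isolated service and the key under which the model is stored; `remoteData` and the model key below are placeholders, not values defined by this class.

```kotlin
// Assumption: `remoteData` is a KeyValueStore handed to the IsolatedWorker, and
// "my_model" is the key of the TFLite model in that store.
val params = InferenceInput.Params.Builder(remoteData, "my_model").build()

builder.setParams(params)
```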