InferenceInput
class InferenceInput
| kotlin.Any | |
|---|---|
| ↳ | android.adservices.ondevicepersonalization.InferenceInput |
Contains all the information needed for a run of model inference. This is the input to android.adservices.ondevicepersonalization.ModelManager#run.
Summary
| Nested classes | |
|---|---|
| Builder | A builder for InferenceInput. |
| Params | The configuration that controls runtime interpreter behavior. |
| Public methods | |
|---|---|
| Boolean | equals(other: Any?) Indicates whether some other object is "equal to" this one. |
| Int | getBatchSize() The number of input examples. |
| ByteArray | getData() A byte array that holds input data. |
| InferenceOutput | getExpectedOutputStructure() The empty InferenceOutput representing the expected output structure. |
| Array<Any!> | getInputData() Note: use InferenceInput.getData() instead. |
| InferenceInput.Params | getParams() The configuration that controls runtime interpreter behavior. |
| Int | hashCode() |
Public methods
equals
fun equals(other: Any?): Boolean
Indicates whether some other object is "equal to" this one.
The equals method implements an equivalence relation on non-null object references:
- It is reflexive: for any non-null reference value x, x.equals(x) should return true.
- It is symmetric: for any non-null reference values x and y, x.equals(y) should return true if and only if y.equals(x) returns true.
- It is transitive: for any non-null reference values x, y, and z, if x.equals(y) returns true and y.equals(z) returns true, then x.equals(z) should return true.
- It is consistent: for any non-null reference values x and y, multiple invocations of x.equals(y) consistently return true or consistently return false, provided no information used in equals comparisons on the objects is modified.
- For any non-null reference value x, x.equals(null) should return false.
An equivalence relation partitions the elements it operates on into equivalence classes; all the members of an equivalence class are equal to each other. Members of an equivalence class are substitutable for each other, at least for some purposes.
| Parameters | |
|---|---|
| other | The reference object with which to compare. This value may be null. |
| Return | |
|---|---|
| Boolean | true if this object is the same as the other argument; false otherwise. |
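The contract above can be spot-checked with a minimal Kotlin sketch; checkEqualsContract is a hypothetical helper, not part of the API:

```kotlin
// Hypothetical helper: spot-checks the equals() contract for two non-null references.
// Kotlin's == operator delegates to equals() for non-null receivers.
fun <T : Any> checkEqualsContract(x: T, y: T) {
    check(x == x)                       // reflexive
    check((x == y) == (y == x))         // symmetric
    check(!x.equals(null))              // x.equals(null) must return false
}
```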
getBatchSize
fun getBatchSize(): Int
The number of input examples. Callers can set this field to run batched inference. The batch size is 1 by default and should match the number of examples in the input data.
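For example, a minimal Kotlin sketch (shapes and names are illustrative, not from this reference) of packing three examples for a model whose single input tensor has shape [batch, 4]; the batch size supplied to the builder is assumed to mirror this getter and must equal the number of packed examples:

```kotlin
// Illustrative only: a model with one float input tensor of shape [batch, 4].
val batchSize = 3
val featureDim = 4
// Exactly batchSize examples are packed in model order, so the batch size and
// the input data size stay in sync.
val examples: Array<FloatArray> = Array(batchSize) { FloatArray(featureDim) }
```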
getData
fun getData(): ByteArray
A byte array that holds input data. The inputs should be in the same order as the inputs of the model.
For LiteRT, this field is mapped to the inputs of runForMultipleInputsOutputs: https://www.tensorflow.org/lite/api_docs/java/org/tensorflow/lite/InterpreterApi#parameters_9
```java
String[] input0 = {"foo", "bar"}; // string tensor shape is [2].
int[] input1 = new int[]{3, 2, 1}; // int tensor shape is [3].
Object[] inputData = {input0, input1, ...};
byte[] data = serializeObject(inputData);
```
For an Executorch model, this field is a serialized EValue array.
| Return | |
|---|---|
| ByteArray | This value cannot be null. | 
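A Kotlin sketch of the same packing; serializeInputs below is a hypothetical placeholder for the caller-defined serialization that produces the byte array returned by getData():

```kotlin
// The inputs, in the same order as the model's inputs.
val input0 = arrayOf("foo", "bar")   // string tensor, shape [2]
val input1 = intArrayOf(3, 2, 1)     // int tensor, shape [3]
val orderedInputs: Array<Any> = arrayOf(input0, input1)
// serializeInputs is hypothetical; it stands in for the caller-defined
// serialization of the ordered inputs into a byte array.
// val data: ByteArray = serializeInputs(orderedInputs)
```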
getExpectedOutputStructure
fun getExpectedOutputStructure(): InferenceOutput
The empty InferenceOutput representing the expected output structure. For LiteRT, the inference code will verify whether this expected output structure matches the model's output signature.
If a model produces string tensors:
```java
String[][] output = new String[3][2]; // Output tensor shape is [3, 2].
HashMap<Integer, Object> outputs = new HashMap<>();
outputs.put(0, output);
InferenceOutput expectedOutputStructure =
    new InferenceOutput.Builder().setDataOutputs(outputs).build();
```
| Return | |
|---|---|
| InferenceOutput | This value cannot be null. | 
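A Kotlin sketch of the same expected output structure, relying only on the InferenceOutput.Builder#setDataOutputs call shown in the example above:

```kotlin
import android.adservices.ondevicepersonalization.InferenceOutput

// An empty output structure for a model whose single output is a [3, 2] string tensor.
val output = Array(3) { arrayOfNulls<String>(2) }   // output tensor shape is [3, 2]
val outputs = hashMapOf<Int, Any>(0 to output)
val expectedOutputStructure: InferenceOutput = InferenceOutput.Builder()
    .setDataOutputs(outputs)
    .build()
```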
getInputData
fun getInputData(): Array<Any!>
Note: use InferenceInput.getData() instead. 
An array of input data. The inputs should be in the same order as the inputs of the model.
For example, if a model takes multiple inputs:
```java
String[] input0 = {"foo", "bar"}; // string tensor shape is [2].
int[] input1 = new int[]{3, 2, 1}; // int tensor shape is [3].
Object[] inputData = {input0, input1, ...};
```
| Return | |
|---|---|
| Array<Any!> | This value cannot be null. | 
getParams
fun getParams(): InferenceInput.Params
The configuration that controls runtime interpreter behavior.
| Return | |
|---|---|
| InferenceInput.Params | This value cannot be null. | 
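The Params API surface is not reproduced on this page. As a rough sketch only (the Params.Builder constructor and the setRecommendedNumThreads setter below are assumptions, not confirmed by this reference), a caller might build the interpreter configuration from the key-value store that holds the model:

```kotlin
import android.adservices.ondevicepersonalization.InferenceInput
import android.adservices.ondevicepersonalization.KeyValueStore

// Assumed API shape, not confirmed by this page: Params.Builder takes the
// KeyValueStore holding the model plus the model's key, and exposes setters
// such as setRecommendedNumThreads.
fun buildParams(modelStore: KeyValueStore, modelKey: String): InferenceInput.Params =
    InferenceInput.Params.Builder(modelStore, modelKey)
        .setRecommendedNumThreads(4) // assumed setter; tune for the device
        .build()
```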
