Added in API level 35
ModelManager
open class ModelManager
| | |
|---|---|
| kotlin.Any | |
| ↳ | android.adservices.ondevicepersonalization.ModelManager |
Handles model inference; currently only TFLite model inference is supported. See android.adservices.ondevicepersonalization.IsolatedService#getModelManager.
Summary
| Public methods | |
|---|---|
| open Unit | run(input: InferenceInput, executor: Executor, receiver: OutcomeReceiver<InferenceOutput!, Exception!>) Run a single model inference. |
Public methods
run
Added in API level 35
open fun run(
input: InferenceInput,
executor: Executor,
receiver: OutcomeReceiver<InferenceOutput!, Exception!>
): Unit
Run a single model inference. Currently only TFLite model inference is supported.
This method may take several seconds to complete, so it should only be called from a worker thread.
| Parameters | |
|---|---|
| input | InferenceInput: contains all the information needed for a run of model inference. This value cannot be null. |
| executor | Executor: the Executor on which to invoke the callback. This value cannot be null. Callback and listener events are dispatched through this Executor, providing an easy way to control which thread is used. To dispatch events through the main thread of your application, you can use Context.getMainExecutor(). Otherwise, provide an Executor that dispatches to an appropriate thread. |
| receiver | OutcomeReceiver<InferenceOutput!, Exception!>: returns an InferenceOutput containing the model inference result, or an Exception on failure. This value cannot be null. |
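A minimal sketch of calling run from code that already holds a ModelManager (obtained via IsolatedService#getModelManager, per the class description). The input construction is elided because it depends on your model; the single-thread executor choice and the way the result is consumed are illustrative assumptions, not the only valid pattern.

```kotlin
import android.adservices.ondevicepersonalization.InferenceInput
import android.adservices.ondevicepersonalization.InferenceOutput
import android.adservices.ondevicepersonalization.ModelManager
import android.os.OutcomeReceiver
import java.util.concurrent.Executors

// Hedged sketch, assuming `modelManager` came from
// IsolatedService#getModelManager and `input` describes a TFLite model.
fun runInference(modelManager: ModelManager, input: InferenceInput) {
    // run() may take several seconds; the callback is dispatched on this
    // executor, keeping work off the main thread.
    val executor = Executors.newSingleThreadExecutor()
    modelManager.run(
        input,
        executor,
        object : OutcomeReceiver<InferenceOutput, Exception> {
            override fun onResult(result: InferenceOutput) {
                // result holds the model inference output; consume it here.
            }

            override fun onError(error: Exception) {
                // Inference failed; log or surface the error.
            }
        }
    )
}
```

Because events are dispatched through the supplied Executor, callers who need the result on the main thread can pass Context.getMainExecutor() instead.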