Builder
class Builder
kotlin.Any
   ↳ android.adservices.ondevicepersonalization.InferenceInput.Params.Builder

A builder for `Params`.
Summary
Public constructors

| Constructor | Description |
|---|---|
| `Builder(keyValueStore: KeyValueStore, modelKey: String)` | Creates a new Builder. |

Public methods

| Return type | Method and description |
|---|---|
| `InferenceInput.Params` | `build()` Builds the instance. |
| `InferenceInput.Params.Builder` | `setDelegateType(value: Int)` The delegate to run model inference. |
| `InferenceInput.Params.Builder` | `setKeyValueStore(value: KeyValueStore)` A `KeyValueStore` where the pre-trained model is stored. |
| `InferenceInput.Params.Builder` | `setModelKey(value: String)` The key of the table where the corresponding value stores a pre-trained model. |
| `InferenceInput.Params.Builder` | `setModelType(value: Int)` The type of the pre-trained model. |
| `InferenceInput.Params.Builder` | `setRecommendedNumThreads(value: Int)` The number of threads used for intra-op parallelism on CPU; must be a positive number. |
Public constructors
Builder
Builder(
keyValueStore: KeyValueStore,
modelKey: String)
Creates a new Builder.
| Parameter | Description |
|---|---|
| `keyValueStore` | `KeyValueStore`: A `KeyValueStore` where the pre-trained model is stored. Only TFLite models are currently supported. This value cannot be null. |
| `modelKey` | `String`: The key of the table where the corresponding value stores a pre-trained model. Only TFLite models are currently supported. This value cannot be null. |
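As a minimal sketch of constructing `Params` with only the required arguments: the `remoteData` parameter stands in for a `KeyValueStore` supplied by the on-device personalization framework, and `"model_key"` is an illustrative table key, not a real value.

```kotlin
import android.adservices.ondevicepersonalization.InferenceInput
import android.adservices.ondevicepersonalization.KeyValueStore

// remoteData: a KeyValueStore handle assumed to be provided by the ODP framework.
// "model_key" is a hypothetical key under which a TFLite model was stored.
fun buildParams(remoteData: KeyValueStore): InferenceInput.Params =
    InferenceInput.Params.Builder(remoteData, "model_key")
        .build()
```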
Public methods
build
fun build(): InferenceInput.Params
Builds the instance.
| Return type | Description |
|---|---|
| `InferenceInput.Params` | This value cannot be null. |
setDelegateType
fun setDelegateType(value: Int): InferenceInput.Params.Builder
The delegate to run model inference. If not set, the default value is `DELEGATE_CPU`.

| Parameter | Description |
|---|---|
| `value` | `Int`: Value is `android.adservices.ondevicepersonalization.InferenceInput.Params#DELEGATE_CPU` |

| Return type | Description |
|---|---|
| `InferenceInput.Params.Builder` | This value cannot be null. |
setKeyValueStore
fun setKeyValueStore(value: KeyValueStore): InferenceInput.Params.Builder
A `KeyValueStore` where the pre-trained model is stored. Only TFLite models are currently supported.

| Parameter | Description |
|---|---|
| `value` | `KeyValueStore`: This value cannot be null. |

| Return type | Description |
|---|---|
| `InferenceInput.Params.Builder` | This value cannot be null. |
setModelKey
fun setModelKey(value: String): InferenceInput.Params.Builder
The key of the table where the corresponding value stores a pre-trained model. Only TFLite models are currently supported.

| Parameter | Description |
|---|---|
| `value` | `String`: This value cannot be null. |

| Return type | Description |
|---|---|
| `InferenceInput.Params.Builder` | This value cannot be null. |
setModelType
fun setModelType(value: Int): InferenceInput.Params.Builder
The type of the pre-trained model. If not set, the default value is `MODEL_TYPE_TENSORFLOW_LITE`. Only `MODEL_TYPE_TENSORFLOW_LITE` is supported for now.

| Parameter | Description |
|---|---|
| `value` | `Int`: Value is `android.adservices.ondevicepersonalization.InferenceInput.Params#MODEL_TYPE_TENSORFLOW_LITE` |

| Return type | Description |
|---|---|
| `InferenceInput.Params.Builder` | This value cannot be null. |
setRecommendedNumThreads
fun setRecommendedNumThreads(value: Int): InferenceInput.Params.Builder
The number of threads used for intra-op parallelism on CPU; must be a positive number. Adopters can set this field based on the model architecture. The actual number of threads used depends on system resources and other constraints.

| Parameter | Description |
|---|---|
| `value` | `Int`: Value is 1 or greater |

| Return type | Description |
|---|---|
| `InferenceInput.Params.Builder` | This value cannot be null. |
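The setters above can be chained in one builder call. The sketch below assumes a `KeyValueStore` handle (`remoteData`) from the ODP framework and uses an illustrative model key and thread count; the delegate and model type are set to the documented defaults, so those two calls are optional.

```kotlin
import android.adservices.ondevicepersonalization.InferenceInput
import android.adservices.ondevicepersonalization.KeyValueStore

fun buildTunedParams(remoteData: KeyValueStore): InferenceInput.Params =
    InferenceInput.Params.Builder(remoteData, "model_key")   // "model_key" is illustrative
        .setDelegateType(InferenceInput.Params.DELEGATE_CPU) // documented default
        .setModelType(InferenceInput.Params.MODEL_TYPE_TENSORFLOW_LITE) // only supported type
        .setRecommendedNumThreads(4) // must be >= 1; actual count is system-dependent
        .build()
```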