Enum DeviceType
Types of devices that Inference Engine uses to run inference on a neural network.
Namespace: Unity.InferenceEngine
Assembly: Unity.InferenceEngine.dll
Syntax
[MovedFrom("Unity.Sentis")]
public enum DeviceType
Remarks
Inference Engine can run inference on either the GPU or the CPU. Performance depends on the size of the model and on the type of problem it solves. Smaller models may run faster on the CPU, because the overhead of dispatching work to the GPU can outweigh the benefit of parallel execution.
Examples
When you create a Worker, specify which device to use for model inference:

public ModelAsset model;

// Load the model and run inference on the GPU.
Worker worker = new Worker(ModelLoader.Load(model), DeviceType.GPU);
Fields
| Name | Description |
| --- | --- |
| CPU | Use the CPU to run model inference. |
| GPU | Use the GPU to run model inference. |
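As a sketch of how the remarks above could inform device selection at runtime, the snippet below falls back to the CPU when the platform lacks compute shader support. The `DeviceTypeExample` class name and the `SystemInfo.supportsComputeShaders` check are illustrative choices, not part of the Inference Engine API; the Worker constructor and Dispose call follow the example above.

```csharp
using UnityEngine;
using Unity.InferenceEngine;

public class DeviceTypeExample : MonoBehaviour
{
    // Assigned in the Inspector.
    public ModelAsset model;

    Worker worker;

    void Start()
    {
        // Prefer the GPU, but fall back to the CPU on platforms
        // without compute shader support.
        var device = SystemInfo.supportsComputeShaders
            ? DeviceType.GPU
            : DeviceType.CPU;

        worker = new Worker(ModelLoader.Load(model), device);
    }

    void OnDestroy()
    {
        // Workers hold native resources and should be disposed.
        worker?.Dispose();
    }
}
```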