    Enum DeviceType

    Types of devices that Inference Engine uses to run inference on a neural network.

    Namespace: Unity.InferenceEngine
    Assembly: Unity.InferenceEngine.dll
    Syntax
    [MovedFrom("Unity.Sentis")]
    public enum DeviceType
    Remarks

    Inference Engine can run inference on the GPU or the CPU. Performance depends on the size of the model and the type of problem being solved. Smaller models may run faster on the CPU.

    Examples

    When creating a Worker, specify which device to use for model inference.

    using Unity.InferenceEngine;

    public ModelAsset model;

    // Create a worker that runs the loaded model on the GPU.
    Worker worker = new Worker(ModelLoader.Load(model), DeviceType.GPU);

    Fields

    Name    Description
    CPU     Use CPU to run model inference.
    GPU     Use GPU to run model inference.
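
    The choice of device can also be made at runtime. The sketch below, assuming a standard MonoBehaviour setup, falls back to the CPU on platforms without compute shader support; the SystemInfo.supportsComputeShaders check is a UnityEngine API, not part of Inference Engine.

    using Unity.InferenceEngine;
    using UnityEngine;

    public class RunModel : MonoBehaviour
    {
        public ModelAsset modelAsset;
        Worker worker;

        void Start()
        {
            // Prefer the GPU, but fall back to CPU where compute shaders are unavailable.
            var device = SystemInfo.supportsComputeShaders ? DeviceType.GPU : DeviceType.CPU;
            worker = new Worker(ModelLoader.Load(modelAsset), device);
        }

        void OnDestroy()
        {
            // The worker owns device resources and should be disposed when no longer needed.
            worker?.Dispose();
        }
    }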

    Copyright © 2025 Unity Technologies — Trademarks and terms of use