    Inference Engine overview

    Inference Engine is a neural network inference library for Unity. It lets you import trained neural network models into Unity and run them in real time using your target device’s compute resources, such as the central processing unit (CPU) or graphics processing unit (GPU).
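
    For example, a typical setup is to reference an imported model asset in a script, convert it to a runtime model, and create a worker that runs the model on a chosen backend. The sketch below is a minimal illustration; the ModelAsset, ModelLoader, Worker, and BackendType names are assumptions based on the package’s C# API, and the class and field names are placeholders.

    ```csharp
    using UnityEngine;
    using Unity.InferenceEngine;

    public class LoadModelExample : MonoBehaviour
    {
        // Assign an imported model asset (for example, a .onnx file) in the Inspector.
        [SerializeField] ModelAsset modelAsset;

        Worker worker;

        void Start()
        {
            // Convert the imported asset into a runtime model.
            Model runtimeModel = ModelLoader.Load(modelAsset);

            // Run the model on the GPU; use BackendType.CPU to run on the CPU instead.
            worker = new Worker(runtimeModel, BackendType.GPUCompute);
        }

        void OnDestroy()
        {
            // Workers hold native and GPU resources, so dispose of them when finished.
            worker?.Dispose();
        }
    }
    ```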

    Inference Engine supports real-time applications across all Unity-supported platforms.

    The package is officially released and available to all Unity users through the Package Manager.

    Tip

    Prior experience with machine learning frameworks such as TensorFlow or PyTorch isn’t required, but it can make it easier to understand how to work with models in Inference Engine.

    The documentation is organized into the following sections:

    • Get started: Learn how to install Inference Engine, explore sample projects, and understand the Inference Engine workflow.
    • Create a model: Create a runtime model by importing an ONNX model file or using the Inference Engine model API.
    • Run a model: Create input data for a model, create an engine to run the model, and get output (see the sketch after this list).
    • Use Tensors: Learn how to get, set, and modify input and output data.
    • Profile a model: Use Unity tools to profile the speed and performance of a model.
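
    As a rough end-to-end illustration of the Run a model and Use Tensors steps, the following sketch creates an input tensor from a float array, schedules the model on a worker, and reads the output back to the CPU. The tensor shape and values are placeholders, and the Tensor<float>, Schedule, PeekOutput, and DownloadToArray names are assumptions based on the package’s C# API; match the details to your own model.

    ```csharp
    using UnityEngine;
    using Unity.InferenceEngine;

    public class RunModelExample : MonoBehaviour
    {
        // Assign an imported model asset in the Inspector.
        [SerializeField] ModelAsset modelAsset;

        Worker worker;

        void Start()
        {
            worker = new Worker(ModelLoader.Load(modelAsset), BackendType.GPUCompute);

            // Create input data as a tensor. Use the shape and values your model expects.
            using Tensor<float> input =
                new Tensor<float>(new TensorShape(1, 4), new float[] { 0.1f, 0.2f, 0.3f, 0.4f });

            // Schedule the model for execution with the input.
            worker.Schedule(input);

            // Peek at the output tensor and copy its data back to the CPU as a flat array.
            var output = worker.PeekOutput() as Tensor<float>;
            float[] results = output.DownloadToArray();
            Debug.Log($"Output values: {results.Length}");
        }

        void OnDestroy()
        {
            worker?.Dispose();
        }
    }
    ```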

    Supported platforms

    Inference Engine supports all Unity runtime platforms.

    Performance might vary based on:

    • Model operators and complexity

    • Hardware and software platform constraints of your device

    • Type of engine used (see the sketch after this list)

      For more information, refer to Models and Create an engine.
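
    Regarding the last factor, the engine type is chosen when you create a worker. As one possible heuristic (not a recommendation from the package), a script could prefer the GPU compute backend when the device supports compute shaders and fall back to the CPU backend otherwise; the BackendType values shown are assumptions based on the package’s C# API.

    ```csharp
    using UnityEngine;
    using Unity.InferenceEngine;

    public static class BackendSelector
    {
        // Pick an engine type from the device's capabilities.
        // This is one possible heuristic, not a rule from the package.
        public static BackendType Choose()
        {
            // Prefer GPU compute when compute shaders are supported; otherwise use the CPU.
            return SystemInfo.supportsComputeShaders
                ? BackendType.GPUCompute
                : BackendType.CPU;
        }
    }
    ```

    A worker created with new Worker(runtimeModel, BackendSelector.Choose()) then runs on whichever backend the device supports.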

    Supported model types

    Inference Engine supports most models in Open Neural Network Exchange (ONNX) format with an opset version between 7 and 15. For more information, refer to Supported models and Supported ONNX operators.

    Places to find pre-trained models

    You can find pre-trained models from various sources. The models might be available in ONNX format, or in a format that you can convert to ONNX. Examples include the following:

    • Hugging Face
    • Kaggle Models (formerly TensorFlow Hub)
    • PyTorch Hub
    • Model Zoo
    • XetData
    • Meta Research

    If you want to train your own models, refer to the following links:

    • Google Colab
    • Kaggle

    Additional resources

    • Sample scripts
    • Unity Discussions group
    • Understand the Inference Engine workflow
    • Understand models in Inference Engine
    • Tensor fundamentals in Inference Engine
    • The AI menu in Unity Editor
    • Unity Dashboard AI settings