Understand the Sentis workflow
To use Sentis to run a neural network in Unity, follow these steps:
- Use the Unity.Sentis namespace.
- Load a neural network model file.
- Create input for the model.
- Create an inference engine (a worker).
- Run the model with the input to infer a result.
- Get the result.
Use the Unity.Sentis namespace
Add the following to the top of your script:
using Unity.Sentis;
Load a model
Sentis can import model files in Open Neural Network Exchange (ONNX) format. To load a model, follow these steps:
- Export a model to ONNX format from a machine learning framework, or download an ONNX model from the internet.
- Add the model file to the Assets folder of the Project window.
- Create a runtime model in your script.
ModelAsset modelAsset = Resources.Load("model-file-in-assets-folder") as ModelAsset;
Model runtimeModel = ModelLoader.Load(modelAsset);
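Alternatively, as the example at the end of this page does, you can declare a public ModelAsset field, assign the model asset in the Inspector, and load it in Start. A minimal sketch (the class name LoadModelExample is a placeholder):
using UnityEngine;
using Unity.Sentis;

public class LoadModelExample : MonoBehaviour
{
    // Assign the imported ONNX asset in the Inspector
    public ModelAsset modelAsset;
    Model runtimeModel;

    void Start()
    {
        // Create the runtime model from the asset
        runtimeModel = ModelLoader.Load(modelAsset);
    }
}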
Refer to Import a model file for more information.
Create input for the model
Use the Tensor API to create a tensor with data for the model. You can convert an array or a texture to a tensor. For example:
// Convert a texture to a tensor
Texture2D inputTexture = Resources.Load("image-file") as Texture2D;
TensorFloat inputTensor = TextureConverter.ToTensor(inputTexture);
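To create a tensor from an array of values instead, pass a shape and the data. A minimal sketch, assuming a model that takes a 1 × 4 tensor of floats (the shape and values here are placeholders):
// Create a 1x4 tensor from an array of values
float[] data = new float[] { 0.1f, 0.2f, 0.3f, 0.4f };
TensorFloat arrayTensor = new TensorFloat(new TensorShape(1, 4), data);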
Refer to Create input for a model for more information.
Create an inference engine (a worker)
In Sentis, a worker is the inference engine. You create a worker to break down the model into executable tasks, run the tasks on the GPU or CPU, and output the result.
For example, the following creates a worker that runs on the GPU using Sentis compute shaders:
IWorker worker = WorkerFactory.CreateWorker(BackendType.GPUCompute, runtimeModel);
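To run the model on the CPU instead, pass a CPU backend type. A minimal sketch; choose the backend that suits your target platform and model:
// Create a worker that runs the model on the CPU
IWorker cpuWorker = WorkerFactory.CreateWorker(BackendType.CPU, runtimeModel);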
Refer to Create an engine for more information.
Run the model
To run the model, use the Execute method of the worker object with the input tensor. For example:
worker.Execute(inputTensor);
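If your model has more than one input, you can pass the worker a dictionary that maps each input name to a tensor. A minimal sketch; "input_0" is a placeholder for whatever your model's input is actually called:
// Requires: using System.Collections.Generic;
var inputs = new Dictionary<string, Tensor>
{
    { "input_0", inputTensor }
};
worker.Execute(inputs);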
Refer to Run a model for more information.
Get the output
You can use methods such as PeekOutput to get the output data from the model. For example:
TensorFloat outputTensor = worker.PeekOutput() as TensorFloat;
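If the model has several outputs, you can pass PeekOutput the name of the output you want. Move the tensor data to the CPU before you read it, as the example at the end of this page does. A small sketch; "output_0" is a placeholder output name:
// Get a named output and read its values on the CPU
TensorFloat namedOutput = worker.PeekOutput("output_0") as TensorFloat;
namedOutput.MakeReadable();
float[] values = namedOutput.ToReadOnlyArray();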
Refer to Get output from a model for more information.
Example
The following example classifies a handwritten digit.
Follow these steps:
- Attach the following script to a GameObject in your scene.
using UnityEngine;
using Unity.Sentis;
using Unity.Sentis.Layers;

public class ClassifyHandwrittenDigit : MonoBehaviour
{
    public Texture2D inputTexture;
    public ModelAsset modelAsset;
    Model runtimeModel;
    IWorker worker;
    public float[] results;

    void Start()
    {
        // Create the runtime model
        runtimeModel = ModelLoader.Load(modelAsset);

        // Add softmax layer to end of model instead of non-softmaxed output
        string softmaxOutputName = "Softmax_Output";
        runtimeModel.AddLayer(new Softmax(softmaxOutputName, runtimeModel.outputs[0]));
        runtimeModel.outputs[0] = softmaxOutputName;

        // Create input data as a tensor
        using Tensor inputTensor = TextureConverter.ToTensor(inputTexture, width: 28, height: 28, channels: 1);

        // Create an engine
        worker = WorkerFactory.CreateWorker(BackendType.GPUCompute, runtimeModel);

        // Run the model with the input data
        worker.Execute(inputTensor);

        // Get the result
        using TensorFloat outputTensor = worker.PeekOutput() as TensorFloat;

        // Move the tensor data to the CPU before reading it
        outputTensor.MakeReadable();

        results = outputTensor.ToReadOnlyArray();
    }

    void OnDisable()
    {
        // Tell the GPU we're finished with the memory the engine used
        worker.Dispose();
    }
}
- Download a handwriting recognition ONNX model file, for example the MNIST Handwritten Digit Recognition model mnist-8.onnx from the ONNX Model Zoo, and drag it into the Assets folder of the Project window.
- Drag the model asset into the modelAsset field in the Inspector window of the GameObject.
- Download the digit.png image below and drag it into the Assets folder of the Project window. Set Non-Power of 2 to None in the import settings and click Apply.
- Drag the digit asset into the inputTexture field in the Inspector window of the GameObject.
- Click Play. In the Inspector window of the GameObject, each item of the results array shows how highly the model predicts the image is a digit. For example, item 0 of the array is how highly the model predicts the image is a handwritten zero.
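To turn the results array into a single predicted digit, you can take the index of the highest score. A small sketch over the script's public results field:
// Find the index of the highest score; that index is the predicted digit
int predictedDigit = 0;
float highestScore = float.MinValue;
for (int i = 0; i < results.Length; i++)
{
    if (results[i] > highestScore)
    {
        highestScore = results[i];
        predictedDigit = i;
    }
}
Debug.Log($"Predicted digit: {predictedDigit}");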