Native plug-ins in Unity can receive callbacks when certain events happen. You can use this to implement low-level rendering in your plug-in so it can work with Unity’s multithreaded rendering.
To handle main Unity events, a plug-in must export UnityPluginLoad and UnityPluginUnload functions. The IUnityInterfaces argument, which you can find in IUnityInterface.h in the plug-in API, enables the plug-in to access Unity interfaces:
#include "IUnityInterface.h"
#include "IUnityGraphics.h"
// Unity plugin load event
extern "C" void UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API
    UnityPluginLoad(IUnityInterfaces* unityInterfaces)
{
    IUnityGraphics* graphics = unityInterfaces->Get<IUnityGraphics>();
}
Use the IUnityGraphics interface, which you can find in IUnityGraphics.h, to give a plug-in access to generic graphics device functionality. The following code demonstrates how you can use the IUnityGraphics interface to register a callback:
#include "IUnityInterface.h"
#include "IUnityGraphics.h"
static IUnityInterfaces* s_UnityInterfaces = NULL;
static IUnityGraphics* s_Graphics = NULL;
static UnityGfxRenderer s_RendererType = kUnityGfxRendererNull;
// Unity plugin load event
extern "C" void UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API
UnityPluginLoad(IUnityInterfaces* unityInterfaces)
{
s_UnityInterfaces = unityInterfaces;
s_Graphics = unityInterfaces->Get<IUnityGraphics>();
s_Graphics->RegisterDeviceEventCallback(OnGraphicsDeviceEvent);
// Run OnGraphicsDeviceEvent(initialize) manually on plugin load
// to not miss the event in case the graphics device is already initialized
OnGraphicsDeviceEvent(kUnityGfxDeviceEventInitialize);
}
// Unity plugin unload event
extern "C" void UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API
UnityPluginUnload()
{
s_Graphics->UnregisterDeviceEventCallback(OnGraphicsDeviceEvent);
}
static void UNITY_INTERFACE_API
OnGraphicsDeviceEvent(UnityGfxDeviceEventType eventType)
{
switch (eventType)
{
case kUnityGfxDeviceEventInitialize:
{
s_RendererType = s_Graphics->GetRenderer();
//TODO: user initialization code on graphics device initialization.
For example, D3D11 resource creation.
break;
}
case kUnityGfxDeviceEventShutdown:
{
s_RendererType = kUnityGfxRendererNull;
//TODO: user graphics API code to call on graphics device shutdown.
break;
}
case kUnityGfxDeviceEventBeforeReset:
{
//TODO: user graphics API code to call before graphics device reset.
break;
}
case kUnityGfxDeviceEventAfterReset:
{
//TODO: user graphics API code to call after graphics device reset.
break;
}
};
}
You can use multithreading to render in Unity, if the platform and the number of available CPUs allow for it.
Note: When you use multithreaded rendering, the rendering API commands happen on a thread that’s completely separate from the thread that runs MonoBehaviour scripts. The communication between the main thread and the render thread means that your plug-in might not start rendering immediately, depending on how much work the main thread has pushed to the render thread.
To render from the plug-in, call GL.IssuePluginEvent from your managed plug-in script. This causes Unity’s rendering pipeline to call the native function from the render thread, as demonstrated in the code example below. For example, if you call GL.IssuePluginEvent from the Camera’s OnPostRender function, Unity calls the plug-in callback immediately after the camera has finished rendering.
Native plugin code:
#include "IUnityInterface.h"
#include "IUnityGraphics.h"

// Plugin function to handle a specific rendering event
static void UNITY_INTERFACE_API OnRenderEvent(int eventID)
{
    // User rendering code
}

// Freely defined function to pass a callback to plugin-specific scripts
extern "C" UnityRenderingEvent UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API
    GetRenderEventFunc()
{
    return OnRenderEvent;
}
Managed plug-in code, for example in a MonoBehaviour attached to the camera:
using System.Runtime.InteropServices;
using UnityEngine;

public class UseRenderEvent : MonoBehaviour
{
#if UNITY_IPHONE && !UNITY_EDITOR
    [DllImport("__Internal")]
#else
    [DllImport("RenderingPlugin")]
#endif
    private static extern System.IntPtr GetRenderEventFunc();

    // Queue a specific callback to be called on the render thread
    private void OnPostRender()
    {
        GL.IssuePluginEvent(GetRenderEventFunc(), 1);
    }
}
The signature for the UnityRenderingEvent callback is provided in IUnityGraphics.h in the Native Rendering Plugin sample.
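For reference, the callback type in IUnityGraphics.h is declared along these lines: a plain C function pointer that receives the event ID you pass to GL.IssuePluginEvent.
typedef void (UNITY_INTERFACE_API * UnityRenderingEvent)(int eventId);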
There are two kinds of OpenGL objects: objects that are shared across OpenGL contexts (such as textures, buffers, renderbuffers, programs, and shaders) and per-context objects (such as vertex arrays, framebuffers, and pipelines).
Unity uses multiple OpenGL contexts. When initializing and closing the Editor and the Player, Unity relies on a master context, but when rendering it uses dedicated contexts. This means you can’t create per-context objects during the kUnityGfxDeviceEventInitialize and kUnityGfxDeviceEventShutdown events.
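As an illustrative sketch (not part of the Unity sample), a plug-in that needs per-context OpenGL objects can create them lazily in its rendering-event callback, the OnRenderEvent function from the earlier example, which runs on the render thread with a rendering context current, rather than in OnGraphicsDeviceEvent. The lazy-initialization flag below is purely illustrative:
#include "IUnityInterface.h"
#include "IUnityGraphics.h"

static void UNITY_INTERFACE_API OnRenderEvent(int eventID)
{
    // A rendering context is current here, so per-context objects
    // (vertex arrays, framebuffers, and so on) can be created on first use.
    static bool s_PerContextObjectsCreated = false;
    if (!s_PerContextObjectsCreated)
    {
        // For example, glGenVertexArrays / glGenFramebuffers via your GL loader.
        s_PerContextObjectsCreated = true;
    }
    // ... issue draw calls that use the per-context objects ...
}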