URP, HDRP, ShaderGraph, Surface shaders, and built-in shaders already support single-pass stereo instanced rendering. However, shaders from the Asset Store, from other third parties, or those that you have written yourself might need to be updated.
For more information about supporting instanced rendering in your shaders, see GPU Instancing. The information in this section specifically talks about stereo rendering and might not include all changes you need to make to support instanced rendering in general.
Add the UNITY_VERTEX_INPUT_INSTANCE_ID macro to the appdata struct.
Example:
struct appdata
{
    float4 vertex : POSITION;
    float2 uv : TEXCOORD0;
    UNITY_VERTEX_INPUT_INSTANCE_ID //Insert
};
Add the UNITY_VERTEX_OUTPUT_STEREO macro to the v2f output struct.
Example:
struct v2f
{
    float2 uv : TEXCOORD0;
    float4 vertex : SV_POSITION;
    UNITY_VERTEX_OUTPUT_STEREO //Insert
};
Add the following macros to the beginning of your main vert method (in order):
UNITY_SETUP_INSTANCE_ID()
UNITY_INITIALIZE_OUTPUT(v2f, o)
UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO()
UNITY_SETUP_INSTANCE_ID() calculates and sets the built-in unity_StereoEyeIndex and unity_InstanceID shader variables to the correct values based on which eye the GPU is currently rendering.
UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO tells the GPU which eye in the texture array it should render to, based on the value of unity_StereoEyeIndex. This macro also transfers the value of unity_StereoEyeIndex from the vertex shader so that it will be accessible in the fragment shader, but only if UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX is called in the fragment shader frag method.
UNITY_INITIALIZE_OUTPUT(v2f, o) initializes all v2f values to 0.
Example:
v2f vert (appdata v)
{
    v2f o;
    UNITY_SETUP_INSTANCE_ID(v); //Insert
    UNITY_INITIALIZE_OUTPUT(v2f, o); //Insert
    UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO(o); //Insert
    o.vertex = UnityObjectToClipPos(v.vertex);
    o.uv = v.uv;
    return o;
}
If you want your Post-Processing shaders to support single-pass stereo instancing, follow the steps for custom shaders as well as the steps below.
Note: You can download all Unity base shader scripts from the Unity website.
Do the following for each Post-Processing shader that you want to support single-pass instancing:
Add the UNITY_DECLARE_SCREENSPACE_TEXTURE(tex) macro outside the frag method (see the example below for placement) in your Shader script, so that when you use a particular stereo rendering method the GPU uses the appropriate texture sampler. For example, if you use multi-pass rendering, the GPU uses a texture 2D sampler. For single-pass instancing or multi-view rendering, the texture sampler is a texture array.
Add UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(i) at the beginning of the fragment shader frag method (see the example below for placement). You only need to add this macro if you want to use the unity_StereoEyeIndex built-in shader variable to find out which eye the GPU is rendering to. This is useful when testing post-processing effects.
Use the UNITY_SAMPLE_SCREENSPACE_TEXTURE() macro when sampling 2D textures (see the example below). Standard shaders use a 2D texture-based back buffer to sample textures. Single-pass stereo instancing does not use this type of back buffer, so if you do not specify a different method for 2D texture sampling, your shader does not render correctly. To prevent rendering issues, the UNITY_SAMPLE_SCREENSPACE_TEXTURE() macro detects which stereo rendering path you are using and then automatically samples the texture in the correct manner. See the Unity documentation on HLSLSupport.cginc to learn more about similar macros used for depth textures and screen-space shadow maps.
Example:
UNITY_DECLARE_SCREENSPACE_TEXTURE(_MainTex); //Insert
fixed4 frag (v2f i) : SV_Target
{
    UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(i); //Insert
    fixed4 col = UNITY_SAMPLE_SCREENSPACE_TEXTURE(_MainTex, i.uv); //Insert
    // just invert the colors
    col = 1 - col;
    return col;
}
Below is a simple example of the template image effect shader with all of the previously mentioned changes applied to support single-pass stereo instancing. The lines added to the shader code are marked with the comment //Insert.
struct appdata
{
    float4 vertex : POSITION;
    float2 uv : TEXCOORD0;
    UNITY_VERTEX_INPUT_INSTANCE_ID //Insert
};

//v2f output struct
struct v2f
{
    float2 uv : TEXCOORD0;
    float4 vertex : SV_POSITION;
    UNITY_VERTEX_OUTPUT_STEREO //Insert
};

v2f vert (appdata v)
{
    v2f o;
    UNITY_SETUP_INSTANCE_ID(v); //Insert
    UNITY_INITIALIZE_OUTPUT(v2f, o); //Insert
    UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO(o); //Insert
    o.vertex = UnityObjectToClipPos(v.vertex);
    o.uv = v.uv;
    return o;
}

UNITY_DECLARE_SCREENSPACE_TEXTURE(_MainTex); //Insert

fixed4 frag (v2f i) : SV_Target
{
    UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(i); //Insert
    fixed4 col = UNITY_SAMPLE_SCREENSPACE_TEXTURE(_MainTex, i.uv); //Insert
    // invert the colors
    col = 1 - col;
    return col;
}
If you use the Graphics.DrawProceduralIndirect() and CommandBuffer.DrawProceduralIndirect() methods to draw fully procedural geometry on the GPU, note that both methods receive their arguments from a compute buffer. This means that it is difficult to increase the instance count at run time. To increase the instance count, you must manually double the instance count contained in your compute buffers.
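For example, the following C# sketch (not taken from the Unity API reference; the class, fields, and counts are illustrative placeholders) shows one way you might double the instance count in the indirect arguments buffer when single-pass instanced rendering is active. It assumes the standard four-integer argument layout that DrawProceduralIndirect reads: vertex count per instance, instance count, start vertex location, and start instance location.

using UnityEngine;
using UnityEngine.XR;

// Minimal sketch only: the class name, material field, and counts below are
// placeholders, not part of the Unity API.
public class StereoIndirectDrawExample : MonoBehaviour
{
    public Material proceduralMaterial;      // material whose shader generates the geometry
    public int vertexCountPerInstance = 36;
    public int instanceCount = 100;          // the number of instances you actually want drawn

    ComputeBuffer argsBuffer;

    void Start()
    {
        // Argument layout read by DrawProceduralIndirect:
        // [0] vertex count per instance, [1] instance count,
        // [2] start vertex location,     [3] start instance location.
        uint[] args = { (uint)vertexCountPerInstance, (uint)instanceCount, 0, 0 };

        // With single-pass instanced rendering, each instance is drawn once per eye,
        // so the instance count stored in the buffer must be doubled manually.
        if (XRSettings.enabled &&
            XRSettings.stereoRenderingMode == XRSettings.StereoRenderingMode.SinglePassInstanced)
        {
            args[1] *= 2;
        }

        argsBuffer = new ComputeBuffer(args.Length, sizeof(uint), ComputeBufferType.IndirectArguments);
        argsBuffer.SetData(args);
    }

    void Update()
    {
        // The bounds are a rough placeholder; size them to enclose your procedural geometry.
        Graphics.DrawProceduralIndirect(proceduralMaterial,
            new Bounds(Vector3.zero, Vector3.one * 100f),
            MeshTopology.Triangles, argsBuffer);
    }

    void OnDestroy()
    {
        if (argsBuffer != null)
        {
            argsBuffer.Release();
        }
    }
}

On the shader side, the UNITY_SETUP_INSTANCE_ID macro described earlier is what maps the doubled instance range back to the correct unity_InstanceID and unity_StereoEyeIndex values, so no additional shader changes are needed beyond those listed above.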
The following shader code renders a GameObject as green for a user's left eye and red for their right eye. This shader is useful for debugging your stereo rendering, because it allows you to verify that stereo graphics are working correctly.
Shader "XR/StereoEyeIndexColor"
{
Properties
{
_LeftEyeColor("Left Eye Color", COLOR) = (0,1,0,1)
_RightEyeColor("Right Eye Color", COLOR) = (1,0,0,1)
}
SubShader
{
Tags { "RenderType" = "Opaque" }
Pass
{
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
float4 _LeftEyeColor;
float4 _RightEyeColor;
#include "UnityCG.cginc"
struct appdata
{
float4 vertex : POSITION;
UNITY_VERTEX_INPUT_INSTANCE_ID
};
struct v2f
{
float4 vertex : SV_POSITION;
UNITY_VERTEX_INPUT_INSTANCE_ID
UNITY_VERTEX_OUTPUT_STEREO
};
v2f vert (appdata v)
{
v2f o;
UNITY_SETUP_INSTANCE_ID(v);
UNITY_INITIALIZE_OUTPUT(v2f, o);
UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO(o);
o.vertex = UnityObjectToClipPos(v.vertex);
return o;
}
fixed4 frag (v2f i) : SV_Target
{
UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(i);
return lerp(_LeftEyeColor, _RightEyeColor, unity_StereoEyeIndex);
}
ENDCG
}
}
}
ShaderGraph automatically adds the macros required to support single-pass stereo rendering. To implement the debug shader in ShaderGraph, you can use a Custom Function Node that sets the base color based on the eye index. Use the unity_StereoEyeIndex shader variable to determine the base color depending on which eye instance is being rendered. The Custom Function Node contains the following code:
Out = lerp(LeftColor, RightColor, unity_StereoEyeIndex);