This tutorial will teach you the basics of how to write vertex and fragment programs in Unity shaders. For a basic introduction to ShaderLab see the Getting Started tutorial. If you want to write shaders that interact with lighting, read about Surface Shaders instead.
Let’s start with a small recap of the general structure of a shader:
Shader "MyShaderName"
{
Properties
{
// material properties here
}
SubShader // subshader for graphics hardware A
{
Pass
{
// pass commands ...
}
// more passes if needed
}
// more subshaders if needed
FallBack "VertexLit" // optional fallback
}
Here at the end we introduce a new command: FallBack “VertexLit”. The Fallback command can be used at the end of the shader; it tells Unity which shader to use if none of the SubShaders in the current shader can run on the user’s graphics hardware. The effect is the same as including all SubShaders from the fallback shader at the end. For example, if you were to write a fancy normal-mapped shader, then instead of writing a very basic non-normal-mapped subshader for old graphics cards you can just fall back to the built-in VertexLit shader.
The basic building blocks of the shader are introduced in the first shader tutorial, and full documentation of Properties, SubShaders and Passes is also available.
A quick way of building SubShaders is to use passes defined in other shaders. The command UsePass does just that, so you can reuse shader code in a neat fashion. As an example, the following command uses the pass with the name “FORWARD” from the built-in Specular shader: UsePass “Specular/FORWARD”.
In order for UsePass to work, a name must be given to the pass one wishes to use. The Name command inside the pass gives it a name: Name “MyPassName”.
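Putting the two together, a hypothetical pair of shaders could share a pass like this (a minimal sketch; the shader and pass names are made up for illustration):
Shader "Examples/SourceShader" {
    SubShader {
        Pass {
            Name "MYPASSNAME"
            // ... pass state and programs ...
        }
    }
}

Shader "Examples/ReusingShader" {
    SubShader {
        UsePass "Examples/SourceShader/MYPASSNAME"
    }
}
Note that Unity turns pass names into uppercase internally, so UsePass must reference the name in uppercase (which is also why the built-in pass above is referenced as “Specular/FORWARD”).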
We described a pass that used just a single texture combine instruction in the first tutorial. Now it is time to demonstrate how we can use vertex and fragment programs in our pass.
When you use vertex and fragment programs (the so-called “programmable pipeline”), most of the hardcoded functionality (the “fixed function pipeline”) in the graphics hardware is switched off. For example, using a vertex program turns off standard 3D transformations, lighting and texture coordinate generation completely. Similarly, using a fragment program replaces any texture combine modes that would be defined in SetTexture commands; thus SetTexture commands are not needed.
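For context, a fixed-function pass of the kind shown in the first tutorial might look roughly like this (a sketch; the _MainTex property name is assumed to be declared in the shader’s Properties block):
Pass {
    // fixed function texturing: multiply the texture by the primary (lighting) color
    SetTexture [_MainTex] { combine texture * primary }
}
Once a fragment program is used in the pass, this combine step is expressed in code instead.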
Writing vertex/fragment programs requires a thorough knowledge of 3D transformations, lighting and coordinate spaces - because you have to rewrite the fixed functionality that is built into APIs like OpenGL yourself. On the other hand, you can do much more than what’s built in!
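For example, even the object-to-clip-space transform that the fixed pipeline performed for you must now be written in the vertex program. A minimal sketch (this would sit inside a CGPROGRAM block that includes UnityCG.cginc, as in the examples below):
// the bare minimum a vertex program has to do:
// transform the vertex from object space to clip space
float4 vert (float4 vertex : POSITION) : SV_POSITION
{
    return UnityObjectToClipPos(vertex);
}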
Shaders in ShaderLab are usually written in the Cg/HLSL programming language. Cg and DX9-style HLSL are for all practical purposes one and the same language, so we’ll be using Cg and HLSL interchangeably (see this page for details).
Shader code is written by embedding “Cg/HLSL snippets” in the shader text. Snippets are compiled into low-level shader assembly by the Unity editor, and the final shader that is included in your game’s data files only contains this low-level assembly or bytecode, which is platform specific. When you select a shader in the Project View, the Inspector has a button to show the compiled shader code, which might help as a debugging aid. Unity automatically compiles Cg snippets for all relevant platforms (Direct3D 9, OpenGL, Direct3D 11, OpenGL ES and so on). Note that because Cg/HLSL code is compiled by the editor, you can’t create shaders from scripts at runtime.
In general, snippets are placed inside Pass blocks. They look like this:
Pass {
    // ... the usual pass state setup ...
    CGPROGRAM
    // compilation directives for this snippet, e.g.:
    #pragma vertex vert
    #pragma fragment frag
    // the Cg/HLSL code itself
    ENDCG
    // ... the rest of pass setup ...
}
The following example demonstrates a complete shader that renders object normals as colors:
Shader "Tutorial/Display Normals" {
SubShader {
Pass {
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
#include "UnityCG.cginc"
struct v2f {
float4 pos : SV_POSITION;
fixed3 color : COLOR0;
};
v2f vert (appdata_base v)
{
v2f o;
o.pos = UnityObjectToClipPos(v.vertex);
o.color = v.normal * 0.5 + 0.5;
return o;
}
fixed4 frag (v2f i) : SV_Target
{
return fixed4 (i.color, 1);
}
ENDCG
}
}
}
When applied to an object, this shader renders the object’s normals as colors.
Our “Display Normals” shader has no properties and contains a single SubShader with a single Pass that is empty except for the Cg/HLSL code. Let’s dissect the code part by part:
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
// ...
ENDCG
The whole snippet is written between the CGPROGRAM and ENDCG keywords. At the start, compilation directives are given as #pragma statements: #pragma vertex vert declares that the vert function is the vertex program, and #pragma fragment frag declares that the frag function is the fragment program.
Following the compilation directives is just plain Cg/HLSL code. We start by including a built-in include file:
#include "UnityCG.cginc"
The UnityCG.cginc file contains commonly used declarations and functions so that shaders can be kept smaller (see the shader include files page for details). Here we’ll use the appdata_base structure from that file. We could, of course, define the vertex inputs directly in the shader instead of including the file.
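For reference, appdata_base as declared in UnityCG.cginc looks roughly like this (a simplified sketch; the exact definition can vary between Unity versions):
struct appdata_base {
    float4 vertex : POSITION;    // vertex position in object space
    float3 normal : NORMAL;      // vertex normal in object space
    float4 texcoord : TEXCOORD0; // first UV coordinate
};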
Next we define a “vertex to fragment” structure (here named v2f) - what information is passed from the vertex to the fragment program. We pass the position and color parameters. The color will be computed in the vertex program and just output in the fragment program.
We proceed by defining the vertex program - the vert function. Here we compute the position and output the input normal as a color: o.color = v.normal * 0.5 + 0.5;
Normal components are in the –1..1 range, while colors are in the 0..1 range, so we scale and bias the normal in the code above. Next we define the fragment program - the frag function - which just outputs the calculated color and 1 as the alpha component:
fixed4 frag (v2f i) : SV_Target
{
return fixed4 (i.color, 1);
}
That’s it, our shader is finished! Even this simple shader is very useful to visualize mesh normals.
Of course, this shader does not respond to lights at all, and that’s where things get a bit more interesting; read about Surface Shaders for details.
When you define properties in the shader, you give them a name like _Color or _MainTex. To use them in Cg/HLSL you just have to define a variable of a matching name and type. See the properties in shader programs page for details.
Here is a complete shader that displays a texture modulated by a color. Of course, you could easily do the same in a texture combiner call, but the point here is just to show how to use properties in Cg:
Shader "Tutorial/Textured Colored" {
Properties {
_Color ("Main Color", Color) = (1,1,1,0.5)
_MainTex ("Texture", 2D) = "white" { }
}
SubShader {
Pass {
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
#include "UnityCG.cginc"
fixed4 _Color;
sampler2D _MainTex;
struct v2f {
float4 pos : SV_POSITION;
float2 uv : TEXCOORD0;
};
float4 _MainTex_ST;
v2f vert (appdata_base v)
{
v2f o;
o.pos = UnityObjectToClipPos(v.vertex);
o.uv = TRANSFORM_TEX (v.texcoord, _MainTex);
return o;
}
fixed4 frag (v2f i) : SV_Target
{
fixed4 texcol = tex2D (_MainTex, i.uv);
return texcol * _Color;
}
ENDCG
}
}
}
The structure of this shader is the same as in the previous example. Here we define two properties, namely _Color and _MainTex. Inside Cg/HLSL code we define corresponding variables:
fixed4 _Color;
sampler2D _MainTex;
See Accessing Shader Properties in Cg/HLSL for more information.
The vertex and fragment programs here don’t do anything fancy; the vertex program uses the TRANSFORM_TEX macro from UnityCG.cginc to make sure texture scale and offset are applied correctly, and the fragment program just samples the texture and multiplies it by the color property.
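For reference, TRANSFORM_TEX (v.texcoord, _MainTex) expands to roughly the following, using the per-material tiling and offset values that Unity supplies through the _MainTex_ST variable declared above (a sketch; the actual macro is defined in UnityCG.cginc):
// _MainTex_ST.xy holds the tiling (scale), _MainTex_ST.zw holds the offset
o.uv = v.texcoord.xy * _MainTex_ST.xy + _MainTex_ST.zw;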
We have shown how custom shader programs can be written in a few easy steps. While the examples shown here are very simple, there’s nothing preventing you from writing arbitrarily complex shader programs! This can help you take full advantage of Unity and achieve optimal rendering results.
The complete ShaderLab reference manual is here, and more examples can be found on the vertex and fragment shader examples page. We also have a forum for shaders at forum.unity3d.com, so go there to get help with your shaders! Happy programming, and enjoy the power of Unity and ShaderLab.