Vertex and Fragment Shader Examples

Scene Shader Setup

We start by setting up a basic scene in Unity to work with. The initial examples use a plane and a camera pointing at the plane to show the functionality of the fragment shaders. The setup of the plane and the camera is explained here; however, the shaders work with any geometry without major modifications.

Configuring a Unity Project

After starting Unity, you will probably be viewing an empty project. If not, you should create a new project by choosing File > New Project... from the menu. If you are not familiar with Unity's Scene View, Hierarchy View, Project View and Inspector View, now would be a good time to read the first two (or three) sections (Unity Basics and Building Scenes) of the Unity User Guide.

Creating a Mesh

The first step in this scene setup is to create a plane. Click Create > Plane in the Hierarchy View, or use one of the other methods Unity offers for creating a plane. Then position the camera so that it shows the plane: select the Main Camera in the Hierarchy View, navigate the Scene View until the plane is framed the way you want, and choose Align With View from the GameObject menu. Make sure the plane does not cover the entire screen, so that you can see the difference between rendering with a shader and without it.

Creating a Material

Create a new Material by clicking Create in the Project View and choosing Material. A new material called New Material will appear in the Project View. Change the name of the Material to something more descriptive like Material-ShaderLab.

Creating a Shader

Create a Shader in the same way as a Material, but select Shader from the menu instead. Change the name of the Shader to something more descriptive, such as Shader-ShaderLab.

Shader "Custom/NewShader" {
	Properties {
		_MainTex ("Base (RGB)", 2D) = "white" {}
	}
	SubShader {
		Tags { "RenderType" = "Opaque" }
		LOD 200

		CGPROGRAM
		#pragma surface surf Lambert

		sampler2D _MainTex;

		struct Input {
			float2 uv_MainTex;
		};

		void surf (Input IN, inout SurfaceOutput o) {
			half4 c = tex2D (_MainTex, IN.uv_MainTex);
			o.Albedo = c.rgb;
			o.Alpha = c.a;
		}
		ENDCG
	}
	FallBack "Diffuse"
}

When a new shader is created, Unity fills it with the default surface shader shown above. This text can be replaced with a vertex and fragment shader.

Linking the Mesh, Material and Shader

Link the Shader with the Material in the Inspector window of the material. You can drag and drop the shader from the Project View onto the material, or you can select the material in the Project View and then choose the shader from the Shader drop-down list in the Inspector View. In either case, you should now see a gray sphere in the Preview section of the Inspector View for the material. If it doesn't show and an error message is displayed at the bottom of the Unity window, reopen the shader and check in the editor that its text matches the text given in this lesson.

Link the new material to the plane's mesh. To do this, attach the material to the plane by dragging and dropping the material from the Project View onto the plane in the Hierarchy View. Alternatively, you can drag and drop the material from the Project View onto the plane in the Scene View. Another way is to select the plane in the Hierarchy View, locate the Mesh Renderer component in the Inspector View (open it by clicking its title if it isn't open), open the Materials setting of the Mesh Renderer by clicking it, and change the Default-Diffuse material to the new material by clicking the dotted circle icon to the right of the material name and choosing the new material from the pop-up window. Use whichever of these methods works best for you. The plane in the Scene View should now have the same color as the preview in the Inspector View of the material. Changing the shader should (after saving and switching back to Unity) change the appearance of the plane in the Scene View.

The scene setup is now complete.

Empty Shader

This section explains the four main parts of a ShaderLab shader, which you can see in the code below.

Shader "Custom/Empty" {
	SubShader {
		Pass {
			CGPROGRAM
			ENDCG
		}
	}
}
Shader
The Shader command contains a string with the name of the Shader. The name can be subdivided with the "/" character to simulate a folder structure, which simplifies reusing the Shader later. The name should be unique among all shaders.
SubShader
A Shader can contain one or more SubShaders, which are primarily used to implement the shader for GPUs with different capabilities. All the SubShaders should produce similar results, each using techniques suited to its architecture. ShaderLab translates the shader code to other architectures automatically, but in some cases the full functionality of the shader is not desired on mobile architectures, or some inputs are missing. In these lessons you will work with a single SubShader per Shader, implemented for desktop architectures; a small sketch of a shader with two SubShaders follows these definitions.
Pass
Each SubShader is composed of a number of Passes, and each Pass represents one execution of the vertex and fragment code for the object rendered with the Material of the Shader. For performance reasons, you should implement the shader with the smallest possible number of Passes.
CGPROGRAM - ENDCG
These directives delimit the shader code and define, in ShaderLab, the language used for the Shader. Unity can work with the Cg and GLSL shader languages. Cg is recommended, because Unity applies several optimization steps for the different architectures.
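As an illustration of this structure, here is a minimal sketch of a shader declaring two SubShaders, each with a single Pass (the name Custom/TwoSubShaders is only illustrative; the CGPROGRAM blocks are left empty as in the Empty shader above):

Shader "Custom/TwoSubShaders" {
	SubShader {
		// Tried first; intended for more capable (desktop) hardware.
		Pass {
			CGPROGRAM
			ENDCG
		}
	}
	SubShader {
		// Used when the first SubShader is not supported by the hardware.
		Pass {
			CGPROGRAM
			ENDCG
		}
	}
	FallBack "Diffuse"
}

Unity uses the first SubShader that runs on the user's hardware; if none of them runs, it uses the shader named in the FallBack command.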

Simple Shader

The difference between vertex and fragment shaders lies in the stage of the render pipeline at which they run. A vertex shader can be defined as a shader program that modifies the geometry of the scene and performs the 3D projection. A fragment shader operates on the render window and defines the color of each pixel. On newer graphics cards, vertex and fragment shaders can provide more functionality, such as moving vertex positions or storing more data per pixel.

Shader "Custom/SolidColor" {
	SubShader {
		Pass {
			CGPROGRAM

			#pragma vertex vert
			#pragma fragment frag

			float4 vert(float4 v:POSITION) : SV_POSITION {
				return mul (UNITY_MATRIX_MVP, v);
			}

			fixed4 frag() : COLOR {
				return fixed4(1.0,0.0,0.0,1.0);
			}

			ENDCG
		}
	}
}
#pragma vertex vert
A vertex shader is a shader program that modifies the geometry of the scene. It is executed once for each vertex in the scene, and its outputs are the projected coordinates, color, texture coordinates and other data passed on to the fragment shader. The directive #pragma vertex [function name] defines the name of the vertex function.
#pragma fragment frag
A fragment shader is a shader program that modifies image properties in the render window. It is executed for each pixel, and its output is the color information of that pixel. The directive #pragma fragment defines the name of the fragment shader function.
float4 vert(float4 v:POSITION) : SV_POSITION
The variable v contains the values of the vertex, and this function is called for each vertex of the rendered scene. The returned value is a float4 variable with the position of the vertex on the screen. Four float values are used to store an SV_POSITION coordinate in a 2D window because the projected geometry is expected in homogeneous coordinates, which use two values for the pixel position on the screen (x,y), one value for the depth (z), and one value for the homogeneous component (w). You could convert the homogeneous coordinates into 3D coordinates, but then you would lose the ability to recover the 3D position of the vertex later; you could also drop the z value, but then you would lose the ability to perform depth testing in the final rendering. Homogeneous coordinates are used by almost all GPUs. Eliminating variables from the vertex shader output does not optimize the code; it only makes the code less generic and harder to understand.
return mul (UNITY_MATRIX_MVP, v);
This is the core of the vertex function, where a 3D coordinate is projected onto the 2D window. The projection is performed by multiplying the 3D position by a matrix known as the Model-View-Projection matrix.
fixed4 frag() : COLOR
This line defines the function frag, which the fragment shader uses as its main function. In this lesson there are no input parameters for the fragment function, and the output is declared with the COLOR semantic as a fixed4 variable containing the RGBA (red, green, blue, alpha) components of the color. It is possible to use the color to visualize a float variable, or even to pass information between the vertex and fragment shaders (a sketch of this appears below); however, the COLOR output is clamped to values between 0 and 1, so values outside that range will be changed.
return fixed4(1.0,0.0,0.0,1.0);
This is the core of the fragment function: each pixel processed by the fragment shader is given a solid red color.

Shader Solid Color
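
As an example of passing information from the vertex to the fragment shader through a color, here is a minimal sketch (the shader name Custom/VertexColor and the struct name v2f are only illustrative) that visualizes the object-space vertex position as a color:

Shader "Custom/VertexColor" {
	SubShader {
		Pass {
			CGPROGRAM

			#pragma vertex vert
			#pragma fragment frag

			struct v2f {
				float4 pos : SV_POSITION;
				float4 color : COLOR0;
			};

			v2f vert(float4 v:POSITION) {
				v2f o;
				o.pos = mul (UNITY_MATRIX_MVP, v);
				// Object-space position used as a color; components outside
				// the 0-1 range are clamped in the final COLOR output.
				o.color = float4(v.xyz, 1.0);
				return o;
			}

			fixed4 frag(v2f i) : COLOR {
				return i.color;
			}

			ENDCG
		}
	}
}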

Window Coordinates

This Shader defines a static color that depends on the pixel coordinate within the window. This allows colors to be defined on the screen independently of the geometry being rendered. If you move the plane, the colors do not change on the screen; only their visibility changes, depending on the position of the geometry.

Shader "Custom/WindowCoordinates/Base" {
	SubShader {
		Pass {
			CGPROGRAM

			#pragma vertex vert
			#pragma fragment frag
			#pragma target 3.0

			#include "UnityCG.cginc"

			float4 vert(appdata_base v) : POSITION {
				return mul (UNITY_MATRIX_MVP, v.vertex);
			}

			fixed4 frag(float4 sp:WPOS) : COLOR {
				return fixed4(sp.xy/_ScreenParams.xy,0.0,1.0);
			}

			ENDCG
		}
	}
}
#pragma target 3.0
The target directive defines the hardware requirements to support the shader.
#include "UnityCG.cginc"
Includes the file "UnityCG.cginc", which provides helper structures and functions (such as appdata_base used below), in the shader code.
appdata_base v
appdata_base is a default Unity structure, defined in UnityCG.cginc, that provides the input variables of the vertex shader.
WPOS
This is the input argument of the frag function, declared with the float4 type and the WPOS semantic. Semantics are a special Cg mechanism for binding default input values of a fragment or vertex shader. WPOS fills the variable with the pixel coordinates of the rendered fragment within the window. Dividing this position by the screen size transforms the values into the range 0 to 1, so they can be mapped to the R and G components of the output color.
_ScreenParams
The screen width and height values are stored in the variable _ScreenParams, which is declared in the UnityCG.cginc file.

Shader Window Coordinates

Examples

Window coordinates are not commonly used in shaders, because shaders normally work with texture positions, as you will see in the following lessons. However, some interesting effects, such as rendering with a mask and other image effects, make use of this property.

Behind bars

Shader "Custom/WindowCoordinates/Bars" {
	SubShader {
		Pass {
			CGPROGRAM
			#pragma vertex vert
			#pragma fragment frag

			#include "UnityCG.cginc"

			struct vertOut {
				float4 pos:SV_POSITION;
				float4 scrPos;
			};

			vertOut vert(appdata_base v) {
				vertOut o;
				o.pos = mul (UNITY_MATRIX_MVP, v.vertex);
				o.scrPos = ComputeScreenPos(o.pos);
				return o;
			}

			fixed4 frag(vertOut i) : COLOR0 {
				float2 wcoord = (i.scrPos.xy/i.scrPos.w);
				fixed4 color;

				if (fmod(20.0*wcoord.x,2.0)<1.0) {
					color = fixed4(wcoord.xy,0.0,1.0);
				} else {
					color = fixed4(0.3,0.3,0.3,1.0);
				}
				return color;
			}

			ENDCG
		}
	}
}
struct vertOut
This defines a new compound type, that is, a type grouping a set of variables of native or compound types. A struct is used when more than one native value needs to be passed from the vertex shader to the fragment shader: since a function can only return a single value, a struct allows a set of values to be returned as one.
ComputeScreenPos
ComputeScreenPos is a function defined in the UnityCG.cginc file that returns the screen position for use in the fragment shader. In contrast to the previous example, where a variable with the WPOS semantic was used, this approach works across platforms and does not require #pragma target 3.0.

Shader Window Coordinates Bars
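
To make the comparison concrete, here is a sketch of the earlier base window-coordinates gradient rewritten with ComputeScreenPos, so that neither WPOS nor #pragma target 3.0 is needed (the name Custom/WindowCoordinates/BasePortable is only illustrative):

Shader "Custom/WindowCoordinates/BasePortable" {
	SubShader {
		Pass {
			CGPROGRAM
			#pragma vertex vert
			#pragma fragment frag

			#include "UnityCG.cginc"

			struct vertOut {
				float4 pos : SV_POSITION;
				float4 scrPos : TEXCOORD0;
			};

			vertOut vert(appdata_base v) {
				vertOut o;
				o.pos = mul (UNITY_MATRIX_MVP, v.vertex);
				o.scrPos = ComputeScreenPos(o.pos);
				return o;
			}

			fixed4 frag(vertOut i) : COLOR {
				// The perspective divide by w turns the homogeneous screen
				// position into window coordinates in the 0-1 range.
				float2 wcoord = i.scrPos.xy/i.scrPos.w;
				return fixed4(wcoord,0.0,1.0);
			}
			ENDCG
		}
	}
}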

Vignetting

Shader "Custom/WindowCoordinates/Vignetting" {
	SubShader {
		Pass {
			CGPROGRAM

			#pragma vertex vert
			#pragma fragment frag
			#pragma target 3.0

			#include "UnityCG.cginc"

			float4 vert(appdata_base v) : POSITION {
				return mul (UNITY_MATRIX_MVP, v.vertex);
			}

			float4 frag(float4 sp:WPOS) : COLOR {
				float2 wcoord = sp.xy/_ScreenParams.xy;
				float vig = clamp(3.0*length(wcoord-0.5),0.0,1.0);
				return lerp (float4(wcoord,0.0,1.0),float4(0.3,0.3,0.3,1.0),vig);
			}
			ENDCG
		}
	}
}

Shader Window Coordinates Vignetting
lerp
lerp is a standard function of the Cg language; it performs a linear interpolation between its first two arguments, using the third as the interpolation factor. Many such functions are provided by Cg as standard library functions.
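
As an illustration of what lerp computes, the fragment function of the vignetting shader above could be written without it (a sketch; a drop-in replacement for frag, with the rest of the shader unchanged):

float4 frag(float4 sp:WPOS) : COLOR {
	float2 wcoord = sp.xy/_ScreenParams.xy;
	float vig = clamp(3.0*length(wcoord-0.5),0.0,1.0);
	// lerp(a, b, t) computes a + t*(b - a).
	float4 a = float4(wcoord,0.0,1.0);    // gradient color
	float4 b = float4(0.3,0.3,0.3,1.0);   // gray border color
	return a + vig*(b - a);
}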

Circles Mask

Some shaders need external information supplied as variables, arrays or textures. In ShaderLab these cases are covered by uniforms and properties. Properties define a set of typed values that can be passed to the shader on every run, and these values can modify the behavior of the shader. The following example shows how to change the values of a shader that masks the screen with circles: the number of rows and columns can be changed, as can the size of the circles.

Shader "Custom/WindowCoordinates/CirclesMask" {
	Properties {
		_CirclesX ("Circles in X", Float) = 20
		_CirclesY ("Circles in Y", Float) = 10
		_Fade ("Fade", Range (0.1,1.0)) = 0.5
	}
	SubShader {
		Pass {

			CGPROGRAM

			#pragma vertex vert
			#pragma fragment frag
			#pragma target 3.0

			#include "UnityCG.cginc"

			uniform float _CirclesX;
			uniform float _CirclesY;
			uniform float _Fade;

			float4 vert(appdata_base v) : POSITION {
				return mul (UNITY_MATRIX_MVP, v.vertex);
			}

			float4 frag(float4 sp:WPOS) : COLOR {
				float2 wcoord = sp.xy/_ScreenParams.xy;
				float4 color;
				if (length(fmod(float2(_CirclesX*wcoord.x,_CirclesY*wcoord.y),2.0)-1.0)<_Fade) {
					color = float4(sp.xy/_ScreenParams.xy,0.0,1.0);
				} else {
					color = float4(0.3,0.3,0.3,1.0);
				} 
				return color;
			}
			ENDCG
		}
	}
}
uniform
ShaderLab defines properties as values that are transferred from the application to the shader. Properties can have different types, such as Color, Float, 2D and others. To use the property values inside the shader, the corresponding variables must be declared with the uniform keyword, so the compiler identifies them as external values that are transferred to the shader on each execution.

Shader Window Coordinates Circles Mask
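
For example, here is a minimal sketch of a property/uniform pair (the shader name Custom/WindowCoordinates/Tint and the property name _Tint are only illustrative): the Color value set in the Material inspector is transferred into the uniform _Tint on every run.

Shader "Custom/WindowCoordinates/Tint" {
	Properties {
		_Tint ("Tint Color", Color) = (1,0,0,1)
	}
	SubShader {
		Pass {
			CGPROGRAM

			#pragma vertex vert
			#pragma fragment frag

			#include "UnityCG.cginc"

			// Matching uniform: the name must be identical to the property name.
			uniform float4 _Tint;

			float4 vert(appdata_base v) : POSITION {
				return mul (UNITY_MATRIX_MVP, v.vertex);
			}

			float4 frag() : COLOR {
				// Solid color controlled from the Material inspector.
				return _Tint;
			}
			ENDCG
		}
	}
}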

Texture Coordinates

This lesson presents a set of shaders which use texture coordinates to draw the faces of a mesh with different effects.

The texture position on a mesh is given by the variable texcoord0, which is defined for each vertex. Both the vertex and the fragment shader can modify this variable. You should choose in which shader to modify texcoord0 depending on the variables needed for the operation, the number of vertices in the scene and the screen resolution, aiming for the smallest number of executions (a sketch of modifying texcoord0 in the vertex shader appears after the base example below).

Shader "Custom/TextureCoordinates/Base" {
	SubShader {
		Pass {
			CGPROGRAM
			#pragma vertex vert
			#pragma fragment frag
			#pragma target 3.0

			#include "UnityCG.cginc"

			struct vertexInput {
				float4 vertex : POSITION;
				float4 texcoord0 : TEXCOORD0;
			};

			struct fragmentInput{
				float4 position : SV_POSITION;
				float4 texcoord0 : TEXCOORD0;
			};

			fragmentInput vert(vertexInput i){
				fragmentInput o;
				o.position = mul (UNITY_MATRIX_MVP, i.vertex);
				o.texcoord0 = i.texcoord0;
				return o;
			}
			float4 frag(fragmentInput i) : COLOR {
				return float4(i.texcoord0.xy,0.0,1.0);
			}
			ENDCG
		}
	}
}

The values of texcoord0 range from 0 to 1 and are mapped onto a specific texture. Textures are images stored in GPU memory representing a square or rectangular image; in the fragment shader they can also be replaced by procedural textures, which use mathematical functions to define the color of each pixel on the screen. In the code above, the texcoord0 values were used to set the Red and Green channels of the color, giving a gradient effect for each channel.


Shader Texture Coordinates
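
As an example of modifying texcoord0 in the vertex shader rather than the fragment shader, here is a sketch that tiles the gradient above (the name Custom/TextureCoordinates/Tiled and the factor 4.0 are only illustrative):

Shader "Custom/TextureCoordinates/Tiled" {
	SubShader {
		Pass {
			CGPROGRAM
			#pragma vertex vert
			#pragma fragment frag

			#include "UnityCG.cginc"

			struct vertexInput {
				float4 vertex : POSITION;
				float4 texcoord0 : TEXCOORD0;
			};

			struct fragmentInput{
				float4 position : SV_POSITION;
				float4 texcoord0 : TEXCOORD0;
			};

			fragmentInput vert(vertexInput i){
				fragmentInput o;
				o.position = mul (UNITY_MATRIX_MVP, i.vertex);
				// The scaling is done once per vertex; the interpolated value
				// reaches the fragment shader already multiplied by 4.
				o.texcoord0 = i.texcoord0 * 4.0;
				return o;
			}

			float4 frag(fragmentInput i) : COLOR {
				// frac wraps the scaled coordinate back into the 0-1 range,
				// repeating the gradient in a 4x4 pattern.
				return float4(frac(i.texcoord0.xy),0.0,1.0);
			}
			ENDCG
		}
	}
}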

Examples

Chess

Shader "Custom/TextureCoordinates/Chess" {
	SubShader {
		Pass {
			CGPROGRAM
			#pragma vertex vert
			#pragma fragment frag

			#include "UnityCG.cginc"

			struct vertexInput {
				float4 vertex : POSITION;
				float4 texcoord0 : TEXCOORD0;
			};

			struct fragmentInput{
				float4 position : SV_POSITION;
				float4 texcoord0 : TEXCOORD0;
			};

			fragmentInput vert(vertexInput i){
				fragmentInput o;
				o.position = mul (UNITY_MATRIX_MVP, i.vertex);
				o.texcoord0 = i.texcoord0;
				return o;
			}

			float4 frag(fragmentInput i) : COLOR {
				float4 color;
				if ( fmod(i.texcoord0.x*8.0,2.0) < 1.0  ){
					if ( fmod(i.texcoord0.y*8.0,2.0) < 1.0 )
					{
						color = float4(1.0,1.0,1.0,1.0);
					} else {
						color = float4(0.0,0.0,0.0,1.0);
					}
				} else {
					if ( fmod(i.texcoord0.y*8.0,2.0) > 1.0 )
					{
						color = float4(1.0,1.0,1.0,1.0);
					} else {
						color = float4(0.0,0.0,0.0,1.0);
					}
				}
				return color;
			}
			ENDCG
		}
	}
}

A more compact version of the checkerboard uses the predefined vert_img vertex function and v2f_img structure from UnityCG.cginc and combines the two parity tests into boolean expressions:

Shader "Custom/TextureCoordinates/ChessOpt" {
	SubShader {
		Pass {
			CGPROGRAM
			#pragma vertex vert_img
			#pragma fragment frag

			#include "UnityCG.cginc"

			float4 frag(v2f_img i) : COLOR {
				bool p = fmod(i.uv.x*8.0,2.0) < 1.0;
				bool q = fmod(i.uv.y*8.0,2.0) > 1.0;

				return float4(float3((p && q) || !(p || q)),1.0);
			}
			ENDCG
		}
	}
}

Shader Texture Coordinates Chess

Mandelbrot Fractal

Shader "Custom/TextureCoordinates/Mandelbrot" {

	SubShader {
		Pass {
			CGPROGRAM
			#pragma vertex vert_img
			#pragma fragment frag
			#pragma target 3.0

			#include "UnityCG.cginc"

			float4 frag(v2f_img i) : COLOR {
				float2 mcoord;
				float2 coord = float2(0.0,0.0);
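				// Map the texture coordinate to a point c = (mcoord.x, mcoord.y) in the
				// complex plane, covering roughly x in [-2.5,1.0] and y in [-1.0,1.0].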
				mcoord.x = ((1.0-i.uv.x)*3.5)-2.5;
				mcoord.y = (i.uv.y*2.0)-1.0;
				float iteration = 0.0;
				const float _MaxIter = 29.0;
				const float PI = 3.14159;
				float xtemp;
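				// Escape-time iteration: z = z*z + c, stopping early when |z|^2 exceeds
				// a threshold that oscillates between 0 and 4 over time (driven by _Time).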
				for ( iteration = 0.0; iteration < _MaxIter; iteration += 1.0) {
					if ( coord.x*coord.x + coord.y*coord.y > 2.0*(cos(fmod(_Time.y,2.0*PI))+1.0) )
					break;
					xtemp = coord.x*coord.x - coord.y*coord.y + mcoord.x;
					coord.y = 2.0*coord.x*coord.y + mcoord.y;
					coord.x = xtemp;
				}
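				// Turn the iteration count into a color: val cycles with time, and the
				// three clamped triangle waves below approximate a hue ramp.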
				float val = fmod((iteration/_MaxIter)+_Time.x,1.0);
				float4 color;

				color.r = clamp((3.0*abs(fmod(2.0*val,1.0)-0.5)),0.0,1.0);
				color.g = clamp((3.0*abs(fmod(2.0*val+(1.0/3.0),1.0)-0.5)),0.0,1.0);
				color.b = clamp((3.0*abs(fmod(2.0*val-(1.0/3.0),1.0)-0.5)),0.0,1.0);
				color.a = 1.0;

				return color;
			}
			ENDCG
		}
	}
}

Shader Texture Coordinates Mandelbrot

Texture

Shader "Custom/TextureCoordinates/Lenna" {
	Properties {
		_MainTex ("Base (RGB)", 2D) = "white" {}
	}
	SubShader {
		Pass {
			CGPROGRAM
			#pragma vertex vert_img
			#pragma fragment frag

			#include "UnityCG.cginc"

			uniform sampler2D _MainTex;

			float4 frag(v2f_img i) : COLOR {
				return tex2D(_MainTex, i.uv);
			}
			ENDCG
		}
	}
}

Shader Texture Coordinates Lenna

Page last updated: 2013-07-19