Version: 2019.3

Writing Surface Shaders

Writing shaders that interact with lighting is complex. There are different light types, different shadow options, and different rendering paths (forward and deferred rendering), and the shader should somehow handle all that complexity.

Surface Shaders in Unity are a code generation approach that makes it much easier to write lit shaders than using low-level vertex/pixel shader programs. Note that there are no custom languages, magic or ninjas involved in Surface Shaders; the compiler just generates all the repetitive code that would otherwise have to be written by hand. You still write shader code in HLSL.

For some examples, take a look at Surface Shader Examples and Surface Shader Custom Lighting Examples.

How it works

You define a “surface function” that takes any UVs or data you need as input, and fills in the output structure SurfaceOutput. SurfaceOutput basically describes the properties of the surface (its albedo color, normal, emission, specularity, etc.). You write this code in HLSL.

The Surface Shader compiler then figures out what inputs are needed and which outputs are filled in, and generates the actual vertex and pixel shaders, as well as rendering passes to handle forward and deferred rendering.

The standard output structure of surface shaders is this:

struct SurfaceOutput
{
    fixed3 Albedo;  // diffuse color
    fixed3 Normal;  // tangent space normal, if written
    fixed3 Emission;
    half Specular;  // specular power in 0..1 range
    fixed Gloss;    // specular intensity
    fixed Alpha;    // alpha for transparencies
};
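
For orientation, here is a minimal sketch of a complete surface shader that fills in this struct using the built-in Lambert lighting model; the shader name and the _MainTex property are placeholders chosen for this example:

Shader "Custom/SimpleLambert"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader
    {
        Tags { "RenderType" = "Opaque" }
        CGPROGRAM
        // surf fills in the SurfaceOutput struct above; Lambert is the lighting model
        #pragma surface surf Lambert

        sampler2D _MainTex;

        struct Input
        {
            float2 uv_MainTex;   // texture coordinates for _MainTex
        };

        void surf (Input IN, inout SurfaceOutput o)
        {
            fixed4 c = tex2D (_MainTex, IN.uv_MainTex);
            o.Albedo = c.rgb;
            o.Alpha = c.a;
        }
        ENDCG
    }
    Fallback "Diffuse"
}

Everything outside the surf function (the actual passes and lighting interaction) is generated by the surface shader compiler.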

In Unity 5 and later, surface shaders can also use physically based lighting models. The built-in Standard and StandardSpecular lighting models (see below) use these output structures, respectively:

struct SurfaceOutputStandard
{
    fixed3 Albedo;      // base (diffuse or specular) color
    fixed3 Normal;      // tangent space normal, if written
    half3 Emission;
    half Metallic;      // 0=non-metal, 1=metal
    half Smoothness;    // 0=rough, 1=smooth
    half Occlusion;     // occlusion (default 1)
    fixed Alpha;        // alpha for transparencies
};
struct SurfaceOutputStandardSpecular
{
    fixed3 Albedo;      // diffuse color
    fixed3 Specular;    // specular color
    fixed3 Normal;      // tangent space normal, if written
    half3 Emission;
    half Smoothness;    // 0=rough, 1=smooth
    half Occlusion;     // occlusion (default 1)
    fixed Alpha;        // alpha for transparencies
};
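
As a hedged sketch of how the metallic workflow struct is typically filled in, here is the CGPROGRAM portion of a shader using the Standard lighting model; the _MainTex, _Glossiness and _Metallic property names are assumptions for this example:

#pragma surface surf Standard fullforwardshadows
// The physically based models generally look best with shader model 3.0
#pragma target 3.0

sampler2D _MainTex;
half _Glossiness;
half _Metallic;

struct Input
{
    float2 uv_MainTex;
};

void surf (Input IN, inout SurfaceOutputStandard o)
{
    fixed4 c = tex2D (_MainTex, IN.uv_MainTex);
    o.Albedo = c.rgb;
    o.Metallic = _Metallic;       // 0 = non-metal, 1 = metal
    o.Smoothness = _Glossiness;   // 0 = rough, 1 = smooth
    o.Alpha = c.a;
}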

Samples

See Surface Shader Examples, Surface Shader Custom Lighting Examples and Surface Shader Tessellation pages.

Surface Shader compile directives

A surface shader is placed inside a CGPROGRAM..ENDCG block, just like any other shader. The differences are:

  • It must be placed inside a SubShader block, not inside a Pass. A surface shader will compile into multiple passes itself.
  • It uses the #pragma surface ... directive to indicate that it’s a surface shader.

The #pragma surface directive is:

#pragma surface surfaceFunction lightModel [optionalparams]

Required parameters

  • surfaceFunction - the Cg function that contains the surface shader code. The function should have the form of void surf (Input IN, inout SurfaceOutput o), where Input is a structure you have defined. Input should contain any texture coordinates and extra automatic variables needed by the surface function.
  • lightModel - the lighting model to use. Built-in ones are the physically based Standard and StandardSpecular, as well as the simple non-physically based Lambert (diffuse) and BlinnPhong (specular). See the Custom Lighting Models page for how to write your own.
    • The Standard lighting model uses the SurfaceOutputStandard output struct, and matches the Standard (metallic workflow) shader in Unity.
    • The StandardSpecular lighting model uses the SurfaceOutputStandardSpecular output struct, and matches the Standard (specular setup) shader in Unity. A short sketch using this model follows this list.
    • The Lambert and BlinnPhong lighting models are not physically based (they come from Unity 4.x), but shaders using them can be faster to render on low-end hardware.
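
To show how the lightModel parameter and the surface function fit together, here is a hedged sketch for the StandardSpecular model; surf, _MainTex, _SpecularTint and _Smoothness are names assumed for this example:

#pragma surface surf StandardSpecular

sampler2D _MainTex;
fixed3 _SpecularTint;   // assumed specular color property
half _Smoothness;

struct Input
{
    float2 uv_MainTex;
};

// The inout parameter type must match the chosen lighting model:
// StandardSpecular expects SurfaceOutputStandardSpecular.
void surf (Input IN, inout SurfaceOutputStandardSpecular o)
{
    fixed4 c = tex2D (_MainTex, IN.uv_MainTex);
    o.Albedo = c.rgb;
    o.Specular = _SpecularTint;
    o.Smoothness = _Smoothness;
    o.Alpha = c.a;
}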

Optional parameters

Transparency and alpha testing are controlled by the alpha and alphatest directives. Transparency can typically be of two kinds: traditional alpha blending (used for fading objects out) or the more physically plausible “premultiplied blending” (which allows semitransparent surfaces to retain proper specular reflections). Enabling semitransparency makes the generated surface shader code contain blending commands, whereas enabling alpha cutout will do a fragment discard in the generated pixel shader, based on the given variable.

  • alpha or alpha:auto - Will pick fade-transparency (same as alpha:fade) for simple lighting functions, and premultiplied transparency (same as alpha:premul) for physically based lighting functions.
  • alpha:blend - Enable alpha blending.
  • alpha:fade - Enable traditional fade-transparency.
  • alpha:premul - Enable premultiplied alpha transparency.
  • alphatest:VariableName - Enable alpha cutout transparency. The cutoff value is in a float variable with VariableName. You’ll likely also want to use the addshadow directive to generate a proper shadow caster pass (see the sketch after this list).
  • keepalpha - By default opaque surface shaders write 1.0 (white) into alpha channel, no matter what’s output in the Alpha of output struct or what’s returned by the lighting function. Using this option allows keeping lighting function’s alpha value even for opaque surface shaders.
  • decal:add - Additive decal shader (e.g. terrain AddPass). This is meant for objects that lie on top of other surfaces and use additive blending. See Surface Shader Examples.
  • decal:blend - Semitransparent decal shader. This is meant for objects that lie on top of other surfaces and use alpha blending. See Surface Shader Examples.
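
A hedged sketch of the alpha cutout case; _MainTex and _Cutoff are assumed material properties (with _Cutoff normally exposed as a Range(0,1) in the Properties block):

#pragma surface surf Lambert alphatest:_Cutoff addshadow

sampler2D _MainTex;
fixed _Cutoff;   // cutoff threshold used by the generated discard

struct Input
{
    float2 uv_MainTex;
};

void surf (Input IN, inout SurfaceOutput o)
{
    fixed4 c = tex2D (_MainTex, IN.uv_MainTex);
    o.Albedo = c.rgb;
    o.Alpha = c.a;   // fragments with alpha below _Cutoff are discarded
}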

Custom modifier functions can be used to alter or compute incoming vertex data, or to alter the final computed fragment color (a combined sketch follows the list below).

  • vertex:VertexFunction - Custom vertex modification function. This function is invoked at the start of the generated vertex shader, and can modify or compute per-vertex data. See Surface Shader Examples.
  • finalcolor:ColorFunction - Custom final color modification function. See Surface Shader Examples.
  • finalgbuffer:ColorFunction - Custom deferred path for altering gbuffer content.
  • finalprepass:ColorFunction - Custom prepass base path.
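
A hedged sketch combining a vertex modifier with a final color modifier; the function names vert and mycolor, and the _Amount and _ColorTint properties, are assumptions for this example:

#pragma surface surf Lambert vertex:vert finalcolor:mycolor

sampler2D _MainTex;
float _Amount;
fixed4 _ColorTint;

struct Input
{
    float2 uv_MainTex;
};

// Runs at the start of the generated vertex shader;
// here it pushes each vertex outwards along its normal.
void vert (inout appdata_full v)
{
    v.vertex.xyz += v.normal * _Amount;
}

void surf (Input IN, inout SurfaceOutput o)
{
    o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
}

// Runs after lighting; here it tints the final fragment color.
void mycolor (Input IN, SurfaceOutput o, inout fixed4 color)
{
    color *= _ColorTint;
}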

Shadows and Tessellation - additional directives can be given to control how shadows and tessellation are handled.

  • addshadow - Generate a shadow caster pass. Commonly used with custom vertex modification, so that shadow casting also gets any procedural vertex animation. Often shaders don’t need any special shadow handling, as they can just use the shadow caster pass from their fallback.
  • fullforwardshadows - Support all light shadow types in the Forward rendering path. By default, shaders only support shadows from one directional light in forward rendering (to save on internal shader variant count). If you need point or spot light shadows in forward rendering, use this directive.
  • tessellate:TessFunction - use DX11 GPU tessellation; the function computes tessellation factors. See Surface Shader Tessellation for details.

Code generation options - by default, the generated surface shader code tries to handle all possible lighting/shadowing/lightmap scenarios. However, in some cases you know you won’t need some of them, and it is possible to adjust the generated code to skip them. This can result in smaller shaders that are faster to load. A combined example follows the list below.

  • exclude_path:deferred, exclude_path:forward, exclude_path:prepass - Do not generate passes for the given rendering path (Deferred Shading, Forward and Legacy Deferred respectively).
  • noshadow - Disables all shadow receiving support in this shader.
  • noambient - Do not apply any ambient lighting or light probes.
  • novertexlights - Do not apply any light probes or per-vertex lights in Forward rendering.
  • nolightmap - Disables all lightmapping support in this shader.
  • nodynlightmap - Disables runtime dynamic global illumination support in this shader.
  • nodirlightmap - Disables directional lightmaps support in this shader.
  • nofog - Disables all built-in Fog support.
  • nometa - Does not generate a “meta” pass (that’s used by lightmapping & dynamic global illumination to extract surface information).
  • noforwardadd - Disables Forward rendering additive pass. This makes the shader support one full directional light, with all other lights computed per-vertex/SH. Makes shaders smaller as well.
  • nolppv - Disables Light Probe Proxy Volume support in this shader.
  • noshadowmask - Disables Shadowmask support for this shader (both Shadowmask and Distance Shadowmask).

Miscellaneous options

  • softvegetation - Makes the surface shader only be rendered when Soft Vegetation is on.
  • interpolateview - Compute view direction in the vertex shader and interpolate it, instead of computing it in the pixel shader. This can make the pixel shader faster, but uses up one more texture interpolator.
  • halfasview - Pass half-direction vector into the lighting function instead of view-direction. Half-direction will be computed and normalized per vertex. This is faster, but not entirely correct.
  • approxview - Removed in Unity 5.0. Use interpolateview instead.
  • dualforward - Use dual lightmaps in forward rendering path.
  • dithercrossfade - Makes the surface shader support dithering effects. You can then apply this shader to GameObjects that use an LOD Group component configured for cross-fade transition mode.

To see exactly what is different when using the different options above, it can be helpful to use the “Show Generated Code” button in the Shader Inspector.

Surface Shader input structure

The input structure Input generally contains any texture coordinates needed by the shader. Texture coordinates must be named “uv” followed by the texture name (or start with “uv2” to use the second texture coordinate set).

Additional values that can be put into the Input structure (a combined sketch follows this list):

  • float3 viewDir - contains view direction, for computing Parallax effects, rim lighting etc.
  • float4 with COLOR semantic - contains interpolated per-vertex color.
  • float4 screenPos - contains screen space position for reflection or screenspace effects. Note that this is not suitable for GrabPass; you need to compute the custom UV yourself using the ComputeGrabScreenPos function.
  • float3 worldPos - contains world space position.
  • float3 worldRefl - contains world reflection vector if surface shader does not write to o.Normal. See Reflect-Diffuse shader for example.
  • float3 worldNormal - contains world normal vector if surface shader does not write to o.Normal.
  • float3 worldRefl; INTERNAL_DATA - contains world reflection vector if surface shader writes to o.Normal. To get the reflection vector based on per-pixel normal map, use WorldReflectionVector (IN, o.Normal). See Reflect-Bumped shader for example.
  • float3 worldNormal; INTERNAL_DATA - contains world normal vector if surface shader writes to o.Normal. To get the normal vector based on per-pixel normal map, use WorldNormalVector (IN, o.Normal).
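
A hedged sketch of an Input structure combining several of these values, following the Reflect-Bumped pattern referenced above; _MainTex, _BumpMap and _Cube are assumed texture properties:

sampler2D _MainTex;
sampler2D _BumpMap;
samplerCUBE _Cube;

struct Input
{
    float2 uv_MainTex;   // UVs for _MainTex (first UV set)
    float2 uv_BumpMap;   // UVs for _BumpMap
    float3 worldRefl;    // world reflection vector
    INTERNAL_DATA        // required because surf writes to o.Normal below
};

void surf (Input IN, inout SurfaceOutput o)
{
    o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
    o.Normal = UnpackNormal (tex2D (_BumpMap, IN.uv_BumpMap));
    // Since o.Normal is written, use WorldReflectionVector to get the
    // per-pixel reflection vector for the cubemap lookup.
    o.Emission = texCUBE (_Cube, WorldReflectionVector (IN, o.Normal)).rgb;
}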

Surface Shaders and DirectX 11 HLSL syntax

Currently, some parts of the surface shader compilation pipeline do not understand DirectX 11-specific HLSL syntax, so if you’re using HLSL features like StructuredBuffers, RWTextures and other non-DX9 syntax, you have to wrap it in a DX11-only preprocessor macro.
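
A hedged sketch of such a wrap; SHADER_API_D3D11 is the built-in platform define for DirectX 11, and _MyBuffer is a name assumed for this example:

#ifdef SHADER_API_D3D11
// DX11-only resource; other compile targets never see this declaration
StructuredBuffer<float4> _MyBuffer;
#endif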

See Platform Specific Differences and Shading Language pages for details.


  • 2017–06–08 Page published

  • noshadowmask added in 5.6
