Some rendering effects require rendering the scene with a different set of shaders. For example, a good edge-detection effect needs a texture containing scene normals, so that it can detect edges where surface orientations differ. Other effects might need a texture with scene depth, and so on. To achieve this, you can render the scene with the shaders of all objects replaced.
Shader replacement is done from a script using the Camera.RenderWithShader or Camera.SetReplacementShader functions. Both functions take a shader and a replacementTag string.
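For example, the following C# sketch (the component and field names are placeholders, not part of the API) applies a replacement shader to a camera while the component is enabled, using "RenderType" as the replacementTag:

```csharp
using UnityEngine;

// Minimal sketch: render with a replacement shader while this component is enabled.
[RequireComponent(typeof(Camera))]
public class ReplacementShaderEffect : MonoBehaviour
{
    public Shader replacementShader;   // assign your replacement shader in the Inspector

    void OnEnable()
    {
        if (replacementShader != null)
            // Render every subsequent frame with the replacement shader,
            // matching subshaders by their "RenderType" tag value.
            GetComponent<Camera>().SetReplacementShader(replacementShader, "RenderType");
    }

    void OnDisable()
    {
        // Restore normal rendering.
        GetComponent<Camera>().ResetReplacementShader();
    }
}
```

SetReplacementShader keeps the replacement active for all subsequent frames until ResetReplacementShader is called, whereas RenderWithShader performs a single one-shot render (see the example further below).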
It works like this: the camera renders the scene as it normally would, and the objects still use their materials, but the actual shader that ends up being used is changed:
- If replacementTag is empty, every object in the scene is rendered with the given replacement shader.
- If replacementTag is not empty, then for each object that would be rendered, its shader is queried for the given tag. If the shader does not have that tag, the object is not rendered. Otherwise, a subshader with a matching tag value is looked up in the replacement shader; if no such subshader exists, the object is not rendered, and if one is found, it is used to render the object.
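The per-object decision can be illustrated with a conceptual sketch. This is not how the engine implements replacement internally; it only mirrors the tag lookup described above, using the real Material.GetTag API to read a shader's tag value. The set of subshader tag values is passed in as a parameter, because a script cannot enumerate a replacement shader's subshaders directly:

```csharp
using UnityEngine;
using System.Collections.Generic;

public static class ReplacementLogicSketch
{
    // Conceptual sketch of the per-object decision the camera makes.
    // 'replacementSubshaderTagValues' stands in for the tag values of the
    // replacement shader's subshaders (a hypothetical input for illustration).
    public static bool WouldBeRendered(Material objectMaterial,
                                       string replacementTag,
                                       HashSet<string> replacementSubshaderTagValues)
    {
        // Empty tag: all objects are rendered with the replacement shader.
        if (string.IsNullOrEmpty(replacementTag))
            return true;

        // Query the object's own shader for the tag value (e.g. "Opaque").
        string value = objectMaterial.GetTag(replacementTag, true, "");

        // No such tag on the object's shader: the object is not rendered.
        if (value == "")
            return false;

        // The object is rendered only if the replacement shader contains a
        // subshader whose tag has the same value.
        return replacementSubshaderTagValues.Contains(value);
    }
}
```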
So if all your shaders had, for example, a "RenderType" tag with values like "Opaque", "Transparent", "Background", "Overlay", you could write a replacement shader that renders only opaque objects by using a single subshader with the RenderType = "Opaque" tag. Subshaders for the other tag values would not exist in the replacement shader, so those objects would not be rendered. Alternatively, you could write several subshaders for different "RenderType" tag values.
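The sketch below shows a one-shot render of that kind using Camera.RenderWithShader. The component name and the shader field are placeholders; the assigned shader is assumed to contain a single subshader tagged RenderType = "Opaque", so only opaque objects end up in the result:

```csharp
using UnityEngine;

// Minimal sketch: render the scene once into a RenderTexture with a replacement
// shader whose only subshader is tagged RenderType = "Opaque" (hypothetical asset).
[RequireComponent(typeof(Camera))]
public class OpaqueOnlySnapshot : MonoBehaviour
{
    public Shader opaqueOnlyShader;   // assign the replacement shader in the Inspector
    public RenderTexture target;      // texture to render into

    public void Snapshot()
    {
        Camera cam = GetComponent<Camera>();
        RenderTexture previous = cam.targetTexture;

        cam.targetTexture = target;
        // One-shot render with the replacement shader, matched by "RenderType".
        cam.RenderWithShader(opaqueOnlyShader, "RenderType");
        cam.targetTexture = previous;
    }
}
```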
All built-in Unity shaders have a "RenderType" tag set that can be used when rendering with replaced shaders. Tag values are the following:
- Opaque: most of the shaders (Normal, Self Illuminated, Reflective, terrain shaders).
- Transparent: most semitransparent shaders (Transparent, Particle, Font, terrain additive pass shaders).
- TransparentCutout: masked transparency shaders (Transparent Cutout, two pass vegetation shaders).
- Background: skybox shaders.
- Overlay: GUITexture, Halo, Flare shaders.
- TreeOpaque, TreeTransparentCutout, TreeBillboard, Grass, GrassBillboard: terrain engine tree bark, tree leaves, billboarded trees, grass, and billboarded grass, respectively.
A Camera has a built-in capability to render a depth or depth+normals texture, if you need one in your effects; see the Camera Depth Texture page. Note that in some cases (depending on the hardware), the depth and depth+normals textures can internally be rendered using shader replacement, so it is important to have the correct "RenderType" tag in your shaders.
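Requesting the built-in depth+normals texture from script takes a single line (the component name below is just a placeholder):

```csharp
using UnityEngine;

// Minimal sketch: ask the camera to generate its built-in depth+normals texture,
// which shaders can then sample as _CameraDepthNormalsTexture.
[RequireComponent(typeof(Camera))]
public class EnableDepthNormalsTexture : MonoBehaviour
{
    void OnEnable()
    {
        GetComponent<Camera>().depthTextureMode |= DepthTextureMode.DepthNormals;
    }
}
```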
Page last updated: 2012-06-21