Legacy Documentation: Version 2018.2

Writing post-processing effects

Post-processing is a way of applying effects to rendered images in Unity.

Any Unity script that uses the OnRenderImage function can act as a post-processing effect. Add it to a Camera GameObject for the script to perform post-processing.

OnRenderImage function

The OnRenderImage Unity Scripting API function receives two arguments:

  • The source image as a RenderTexture

  • The destination it should render into, which is a RenderTexture as well.

Post-processing effects often use Shaders. These read the source image, do some calculations on it, and render the result into the destination (using Graphics.Blit, for example). The post-processing effect fully replaces all the pixels of the destination.
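
For illustration only (a minimal sketch, not code from this manual), a simple effect component might look like the following; the effectMaterial field is an assumption, and any shader that samples _MainTex would work with it:

```csharp
using UnityEngine;

[RequireComponent(typeof(Camera))]
public class SimplePostEffect : MonoBehaviour
{
    // Placeholder material using a post-processing shader; assign it in the Inspector.
    public Material effectMaterial;

    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        if (effectMaterial != null)
        {
            // Read the source image, run the material's shader over it,
            // and write every pixel of the destination.
            Graphics.Blit(source, destination, effectMaterial);
        }
        else
        {
            // No material assigned: pass the image through unchanged.
            Graphics.Blit(source, destination);
        }
    }
}
```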

Cameras can have multiple post-processing effects, each as components. Unity executes them as a stack, in the order they are listed in the Inspector, with the post-processing component at the top of the Inspector rendered first. In this situation, the result of the first post-processing component is passed as the “source image” to the next post-processing component. Internally, Unity creates one or more temporary render textures to keep these intermediate results in.

Note that the order of the post-processing components in the Inspector determines the order in which Unity applies them.

Things to keep in mind:

  • The destination render texture can be null, which means “render to screen” (that is, the back buffer). This happens on the last post-processing effect on a Camera.

  • When OnRenderImage finishes, Unity expects that the destination render texture is the active render target. Generally, a Graphics.Blit or manual rendering into the destination texture should be the last rendering operation.

  • Turn off depth buffer writes and tests in your post-processing effect shaders. This ensures that Graphics.Blit does not write unintended values into the destination Z buffer. Almost all post-processing shader passes should contain Cull Off ZWrite Off ZTest Always states.

  • To use stencil or depth buffer values from the original scene render, explicitly bind the depth buffer from the original scene render as your depth target, using Graphics.SetRenderTarget. Pass the very first source image effect's depth buffer as the depth buffer to bind (see the sketch after this list).
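
As a hedged sketch of that depth-binding step (not the manual's own code, and assuming this is the very first image effect on the Camera, so its source texture still carries the original scene's depth buffer):

```csharp
using UnityEngine;

public class DepthAwareEffect : MonoBehaviour
{
    public Material effectMaterial; // placeholder material for this sketch

    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        // Render into the destination's color buffer while keeping the depth/stencil
        // buffer of the original scene render bound as the depth target.
        // (A real effect would also handle destination == null, which means the screen.)
        Graphics.SetRenderTarget(destination.colorBuffer, source.depthBuffer);

        // Draw a full-screen quad or extra geometry with effectMaterial here;
        // its shader can now test against the scene's depth and stencil values.

        // Leave the destination as the active render target when OnRenderImage finishes.
    }
}
```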

After opaque post-processing effects

By default, Unity executes post-processing effects after it renders a whole Scene. In some cases, you may prefer Unity to render post-processing effects after it has rendered all opaque objects in your scene but before it renders others (for example, before the skybox or transparencies). Depth-based effects like Depth of Field often use this.

To do this, add an ImageEffectOpaque attribute on the OnRenderImage Unity Scripting API function.
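
A minimal sketch of where the attribute goes (the placeholder material is an assumption, not code from this manual):

```csharp
using UnityEngine;

public class AfterOpaqueEffect : MonoBehaviour
{
    public Material effectMaterial; // placeholder material for this sketch

    // Runs after all opaque geometry has been rendered, but before
    // the skybox and transparent objects.
    [ImageEffectOpaque]
    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        Graphics.Blit(source, destination, effectMaterial);
    }
}
```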

Texture coordinates on different platforms

If a post-processing effect is sampling different screen-related textures at once, you might need to be aware of how different platforms use texture coordinates. A common scenario is that the effect “source” texture and camera’s depth texture need different vertical coordinates, depending on anti-aliasing settings. See the Unity User Manual Platform Differences page for more information.

Related topics

  • Depth Textures are often used in image post-processing to get the distance to the closest opaque surface for each pixel on screen.

  • For HDR rendering, the ImageEffectTransformsToLDR attribute indicates using tonemapping.

  • You can also use Command Buffers to perform post-processing.

  • Use RenderTexture.GetTemporary to get temporary render textures and do calculations inside a post-processing effect (see the sketch after this list).

  • See also the Unity User Manual page on Writing Shader Programs.
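
As a hedged sketch of the RenderTexture.GetTemporary point above (the two-pass material is an assumption, not part of this manual):

```csharp
using UnityEngine;

public class TwoPassEffect : MonoBehaviour
{
    public Material effectMaterial; // placeholder; assumed to have two shader passes

    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        // Temporary render texture for the intermediate result.
        RenderTexture temp = RenderTexture.GetTemporary(
            source.width, source.height, 0, source.format);

        // First pass: source -> temp.
        Graphics.Blit(source, temp, effectMaterial, 0);

        // Second pass: temp -> destination (a null destination means the screen).
        Graphics.Blit(temp, destination, effectMaterial, 1);

        // Always release temporary render textures when finished with them.
        RenderTexture.ReleaseTemporary(temp);
    }
}
```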

