Post-processing is a way of applying effects to rendered images in Unity.
Any Unity script that uses the OnRenderImage function can act as a post-processing effect. Add it to a Camera GameObject for the script to perform post-processing.
The OnRenderImage Unity Scripting API function receives two arguments:
The source image as a RenderTexture
The destination it should render into, which is a RenderTexture as well.
Post-processing effects often use Shaders. These read the source image, do some calculations on it, and render the result into the destination (using Graphics.Blit, for example). The post-processing effect fully replaces all the pixels of the destination.
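For example, a minimal effect might look like the following sketch. The class name and the material here are placeholders for illustration, not part of Unity's API:

```csharp
using UnityEngine;

// A minimal post-processing effect. Attach it to a Camera GameObject and
// assign a material in the Inspector; the material itself is a placeholder.
[RequireComponent(typeof(Camera))]
public class ExamplePostEffect : MonoBehaviour
{
    // Material whose shader reads the source image and computes the result.
    public Material effectMaterial;

    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        if (effectMaterial != null)
        {
            // Run the source image through the material's shader and write
            // the result into the destination.
            Graphics.Blit(source, destination, effectMaterial);
        }
        else
        {
            // Pass the image through unchanged so the effect chain is not broken.
            Graphics.Blit(source, destination);
        }
    }
}
```

Because the Blit into the destination is the last rendering operation here, the destination render texture is left as the active render target when OnRenderImage finishes, as described in the list below.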
Cameras can have multiple post-processing effects, each as components. Unity executes them as a stack, in the order they are listed in the Inspector, with the post-processing component at the top of the Inspector rendered first. In this situation, the result of the first post-processing component is passed as the “source image” to the next post-processing component. Internally, Unity creates one or more temporary render textures to keep these intermediate results in.
Note that the list of post-processing components in the post-processing stack does not specify the order they are applied in.
Things to keep in mind:
The destination render texture can be null, which means “render to screen” (that is, the back buffer). This happens on the last post-processing effect on a Camera.
When OnRenderImage finishes, Unity expects that the destination render texture is the active render target. Generally, a Graphics.Blit or manual rendering into the destination texture should be the last rendering operation.
Turn off depth buffer writes and tests in your post-processing effect shaders. This ensures that Graphics.Blit does not write unintended values into the destination Z buffer. Almost all post-processing shader passes should contain the Cull Off ZWrite Off ZTest Always states.
To use stencil or depth buffer values from the original scene render, explicitly bind the depth buffer from the original scene render as your depth target, using Graphics.SetRenderTarget. Pass the depth buffer of the very first image effect's source as the depth buffer to bind.
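A hedged sketch of this, assuming a field named firstEffectSource in which the very first effect on the Camera has cached its source render texture (a name invented for this example):

```csharp
using UnityEngine;

public class DepthAwareEffect : MonoBehaviour
{
    // Hypothetical field: the source RenderTexture of the very first image
    // effect on this Camera, cached elsewhere for the purposes of this sketch.
    RenderTexture firstEffectSource;

    void RenderWithSceneDepth(RenderTexture colorTarget)
    {
        // Bind our colour target together with the depth buffer from the
        // original scene render, so draw calls can use the scene's depth
        // and stencil values.
        Graphics.SetRenderTarget(colorTarget.colorBuffer, firstEffectSource.depthBuffer);

        // ... issue draw calls here with shaders that use ZTest or Stencil ...
    }
}
```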
By default, Unity executes post-processing effects after it renders a whole Scene. In some cases, you may prefer Unity to render post-processing effects after it has rendered all opaque objects in your scene but before it renders others (for example, before the skybox or transparencies). Depth-based effects like Depth of Field often use this.
To do this, add the ImageEffectOpaque attribute to the OnRenderImage function.
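For example (a sketch; the material is a placeholder):

```csharp
using UnityEngine;

public class AfterOpaqueEffect : MonoBehaviour
{
    public Material effectMaterial; // placeholder material for this sketch

    // ImageEffectOpaque makes Unity call this after all opaque geometry has
    // been rendered, but before the skybox and transparent objects.
    [ImageEffectOpaque]
    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        Graphics.Blit(source, destination, effectMaterial);
    }
}
```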
If a post-processing effect is sampling different screen-related textures at once, you might need to be aware of how different platforms use texture coordinates. A common scenario is that the effect “source” texture and camera’s depth texture need different vertical coordinates, depending on anti-aliasing settings. See the Unity User Manual Platform Differences page for more information.
Depth Textures are often used in image post-processing to get the distance to the closest opaque surface for each pixel on screen.
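To make a Camera generate a depth texture that such effects can sample (through the global _CameraDepthTexture property in shaders), a typical sketch is:

```csharp
using UnityEngine;

[RequireComponent(typeof(Camera))]
public class RequireDepthTexture : MonoBehaviour
{
    void OnEnable()
    {
        // Ask the Camera to render a depth texture, which shaders can then
        // sample through the global _CameraDepthTexture property.
        GetComponent<Camera>().depthTextureMode |= DepthTextureMode.Depth;
    }
}
```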
For HDR (high dynamic range) rendering, the ImageEffectTransformsToLDR attribute indicates that the effect applies tonemapping.
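The attribute is applied to OnRenderImage in the same way; the tonemapping material below is a placeholder for this sketch:

```csharp
using UnityEngine;

public class TonemapEffect : MonoBehaviour
{
    public Material tonemapMaterial; // placeholder tonemapping material

    // Indicates that this effect converts the image from HDR to LDR
    // (for example, by applying tonemapping).
    [ImageEffectTransformsToLDR]
    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        Graphics.Blit(source, destination, tonemapMaterial);
    }
}
```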
You can also use Command Buffers to perform post-processing.
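As a sketch of that approach, the following copies the camera target through a placeholder material at one possible point in the camera's rendering:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

[RequireComponent(typeof(Camera))]
public class CommandBufferEffect : MonoBehaviour
{
    public Material effectMaterial; // placeholder material for this sketch
    CommandBuffer buffer;

    void OnEnable()
    {
        buffer = new CommandBuffer { name = "Example post-processing" };

        // Copy the current camera target into a temporary texture, run it
        // through the material, and write the result back.
        int tempId = Shader.PropertyToID("_ExampleTemp");
        buffer.GetTemporaryRT(tempId, -1, -1);
        buffer.Blit(BuiltinRenderTextureType.CameraTarget, tempId);
        buffer.Blit(tempId, BuiltinRenderTextureType.CameraTarget, effectMaterial);
        buffer.ReleaseTemporaryRT(tempId);

        // Run the commands after the built-in image effects (one possible hook point).
        GetComponent<Camera>().AddCommandBuffer(CameraEvent.AfterImageEffects, buffer);
    }

    void OnDisable()
    {
        GetComponent<Camera>().RemoveCommandBuffer(CameraEvent.AfterImageEffects, buffer);
    }
}
```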
Use RenderTexture.GetTemporary to get temporary render textures and do calculations inside a post-processing effect.
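For example, a two-pass effect might downsample into a half-resolution temporary texture and then blit the result to the destination (a sketch with a placeholder material):

```csharp
using UnityEngine;

public class TwoPassEffect : MonoBehaviour
{
    public Material blurMaterial; // placeholder material for this sketch

    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        // Get a half-resolution temporary render texture for the intermediate result.
        RenderTexture temp = RenderTexture.GetTemporary(source.width / 2, source.height / 2, 0);

        // First pass: downsample the source into the temporary texture.
        Graphics.Blit(source, temp, blurMaterial);

        // Second pass: write the final result into the destination.
        Graphics.Blit(temp, destination);

        // Always release temporary render textures when finished with them.
        RenderTexture.ReleaseTemporary(temp);
    }
}
```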
See also the Unity User Manual page on Writing Shader Programs.
2017–05–24 Page published with no editorial review
New feature in 5.6