Version: 2018.4

Extending the Built-in Render Pipeline with Command Buffers

A Command Buffer holds a list of rendering commands. You can instruct Unity to schedule and execute those commands.

This page contains information on using Command Buffers in the Built-in Render Pipeline. For information on using Command Buffers in the Scriptable Rendering Pipeline, see the ScriptableRenderContext API documentation.

For a full list of the commands that you can execute using Command Buffers, see the CommandBuffer API documentation.

Executing Command Buffers immediately

You can execute Command Buffers immediately using the Graphics.ExecuteCommandBuffer API.
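A minimal sketch of immediate execution, assuming a MonoBehaviour on a camera with a mesh and material assigned in the Inspector (both fields are placeholders for this example):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public class ExecuteImmediately : MonoBehaviour
{
    public Mesh mesh;         // assumed: assigned in the Inspector
    public Material material; // assumed: assigned in the Inspector

    void OnPostRender()
    {
        // Build a Command Buffer and execute it right away,
        // instead of scheduling it at a camera or light event.
        var buffer = new CommandBuffer { name = "Immediate draw" };
        buffer.DrawMesh(mesh, Matrix4x4.identity, material);
        Graphics.ExecuteCommandBuffer(buffer);
        buffer.Release();
    }
}
```

Because the buffer runs immediately, nothing is retained between frames; the buffer is released as soon as it has executed.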

Scheduling Command Buffers

In the Built-in Render Pipeline, you can execute Command Buffers at specific times in the render loop; to do this, use the Camera.AddCommandBuffer API with the CameraEvent enum, and the Light.AddCommandBuffer API with the LightEvent enum.

For example, you can use a Command Buffer with the AfterGBuffer CameraEvent to render additional GameObjects into the Deferred pipeline, after the pipeline processes all opaque GameObjects.
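The scheduling pattern above can be sketched as follows, assuming a component attached to a camera that uses the Deferred rendering path (the mesh and material fields are placeholders):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

[RequireComponent(typeof(Camera))]
public class DrawAfterGBuffer : MonoBehaviour
{
    public Mesh mesh;         // assumed: assigned in the Inspector
    public Material material; // assumed: assigned in the Inspector

    CommandBuffer buffer;
    Camera cam;

    void OnEnable()
    {
        cam = GetComponent<Camera>();
        buffer = new CommandBuffer { name = "After G-buffer draw" };
        buffer.DrawMesh(mesh, Matrix4x4.identity, material);
        // Schedule the buffer: Unity executes it every frame,
        // immediately after the G-buffer pass.
        cam.AddCommandBuffer(CameraEvent.AfterGBuffer, buffer);
    }

    void OnDisable()
    {
        // Remove and release the buffer so it does not keep
        // executing, or leak, while this component is disabled.
        cam.RemoveCommandBuffer(CameraEvent.AfterGBuffer, buffer);
        buffer.Release();
    }
}
```

Unlike Graphics.ExecuteCommandBuffer, a scheduled buffer stays attached to the camera and runs every frame until you remove it.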

CameraEvent order of execution

The order of execution depends on the rendering path that your Project uses.

Deferred rendering path
1. BeforeGBuffer
2. Unity renders opaque geometry.
3. AfterGBuffer
4. Unity resolves depth.
5. BeforeReflections
6. Unity renders default reflections.
7. Unity renders Reflection Probe reflections.
8. AfterReflections
9. Unity copies reflections to the Emissive channel of the G-buffer.
10. BeforeLighting
11. Unity renders shadows. See LightEvent order of execution.
12. AfterLighting
13. BeforeFinalPass
14. Unity processes the final pass.
15. AfterFinalPass
16. BeforeForwardOpaque (only called if there is opaque geometry that cannot be rendered using deferred)
17. Unity renders opaque geometry that cannot be rendered with deferred rendering.
18. AfterForwardOpaque (only called if there is opaque geometry that cannot be rendered using deferred)
19. BeforeSkybox
20. Unity renders the skybox.
21. AfterSkybox
22. Unity renders halos.
23. BeforeImageEffectsOpaque
24. If you are using the Post-processing Stack V2 package, Unity applies opaque-only post-processing effects.
25. AfterImageEffectsOpaque
26. BeforeForwardAlpha
27. Unity renders transparent geometry, and UI Canvases with a Rendering Mode of Screen Space - Camera.
28. AfterForwardAlpha
29. BeforeHaloAndLensFlares
30. Unity renders lens flares.
31. AfterHaloAndLensFlares
32. BeforeImageEffects
33. If you are using the Post-processing Stack V2 package, Unity applies post-processing effects.
34. AfterImageEffects
35. AfterEverything
36. Unity renders UI Canvases with a Rendering Mode that is not Screen Space - Camera.

Forward rendering path
1. BeforeDepthTexture
2. Unity renders depth for opaque geometry.
3. AfterDepthTexture
4. BeforeDepthNormalsTexture
5. Unity renders depth normals for opaque geometry.
6. AfterDepthNormalsTexture
7. Unity renders shadows. See LightEvent order of execution.
8. BeforeForwardOpaque
9. Unity renders opaque geometry.
10. AfterForwardOpaque
11. BeforeSkybox
12. Unity renders the skybox.
13. AfterSkybox
14. Unity renders halos.
15. BeforeImageEffectsOpaque
16. If you are using the Post-processing Stack V2 package, Unity applies opaque-only post-processing effects.
17. AfterImageEffectsOpaque
18. BeforeForwardAlpha
19. Unity renders transparent geometry, and UI Canvases with a Rendering Mode of Screen Space - Camera.
20. AfterForwardAlpha
21. BeforeHaloAndLensFlares
22. Unity renders lens flares.
23. AfterHaloAndLensFlares
24. BeforeImageEffects
25. If you are using the Post-processing Stack V2 package, Unity applies post-processing effects.
26. AfterImageEffects
27. AfterEverything
28. Unity renders UI Canvases with a Rendering Mode that is not Screen Space - Camera.

LightEvent order of execution

During the “render shadows” stage above, for each shadow-casting Light, Unity performs these steps:

1. BeforeShadowMap
2. For each shadow map pass:
   a. BeforeShadowMapPass
   b. Unity renders all shadow casters for the current pass.
   c. AfterShadowMapPass
3. AfterShadowMap
4. BeforeScreenSpaceMask
5. Unity gathers the shadow map into a screen-space buffer and performs filtering.
6. AfterScreenSpaceMask
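Hooking into these light events works the same way as camera events, via Light.AddCommandBuffer with a LightEvent. A sketch under the assumption that this component sits on a shadow-casting Light; the global texture name `_MyShadowMap` is hypothetical and only for illustration:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

[RequireComponent(typeof(Light))]
public class GrabShadowMap : MonoBehaviour
{
    CommandBuffer buffer;
    Light shadowLight;

    void OnEnable()
    {
        shadowLight = GetComponent<Light>();
        buffer = new CommandBuffer { name = "Grab shadow map" };
        // After Unity finishes rendering this light's shadow map,
        // the shadow map is the currently active render target.
        // Expose it to shaders under a (hypothetical) global name.
        buffer.SetGlobalTexture("_MyShadowMap",
            BuiltinRenderTextureType.CurrentActive);
        shadowLight.AddCommandBuffer(LightEvent.AfterShadowMap, buffer);
    }

    void OnDisable()
    {
        shadowLight.RemoveCommandBuffer(LightEvent.AfterShadowMap, buffer);
        buffer.Release();
    }
}
```

Shaders can then sample `_MyShadowMap` in later passes of the same frame, for example to build custom shadow effects.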

Additional resources

Compute shaders
GPU instancing