This page contains information on using CommandBuffers in the Built-in Render Pipeline. For information on using CommandBuffers in render pipelines based on the Scriptable Render Pipeline, see Scheduling and executing rendering commands in the Scriptable Render Pipeline.
A CommandBuffer holds a list of rendering commands (such as setting the render target, or drawing a given mesh). You can instruct Unity to schedule and execute those commands at various points in the Built-in Render Pipeline, which allows you to customize and extend Unity’s rendering functionality.
You can execute CommandBuffers immediately using the Graphics.ExecuteCommandBuffer API, or you can schedule them so that they run at a given point in the render pipeline. To schedule them, use the Camera.AddCommandBuffer API with the CameraEvent enum, or the Light.AddCommandBuffer API with the LightEvent enum. To see when Unity executes CommandBuffers that you schedule in this way, see CameraEvent and LightEvent order of execution.
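The following is a minimal sketch of both approaches: it builds a CommandBuffer that draws a mesh, then schedules it on a Camera with a CameraEvent. The component name, field assignments, and the choice of CameraEvent.BeforeImageEffects are illustrative assumptions, not requirements.

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Sketch: attach to a Camera. Draws an extra mesh just before image effects.
// The mesh and material fields are assumed to be assigned in the Inspector.
public class CommandBufferExample : MonoBehaviour
{
    public Mesh mesh;
    public Material material;

    CommandBuffer buffer;

    void OnEnable()
    {
        buffer = new CommandBuffer { name = "Draw extra mesh" };
        buffer.DrawMesh(mesh, Matrix4x4.identity, material);

        // Schedule the buffer so Unity executes it at a fixed point
        // in the Built-in Render Pipeline for this Camera...
        GetComponent<Camera>().AddCommandBuffer(CameraEvent.BeforeImageEffects, buffer);

        // ...or, alternatively, execute it immediately instead:
        // Graphics.ExecuteCommandBuffer(buffer);
    }

    void OnDisable()
    {
        // Remove the scheduled buffer and free its resources.
        GetComponent<Camera>().RemoveCommandBuffer(CameraEvent.BeforeImageEffects, buffer);
        buffer.Release();
    }
}
```

A buffer added with Camera.AddCommandBuffer runs every frame at the chosen event until it is removed, so pairing the add with a RemoveCommandBuffer call in OnDisable keeps the Camera's event list clean.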
For a full list of the commands that you can execute using CommandBuffers, see the CommandBuffer API documentation. Note that some commands are supported only on certain hardware; for example, the commands relating to ray tracing are supported only when using DirectX 12.
The Unity blog post Extending Unity 5 rendering pipeline: Command Buffers introduces CommandBuffers in the Built-in Render Pipeline. It describes how to use CommandBuffers to achieve several different effects, and contains a sample project and example code. The project was created for an older version of Unity, but the principles are the same.
The order of execution for CameraEvents depends on the rendering path that your Project uses.
Deferred rendering path
Forward rendering path
During the “render shadows” stage above, for each shadow-casting Light, Unity performs these steps: