This page describes how to upgrade from an older version of the Universal Render Pipeline (URP) to URP 16 (Unity 2023.2).
For information on converting assets made for a Built-in Render Pipeline project to assets compatible with URP, refer to the page Render Pipeline Converter.
When you create a custom Volume component class that overrides the VolumeComponent.Override(VolumeComponent state, float interpFactor) method, your implementation must set the VolumeParameter.overrideState property to true whenever the VolumeParameter value changes. This ensures that the Volume framework resets the parameters to their correct default values, which lets the framework use fewer resources every frame and improves performance.
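As an illustration, the following minimal sketch shows a hypothetical custom Volume component (the MyVolumeComponent class and its intensity parameter are placeholders, not part of URP) that sets overrideState whenever it changes a parameter value. It assumes that the state argument is the component that receives the blended result:

public class MyVolumeComponent : VolumeComponent
{
    // Hypothetical parameter, used only to illustrate the pattern.
    public ClampedFloatParameter intensity = new ClampedFloatParameter(0f, 0f, 1f);

    public override void Override(VolumeComponent state, float interpFactor)
    {
        // Assumption in this sketch: "state" holds the blended values used by the volume stack.
        var target = state as MyVolumeComponent;
        if (target == null)
            return;

        if (intensity.overrideState)
        {
            // Blend the parameter toward this component's value.
            target.intensity.value = Mathf.Lerp(target.intensity.value, intensity.value, interpFactor);

            // Set overrideState to true whenever the parameter value is changed,
            // so the Volume framework can reset it to its default value correctly.
            target.intensity.overrideState = true;
        }
    }
}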
The SHADER_QUALITY_LOW/MEDIUM/HIGH and SHADER_HINT_NICE_QUALITY shader defines were removed. If you used those defines in custom shaders, consider using the SHADER_API_MOBILE or SHADER_API_GLES defines to replace SHADER_QUALITY_LOW/MEDIUM/HIGH.
Unity now issues an error when instances of ScriptableRendererFeature attempt to access render targets before they are allocated by the ScriptableRenderer class.
The ScriptableRendererFeature class has a new virtual function, SetupRenderPasses, which is called when render targets are allocated and ready to be used.
If your code uses the ScriptableRenderer.cameraColorTarget or ScriptableRenderer.cameraDepthTarget property inside the AddRenderPasses method override, move that implementation to the ScriptableRendererFeature.SetupRenderPasses method.
Calls to the ScriptableRenderer.EnqueuePass method should still happen in the AddRenderPasses method.
The following example shows how to change the code to use the new API.
Code with the old API:
public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
{
    // The target is used before allocation
    m_CustomPass.Setup(renderer.cameraColorTarget);

    // Letting the renderer know which passes are used before allocation
    renderer.EnqueuePass(m_ScriptablePass);
}
Code with the new API:
public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
{
    // Letting the renderer know which passes are used before allocation
    renderer.EnqueuePass(m_ScriptablePass);
}

public override void SetupRenderPasses(ScriptableRenderer renderer, in RenderingData renderingData)
{
    // The target is used after allocation
    m_CustomPass.Setup(renderer.cameraColorTarget);
}
The Universal Renderer now uses the RTHandle system for its internal targets and internal passes.
All usages of the RenderTargetHandle struct are marked as obsolete, and the struct will be removed in the future.
The public interfaces ScriptableRenderer.cameraColorTarget and ScriptableRenderer.cameraDepthTarget are marked as obsolete. Replace them with ScriptableRenderer.cameraColorTargetHandle and ScriptableRenderer.cameraDepthTargetHandle respectively.
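For example, a SetupRenderPasses implementation like the one above would switch from the obsolete property to the handle-based one. This minimal sketch reuses the m_CustomPass field from the earlier example and assumes its Setup method accepts an RTHandle:

public override void SetupRenderPasses(ScriptableRenderer renderer, in RenderingData renderingData)
{
    // Obsolete:
    // m_CustomPass.Setup(renderer.cameraColorTarget);

    // Use the RTHandle-based property instead:
    m_CustomPass.Setup(renderer.cameraColorTargetHandle);
}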
RTHandle targets do not use the CommandBuffer.GetTemporaryRT method, and they persist for more frames than RenderTargetIdentifier structs. You cannot allocate RTHandle targets with both the GraphicsFormat and the DepthBufferBits properties set to a value other than 0. The cameraDepthTarget property must be separate from the cameraColorTarget property.
The following helper functions let you create and use temporary render targets with the RTHandle system in a similar way to the GetTemporaryRT method:
RenderingUtils.ReAllocateIfNeeded
ShadowUtils.ShadowRTReAllocateIfNeeded
If the render target does not change within the lifetime of the application, use the RTHandles.Alloc method to allocate an RTHandle target. This method is efficient because the code does not have to check whether a render target should be allocated on each frame.
If the render target is a full screen texture, which means that its resolution matches or is a fraction of the screen resolution, use a scaling factor such as Vector2.one to support dynamic scaling.
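For illustration, the following minimal sketch allocates a persistent full screen target once with RTHandles.Alloc. The field name, texture name, and color format are placeholders:

RTHandle m_FullScreenTarget;

void AllocatePersistentTarget()
{
    if (m_FullScreenTarget == null)
    {
        // Vector2.one scales the target with the screen resolution, so it supports dynamic scaling.
        m_FullScreenTarget = RTHandles.Alloc(
            Vector2.one,
            colorFormat: GraphicsFormat.R16G16B16A16_SFloat,
            filterMode: FilterMode.Bilinear,
            name: "_MyFullScreenTarget");
    }
}

void Dispose()
{
    m_FullScreenTarget?.Release();
    m_FullScreenTarget = null;
}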
The following example shows how to change code that uses the RenderTargetHandle API to use the new API.
Code with the old API:
public class CustomPass : ScriptableRenderPass
{
    RenderTargetHandle m_Handle;
    // With the old API, RenderTargetIdentifier might combine color and depth
    RenderTargetIdentifier m_Destination;

    public CustomPass()
    {
        m_Handle.Init("_CustomPassHandle");
    }

    public override void OnCameraSetup(CommandBuffer cmd, ref RenderingData renderingData)
    {
        var desc = renderingData.cameraData.cameraTargetDescriptor;
        cmd.GetTemporaryRT(m_Handle.id, desc, FilterMode.Point);
    }

    public override void OnCameraCleanup(CommandBuffer cmd)
    {
        cmd.ReleaseTemporaryRT(m_Handle.id);
    }

    public void Setup(RenderTargetIdentifier destination)
    {
        m_Destination = destination;
    }

    public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
    {
        CommandBuffer cmd = CommandBufferPool.Get();
        // Set the same target for color and depth
        ScriptableRenderer.SetRenderTarget(cmd, m_Destination, m_Destination, clearFlag, clearColor);
        context.ExecuteCommandBuffer(cmd);
        CommandBufferPool.Release(cmd);
    }
}
Code with the new API:
public class CustomPass : ScriptableRenderPass
{
    RTHandle m_Handle;
    // When using RTHandles, the color and the depth targets must be separate
    RTHandle m_DestinationColor;
    RTHandle m_DestinationDepth;

    void Dispose()
    {
        m_Handle?.Release();
    }

    public override void OnCameraSetup(CommandBuffer cmd, ref RenderingData renderingData)
    {
        var desc = renderingData.cameraData.cameraTargetDescriptor;
        // When using RTHandles, the color and the depth targets must be separate
        desc.depthBufferBits = 0;
        RenderingUtils.ReAllocateIfNeeded(ref m_Handle, desc, FilterMode.Point, TextureWrapMode.Clamp, name: "_CustomPassHandle");
    }

    public override void OnCameraCleanup(CommandBuffer cmd)
    {
        m_DestinationColor = null;
        m_DestinationDepth = null;
    }

    public void Setup(RTHandle destinationColor, RTHandle destinationDepth)
    {
        m_DestinationColor = destinationColor;
        m_DestinationDepth = destinationDepth;
    }

    public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
    {
        CommandBuffer cmd = CommandBufferPool.Get();
        ScriptableRenderer.SetRenderTarget(cmd, m_DestinationColor, m_DestinationDepth, clearFlag, clearColor);
        context.ExecuteCommandBuffer(cmd);
        CommandBufferPool.Release(cmd);
    }
}
The Forward Renderer asset is renamed to the Universal Renderer asset. When you open an existing project containing URP 12 in the Unity Editor, Unity updates the existing Forward Renderer assets to Universal Renderer assets.
The Universal Renderer asset contains the Rendering Path property, which lets you select the Forward or the Deferred Rendering Path.
The ClearFlag.Depth flag no longer implicitly clears the stencil buffer. Use the new ClearFlag.Stencil flag instead.
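For example, a pass that previously relied on ClearFlag.Depth to also clear the stencil buffer now needs to combine the flags explicitly. This minimal sketch reuses the variable names from the examples above:

// Before: ClearFlag.Depth also cleared the stencil buffer implicitly.
// ScriptableRenderer.SetRenderTarget(cmd, m_DestinationColor, m_DestinationDepth, ClearFlag.Depth, clearColor);

// Now: clear the depth and stencil buffers explicitly by combining the flags.
ScriptableRenderer.SetRenderTarget(cmd, m_DestinationColor, m_DestinationDepth, ClearFlag.Depth | ClearFlag.Stencil, clearColor);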
URP 12 and later implements the Render Pipeline Converter feature. This feature replaces the asset upgrade functions that were previously available at Edit > Render Pipeline > Universal Render Pipeline > Upgrade…
The file names of the following Shader Graph shaders were renamed. The new file names do not have spaces:
Autodesk Interactive
Autodesk Interactive Masked
Autodesk Interactive Transparent
If your code uses the Shader.Find() method to search for the shaders, remove the spaces from the shader names, for example, Shader.Find("AutodeskInteractive").
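For instance (a hypothetical lookup; the exact shader name or path depends on your project):

// Before the rename, the shader name contained spaces:
// Shader shader = Shader.Find("Autodesk Interactive");

// After the rename, search for the name without spaces:
Shader shader = Shader.Find("AutodeskInteractive");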
Starting from version 10.0.x, URP can generate a normal texture called _CameraNormalsTexture. To render to this texture in your custom shader, add a Pass with the name DepthNormals. For an example, check the implementation in Lit.shader.
URP 10.0.x implements the Screen Space Ambient Occlusion (SSAO) effect.
If you intend to use the SSAO effect with your custom shaders, consider the following entities related to SSAO:
The _SCREEN_SPACE_OCCLUSION keyword.
Input.hlsl contains the new declaration float2 normalizedScreenSpaceUV in the InputData struct.
Lighting.hlsl contains the AmbientOcclusionFactor struct with the variables for calculating indirect and direct occlusion:
struct AmbientOcclusionFactor
{
    half indirectAmbientOcclusion;
    half directAmbientOcclusion;
};
Lighting.hlsl contains the following function for sampling the SSAO texture:
half SampleAmbientOcclusion(float2 normalizedScreenSpaceUV)
Lighting.hlsl contains the following function:
AmbientOcclusionFactor GetScreenSpaceAmbientOcclusion(float2 normalizedScreenSpaceUV)
To support SSAO in a custom shader, add the DepthNormals Pass and the _SCREEN_SPACE_OCCLUSION keyword to the shader. For an example, check Lit.shader.
If your custom shader implements custom lighting functions, use the function GetScreenSpaceAmbientOcclusion(float2 normalizedScreenSpaceUV) to get the AmbientOcclusionFactor value for your lighting calculations.
In 11.0.x, the formula used to apply Shadow Normal Bias was slightly adjusted to work better with punctual lights. As a result, to exactly match the shadow outlines from earlier revisions, you might need to adjust the parameter in some scenes. Typically, using 1.4 instead of 1.0 for a Directional light is enough.
In previous URP versions, URP performed the rendering via an intermediate texture if the Renderer had any active Renderer Features. On some platforms, this had significant performance implications. In this release, URP mitigates the issue in the following way: URP expects Renderer Features to declare their inputs using the ScriptableRenderPass.ConfigureInput method. The method provides the information that URP uses to determine automatically whether rendering via an intermediate texture is necessary.
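For illustration, a minimal sketch of a render pass that reads the depth and normal textures might declare its inputs in its constructor (the class name is a placeholder):

class MyRenderPass : ScriptableRenderPass
{
    public MyRenderPass()
    {
        renderPassEvent = RenderPassEvent.AfterRenderingOpaques;

        // Declare which inputs this pass reads so URP can automatically decide
        // whether rendering via an intermediate texture is necessary.
        ConfigureInput(ScriptableRenderPassInput.Depth | ScriptableRenderPassInput.Normal);
    }
}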
For compatibility purposes, the Universal Renderer has a new Intermediate Texture property. If you select Always, URP uses an intermediate texture. Selecting Auto enables the new behavior. Use the Always option only if a Renderer Feature does not declare its inputs using the ScriptableRenderPass.ConfigureInput method.
To ensure that existing projects work correctly, all existing Universal Renderer assets that were using any Renderer Features (excluding those included with URP) have the option Always selected in the Intermediate Texture property. Any newly created Universal Renderer assets have the option Auto selected.
Upgrade to URP 7.2.0 first. Refer to Upgrading to version 7.2.0 of the Universal Render Pipeline.
URP 8.x.x does not support the package Post-Processing Stack v2. If your Project uses the package Post-Processing Stack v2, migrate the effects that use that package first.