
Universal Render Pipeline compatibility in XR

Support for XR features in the Universal Render Pipeline (URP) varies by URP package version. This page details compatibility between XR features in Unity 2020.1 and the latest compatible URP version.

To determine which version of URP is compatible with your current Unity version, see the Requirements and compatibility page in the Universal Render Pipeline documentation.

Unity 2020.1 supports the following AR and VR features in the Universal Render Pipeline:

Feature Supported in XR
Post-processing effects: Bloom Yes
Post-processing effects: Motion Blur No
Post-processing effects: Lens Distortion No
Post-processing effects: Depth of Field Yes
Post-processing effects: Tonemapping Yes
Other post-processing effects (color adjustment, etc.) Yes
GI (Global Illumination) Yes
HDR rendering Yes (1)
MSAA Yes
Physical Camera No
CopyColor / CopyDepth Yes
Multi Display No
Camera Stacking Yes
Cascaded Shadow Yes
sRGB Yes
Skybox Yes
Fog Yes
Billboard Yes
Shader Graph Yes (2)
Particles Yes
Terrain Yes
2D UI (Canvas Renderer, Text Mesh Pro) Yes
URP Debug (Scene View Mode, Frame Debug) Yes (3)

(1) HDR display output is currently unsupported.
(2) Although Shader Graph shaders can run in XR, Shader Graph doesn't currently support the XR utility feature to create SPI-compatible (single-pass instanced) shader input textures. Unity will expand support for Shader Graph functionality in future releases.
(3) Unity supports frame debugging for mock HMDs. Currently, there is no support for Oculus.
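For reference, the following C# sketch is not part of URP or the official documentation; the class name XRRenderSettingsReport is this example's own. It reads the active URP asset and logs the settings behind the HDR rendering and MSAA rows in the table above. Keep in mind that footnote (1) concerns HDR display output, not the asset's HDR setting.

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Example-only helper: logs the active URP asset's HDR and MSAA settings,
// which correspond to the "HDR rendering" and "MSAA" rows in the table above.
public static class XRRenderSettingsReport
{
    public static void Log()
    {
        var urpAsset = GraphicsSettings.currentRenderPipeline as UniversalRenderPipelineAsset;
        if (urpAsset == null)
        {
            Debug.LogWarning("No Universal Render Pipeline asset is active.");
            return;
        }

        // HDR rendering works in XR, but HDR display output does not (footnote 1).
        Debug.Log($"HDR rendering enabled: {urpAsset.supportsHDR}");
        Debug.Log($"MSAA sample count: {urpAsset.msaaSampleCount}");
    }
}
```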

To learn more about post-processing effects, see the Effect list page in the Universal Render Pipeline documentation.
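As a concrete illustration, the following C# sketch is not part of URP or the official documentation; the component name and parameter values are this example's own. It builds a global Volume at runtime using only effects the table above lists as supported in XR (Bloom, Depth of Field, Tonemapping) and enables post-processing on the main camera.

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Illustrative component (not part of URP): configures a global Volume at runtime
// using only post-processing effects the compatibility table lists as supported in XR.
public class XRCompatiblePostProcessing : MonoBehaviour
{
    void Start()
    {
        // Create a global Volume with an empty profile.
        var volume = gameObject.AddComponent<Volume>();
        volume.isGlobal = true;
        volume.profile = ScriptableObject.CreateInstance<VolumeProfile>();

        // Bloom: supported in XR.
        var bloom = volume.profile.Add<Bloom>();
        bloom.intensity.Override(0.8f);

        // Depth of Field: supported in XR.
        var depthOfField = volume.profile.Add<DepthOfField>();
        depthOfField.mode.Override(DepthOfFieldMode.Bokeh);
        depthOfField.focusDistance.Override(10f);

        // Tonemapping: supported in XR.
        var tonemapping = volume.profile.Add<Tonemapping>();
        tonemapping.mode.Override(TonemappingMode.ACES);

        // Motion Blur and Lens Distortion are intentionally omitted:
        // the table above marks them as unsupported in XR.

        // Post-processing must also be enabled on the camera rendering to the headset.
        var mainCamera = Camera.main;
        if (mainCamera != null)
        {
            var cameraData = mainCamera.GetComponent<UniversalAdditionalCameraData>();
            if (cameraData != null)
            {
                cameraData.renderPostProcessing = true;
            }
        }
    }
}
```

Effects the table marks as unsupported, such as Motion Blur and Lens Distortion, are deliberately left out of the profile because they may not render correctly in XR.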
