
Choosing and configuring a render pipeline and lighting solution

This guide is an updated version of the following Unity blog post: Spotlight Team Best Practices: Setting up the Lighting Pipeline, by Pierre Yves Donzallaz.

Definitions

First, let’s go through the definitions of several important graphics rendering terms that you will encounter frequently in this article.

  • A render pipeline determines how the objects in your scene are displayed, in three main stages.
    • The first stage is culling; it lists the objects that need to be rendered, preferably the ones visible to the camera (frustum culling) and not obscured by other objects (occlusion culling).
    • The second stage, rendering, is the drawing of these objects, with the correct lighting and some of their properties, into pixel-based buffers.
    • Finally, post-processing operations can be carried out on these buffers, for instance applying color grading, bloom and depth of field, to generate the final output frame that is sent to a display device.

These operations are repeated many times a second, depending on the frame rate.

  • A Shader is a generic name for a program, or a collection of programs, running on the Graphics Processing Unit (GPU). For instance, after the culling stage is completed, a Vertex Shader is used to transform the vertex coordinates of the visible objects from “object space” into a different space called “clip space”; these new coordinates are then used by the GPU to rasterize the scene, i.e. convert the vectorial representation of the scene into actual pixels. At a later stage, these pixels will be colored by pixel (or fragment) shaders; the pixel color will generally depend on the material properties of the respective surface and the surrounding lighting. Another common type of shader available on modern hardware is the Compute Shader: it allows programmers to exploit the considerable parallel processing power of GPUs for any kind of mathematical operation, such as light culling, particle physics, or volumetric simulation.
  • Direct lighting refers to lighting which originates from a self-emitting source of light, such as a light bulb, and isn’t the result of light bouncing off a surface. Depending on the size of the light source and its distance to the receiver, such lighting typically produces clear distinct shadows.
    • Direct lighting should not be confused with directional lighting, which is light emitted by an infinitely-distant light source (e.g. the computer-simulated sun). The noticeable properties of a directional light are the ability to cover the entire scene with parallel light rays, and the lack of distance falloff (or light decay); that is, the amount of lighting received does not decay as the distance to the light source increases.
    • In reality, sunlight, like any other source of light, falls off over distance according to the inverse-square law. Simply put, the amount of received light drops quickly as the distance between the receiver and the light source increases. For example, the illuminance on Mercury is almost 7 times higher than on Earth, and Mars receives nearly half of Earth’s sunshine, whereas Pluto enjoys a mere 0.06% (see the sketch after this list). Nevertheless, for most real-time applications with a limited altitude range, the sunlight decay is insignificant. Therefore, a directional light is perfectly adequate to simulate sunlight in most Unity scenes, including large, planet-centric, open worlds.
  • Indirect lighting results from light bouncing off surfaces and being transmitted and scattered through a medium, such as the atmosphere or translucent materials. Under these conditions, occluders generally cast soft or indiscernible shadows.
  • Global illumination (GI) is a group of techniques that model both direct and indirect lighting to provide realistic lighting results. There are several methods for GI, such as baked/dynamic lightmaps, irradiance volumes, light propagation volumes, baked/dynamic light probes, voxel-based GI, and distance field-based GI. Out of the box, Unity supports baked/dynamic lightmaps and light probes.
  • A lightmapper is the underlying system that generates the data for the lightmaps and light probes by shooting light rays, calculating the light bounces, and applying the resulting lighting into textures. Different lightmappers will therefore often produce different lighting looks, as they might rely on different techniques to produce the lighting data.
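To make the inverse-square law mentioned above concrete, here is a minimal C# sketch that computes the relative illuminance of sunlight from the distance to the sun, expressed in astronomical units (Earth = 1 AU). The class and method names are illustrative, and the planetary distances are approximate:

    using UnityEngine;

    // Relative illuminance under the inverse-square law: roughly 1 / d^2,
    // with d in astronomical units (Earth = 1 AU).
    public static class SunlightFalloffExample
    {
        public static float RelativeIlluminance(float distanceInAU)
        {
            return 1f / (distanceInAU * distanceInAU);
        }

        public static void LogExamples()
        {
            Debug.Log(RelativeIlluminance(0.39f));  // Mercury: roughly 6.6x Earth's illuminance
            Debug.Log(RelativeIlluminance(1.52f));  // Mars: roughly 0.43x (nearly half)
            Debug.Log(RelativeIlluminance(39.5f));  // Pluto: roughly 0.0006x (about 0.06%)
        }
    }

In a typical Unity scene, however, this decay is ignored for sunlight, which is why a single directional light with no falloff remains the standard choice.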

Overview

The following flowchart provides a high-level perspective of the entire lighting pipeline in Unity, from the point of view of a content creator.

You start by selecting a render pipeline. Then you decide how the indirect lighting is generated and pick a Global Illumination system accordingly. After you’ve made sure all the global lighting settings are tuned appropriately for your project, you can continue adding Lights, Emissive Surfaces, Reflection Probes, Light Probes, and Light Probe Proxy Volumes (LPPVs). Detailing the usage and features of all these lighting objects is beyond the scope of this article, therefore I encourage you to read the Lighting section of the manual to learn how to utilize them correctly in your projects.

Render pipelines

Until early 2018, only one render pipeline was available in Unity: the Built-In Render Pipeline. This render pipeline offers a choice of rendering paths: forward and deferred.

  • When using the (multi-pass) forward rendering path, all objects in the scene are rendered sequentially, one by one, potentially in multiple passes depending on the number of lights affecting each object; the rendering cost can therefore increase dramatically when objects are lit by multiple lights. This type of renderer commonly offers a wide variety of shaders and can handle transparency easily.
  • When using the deferred rendering path, all the (opaque) geometry is first rendered into buffers that store information about its materials (color, specular, smoothness, etc.). In a later pass (hence “deferred”), each pixel is shaded sequentially: the rendering time will depend mainly on the number of lights affecting each pixel. Transparent objects, and certain objects with complex shaders, will still require additional forward rendering passes. Deferred rendering is usually recommended when dealing with scenes containing many dynamic lights, such as artificially lit interiors or projects with a combination of outdoor and indoor lighting (see the example after this list).
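In the Built-In Render Pipeline, the rendering path can be chosen project-wide in the Graphics settings, or per camera. A minimal sketch of the per-camera option (the component name is illustrative):

    using UnityEngine;

    // Selecting a rendering path per camera in the Built-In Render Pipeline.
    [RequireComponent(typeof(Camera))]
    public class RenderingPathExample : MonoBehaviour
    {
        void Start()
        {
            Camera cam = GetComponent<Camera>();

            // Deferred shading: usually better for scenes with many dynamic lights.
            cam.renderingPath = RenderingPath.DeferredShading;

            // Or fall back to multi-pass forward rendering, e.g. for older hardware:
            // cam.renderingPath = RenderingPath.Forward;
        }
    }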

In January 2018, Unity unveiled the Scriptable Render Pipeline (SRP), which allows you to customize the rendering loop via C# scripting. This is a minor revolution in the realm of game engines: users are finally able to personalize the culling of objects, their drawing, and the post-processing of the frame without having to use a low-level programming language like C++.

Unity currently provides two pre-built SRPs:

  • The High-Definition Render Pipeline (HDRP) is a hybrid deferred/forward tile/cluster renderer. It offers advanced rendering and shading features and is designed for PC and advanced console projects that require a high degree of visual fidelity.

A tile is a small 2-dimensional square pixel section of the frame, and a cluster is a 3-dimensional volume inside the camera frustum. Both the tile and cluster rendering techniques rely on the listing of the lights affecting every single tile and cluster, whose lighting can then be computed in one single pass with the corresponding list of known lights. Opaque objects will most likely be shaded using the tile system, whereas transparent ones will rely on the cluster system. The main advantage is faster processing of the lighting and the considerable reduction in bandwidth consumption compared to the Built-In Render Pipeline (deferred), which depends on much slower multi-pass light accumulation.

  • The Universal Render Pipeline (URP) is a fast single-pass forward renderer; it has been designed primarily for lower-end devices lacking support for compute shader technology, such as older smartphones, tablets and XR devices. However, URP can also deliver higher-quality graphics for mid-range devices such as consoles and PCs, sometimes at a lower performance cost than the Built-In Render Pipeline. Lights are culled per object, which allows the lighting to be computed in a single pass and results in fewer draw calls compared to the Built-In Render Pipeline. Finally, URP also offers a 2D Renderer and a Deferred Renderer.

You can use the following decision chart to quickly find out which render pipeline you should select based on a few critical criteria.

Setup

You can download the latest versions of HDRP and URP via the Unity Package Manager (Window > Package Manager). The easiest way to get started with one of these SRPs is to create a new project with the Unity Hub and use one of the corresponding templates.

If you want to set up your project for HDRP, ensure you have the required package installed. Then use the HD Render Pipeline Wizard (Window > Render Pipeline > HD Render Pipeline Wizard) to set up your project in one click.
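The active render pipeline is ultimately defined by the Render Pipeline Asset assigned in the Graphics settings, optionally overridden per quality level. If you prefer to do this from a script, here is a minimal sketch; “MyPipelineAsset” is an illustrative asset assumed to live in a Resources folder:

    using UnityEngine;
    using UnityEngine.Rendering;

    public static class ActivePipelineExample
    {
        public static void Configure()
        {
            // Null means the Built-In Render Pipeline is currently active.
            Debug.Log(GraphicsSettings.currentRenderPipeline);

            // Assign a URP or HDRP asset as the project-wide default...
            var asset = Resources.Load<RenderPipelineAsset>("MyPipelineAsset");
            GraphicsSettings.renderPipelineAsset = asset;

            // ...or override it for the current quality level only.
            QualitySettings.renderPipeline = asset;
        }
    }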

Extensibility

If you have some rendering knowledge, are familiar with C#, and need to fully tailor the renderer for your Project, you can experiment with the SRP concept to create your own Custom Scriptable Render Pipeline. The Universal Render Pipeline is especially easy to extend, due to its smaller shader library and the ability to inject, remove and swap rendering passes rapidly.
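As a rough idea of what a custom Scriptable Render Pipeline involves, the sketch below clears the screen, then draws opaque geometry and the skybox for each camera. It is deliberately minimal (no lighting, transparency, or post-processing) and only meant to illustrate the RenderPipelineAsset/RenderPipeline pair; the class names are illustrative:

    using UnityEngine;
    using UnityEngine.Rendering;

    [CreateAssetMenu(menuName = "Rendering/Basic Pipeline Asset")]
    public class BasicPipelineAsset : RenderPipelineAsset
    {
        protected override RenderPipeline CreatePipeline() => new BasicPipeline();
    }

    public class BasicPipeline : RenderPipeline
    {
        protected override void Render(ScriptableRenderContext context, Camera[] cameras)
        {
            foreach (var camera in cameras)
            {
                // Culling: list what this camera can see.
                if (!camera.TryGetCullingParameters(out var cullingParams))
                    continue;
                var cullingResults = context.Cull(ref cullingParams);

                context.SetupCameraProperties(camera);

                // Clear color and depth.
                var cmd = new CommandBuffer { name = "Clear" };
                cmd.ClearRenderTarget(true, true, Color.black);
                context.ExecuteCommandBuffer(cmd);
                cmd.Release();

                // Draw opaque renderers that have an unlit SRP pass, then the skybox.
                var sorting = new SortingSettings(camera) { criteria = SortingCriteria.CommonOpaque };
                var drawing = new DrawingSettings(new ShaderTagId("SRPDefaultUnlit"), sorting);
                var filtering = new FilteringSettings(RenderQueueRange.opaque);
                context.DrawRenderers(cullingResults, ref drawing, ref filtering);
                context.DrawSkybox(camera);

                context.Submit();
            }
        }
    }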

Compatibility

Porting your project’s materials from the Built-In Render Pipeline to HDRP or to URP is relatively easy in Unity, thanks to a 1-click material converter under Edit > Render Pipeline > Upgrade…. Note that it is a non-reversible action. Backing up your project beforehand is highly recommended!

Nevertheless, custom shaders will have to be ported by hand, so transitioning from the Built-In Render Pipeline to HDRP or URP during production might be time-consuming, depending on the number of custom shaders you would have to rewrite.

Additionally, because the HDRP is more physically correct than the Built-In Render Pipeline, especially regarding light attenuation and distribution, you should not expect your project to look identical after switching to HDRP.

Furthermore, HDRP and URP are not cross-compatible, as they do not share the same rendering features. Porting your project from HDRP to URP and vice versa is possible, but it is not a 1-click operation and will require manual rework of the lighting, the materials, and the shaders!

Global Illumination systems

The two Global Illumination systems available in Unity are:

  1. Realtime Global Illumination: This system is built on Enlighten, a third-party middleware solution. After a precompute, it enables you to adjust your lighting in real time, provided you do not modify the GameObjects in your scene that have the Contribute GI setting enabled. See the HDRP and URP documentation for compatibility information specific to the Scriptable Render Pipelines. Unless otherwise specified, the Built-In Render Pipeline supports all features described in this article.

  2. Baked Global Illumination: When you use this system, Unity bakes lighting data into textures called lightmaps, as well as into Light Probes and Reflection Probes. There are two lightmappers: Enlighten Baked Global Illumination (deprecated) and the Progressive Lightmapper (CPU or GPU). See the HDRP and URP documentation for compatibility information specific to the Scriptable Render Pipelines. Unless otherwise specified, the Built-In Render Pipeline supports all features described in this article.

The Progressive Lightmapper calculates indirect lighting values using path tracing. It can prioritize precomputing lighting that affects objects visible to the Scene view camera. Although only updating lighting for parts of lightmaps increases the overall bake time, it also enables you to iterate on your lighting design more quickly.
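If you want to select the lightmapper and the Global Illumination system from an Editor script rather than through the Lighting window, the sketch below shows one way to do it. It assumes the open scene already has a LightingSettings asset assigned, and the script must live in an Editor folder:

    using UnityEditor;
    using UnityEngine;

    public static class LightingSetupExample
    {
        [MenuItem("Tools/Configure Baked GI (Progressive GPU)")]
        static void Configure()
        {
            LightingSettings settings = Lightmapping.lightingSettings;

            settings.bakedGI = true;        // enable the Baked Global Illumination system
            settings.realtimeGI = false;    // leave Enlighten Realtime GI disabled
            settings.lightmapper = LightingSettings.Lightmapper.ProgressiveGPU;
            settings.prioritizeView = true; // bake texels visible in the Scene view first

            Lightmapping.BakeAsync();       // start an asynchronous bake
        }
    }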

For more information about lighting features in the Scriptable Render Pipelines, see the Universal Render Pipeline and High Definition Render Pipeline Documentation.

Static versus Dynamic

No matter which Global Illumination system you use, Unity will only consider objects that are marked as “Contribute GI” during the baking/precomputing of the lighting. Dynamic (i.e. non-static) objects have to rely on the Light Probes you placed throughout the scene to receive indirect lighting.

Because the baking/precomputing of the lighting is a relatively slow process, only large and complex assets with distinct lighting variations, such as concavity and self-shadowing, should be tagged as “Contribute GI”. Smaller and convex meshes that receive homogeneous lighting should not be marked as such; they should instead receive indirect lighting from the Light Probes, which store a simpler approximation of the lighting. Larger dynamic objects can rely on LPPVs to receive better localized indirect lighting. Limiting the number of objects tagged as “Contribute GI” in your scene is absolutely crucial to minimize baking times while maintaining adequate lighting quality. You can learn more about this optimization process and the importance of probe lighting in this tutorial.
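The “Contribute GI” flag can also be toggled from an Editor script, which is convenient when batch-processing many assets. A minimal sketch (the method names are illustrative, and the static-flag and receiveGI APIs are Editor-only):

    using UnityEditor;
    using UnityEngine;

    public static class ContributeGIExample
    {
        // Large, complex static meshes: include them in the lightmap bake.
        public static void MarkAsLightmapContributor(GameObject go)
        {
            var flags = GameObjectUtility.GetStaticEditorFlags(go);
            GameObjectUtility.SetStaticEditorFlags(go, flags | StaticEditorFlags.ContributeGI);

            var renderer = go.GetComponent<MeshRenderer>();
            if (renderer != null)
                renderer.receiveGI = ReceiveGI.Lightmaps;
        }

        // Small or convex meshes: keep them out of the bake and light them with probes.
        public static void MarkAsProbeLit(GameObject go)
        {
            var flags = GameObjectUtility.GetStaticEditorFlags(go);
            GameObjectUtility.SetStaticEditorFlags(go, flags & ~StaticEditorFlags.ContributeGI);

            var renderer = go.GetComponent<MeshRenderer>();
            if (renderer != null)
                renderer.lightProbeUsage = UnityEngine.Rendering.LightProbeUsage.BlendProbes;
        }
    }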

Warning

The Unity Editor and Player allow you to use both Enlighten Realtime Global Illumination and baked lighting at the same time.

However, simultaneously enabling these features greatly increases baking time and memory usage at runtime, because they do not use the same data sets. You can expect visual differences between indirect light you have baked and indirect light provided by Enlighten Realtime Global Illumination, regardless of the lightmapper you use for baking. This is because Enlighten Realtime Global Illumination often operates at a significantly different resolution than Unity’s baking backends, and relies on different techniques to simulate indirect lighting.

If you wish to use both Enlighten Realtime Global Illumination and baked lighting at the same time, limit this approach to high-end platforms and/or projects with tightly controlled scenes and predictable costs, and only if you have a very good understanding of all the lighting settings. For most projects, picking just one of the two global illumination systems is the safer strategy.

Light Modes

The Mode property of a Light component is a common source of confusion.

There are three Light Modes available in the Light Inspector:

  1. Baked: The direct and indirect lighting from these lights is baked into lightmaps, which can be a time-consuming process. There is no runtime cost to process these lights; however, applying the resulting lightmaps to the scene does have a minor cost.
  2. Realtime: The direct lighting and shadows from these lights are real-time and therefore not baked into lightmaps. Their runtime cost can be high, depending on the complexity of the scene, the number of shadow casting lights, the number of overlapping lights, etc. Furthermore, if you enable Enlighten Realtime Global Illumination, further performance costs will be incurred to update the indirect lighting at runtime.
  3. Mixed: This is a hybrid mode that offers a mix of baked and real-time features, such as baked indirect lighting and real-time direct lighting. The behavior of all Mixed lights in your Scene and their performance impact depends on the Lighting Mode for that Scene.

It is important to note that the mode of a light is only relevant if the Baked Global Illumination system is enabled. If you do not use any global illumination system, or only use the Enlighten Realtime Global Illumination system, then all Baked and Mixed lights will behave as though their Mode property were set to Realtime.
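For completeness, the Mode property maps to the Light component’s lightmapBakeType, which Unity documents as an Editor-only property. A minimal Editor-oriented sketch (the method names are illustrative):

    using UnityEngine;

    public static class LightModeExample
    {
        // Mixed: indirect lighting is baked, direct lighting and shadows stay real-time.
        public static void MakeMixed(Light light)
        {
            light.lightmapBakeType = LightmapBakeType.Mixed;
        }

        // Baked: direct and indirect lighting end up in lightmaps and Light Probes,
        // so the light has no per-frame shading cost at runtime.
        public static void MakeBaked(Light light)
        {
            light.lightmapBakeType = LightmapBakeType.Baked;
        }
    }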

The following diagram combines a decision flowchart with a comparison table; it can help you decide which light mode is appropriate every time a new light is added into the scene.

Lighting Modes

As you can see in the previous diagram, all Mixed Lights in a Scene have specific baked and real-time capabilities, depending on the Lighting Mode that you picked in the Lighting window.

There are three modes to choose from:

  1. Subtractive
  2. Baked Indirect
  3. Shadowmask

Shadowmask Lighting Mode has two quality settings:

  1. Shadowmask
  2. Distance Shadowmask

To use HDRP’s Shadowmask Lighting Mode, the Shadowmask feature must be enabled in the HDRP Asset assigned in the Graphics settings; it then has to be activated specifically for your camera(s) via the Frame Settings.
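In the Built-In Render Pipeline and URP, the choice between the two shadowmask behaviours is a quality setting, so it can also be switched from a script, for example per platform. A minimal sketch:

    using UnityEngine;

    public static class ShadowmaskExample
    {
        public static void UseDistanceShadowmask()
        {
            // Within the Shadow Distance, shadows are rendered in real time;
            // beyond it, static objects fall back to the baked shadowmask.
            QualitySettings.shadowmaskMode = ShadowmaskMode.DistanceShadowmask;
        }

        public static void UseShadowmask()
        {
            // Static objects always use the baked shadowmask, which is cheaper.
            QualitySettings.shadowmaskMode = ShadowmaskMode.Shadowmask;
        }
    }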

Render pipeline comparison table

See the HDRP and URP documentation for compatibility information specific to scriptable render pipelines. Unless otherwise specified, the Built-In Render Pipeline supports all features described in this article.

Lighting scenarios

Now that we have introduced the render pipelines and the main lighting features, let’s have a look at a few examples of projects and see which settings could be used to light them. Since every project is unique, you might use slightly different options based on your requirements.

1. Prototype or quick previsualization

If you rely heavily on the Asset Store to build your prototype, the Built-In Render Pipeline could be the only suitable render pipeline, as most assets found on the Store are not fully compatible with HDRP and URP; nonetheless, asset compatibility will improve over time. If you are building all the assets from the ground up and already have a clear idea of your project’s requirements, then you could pick one of the two SRPs (i.e. URP or HDRP) or even create a custom one.

When you are in the early stages of (pre-)production and need a quick turnaround and maximum flexibility for the lighting, you might prefer a fully real-time approach that does not require any precomputation; in that case, turn off both Baked Global Illumination and Enlighten Realtime Global Illumination. To alleviate the lack of proper indirect lighting, you can enable Screen Space Ambient Occlusion: it can help ground objects in the scene by providing cheap real-time contact shadows.
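If you want to script this fully real-time setup rather than clicking through the Lighting window, a small Editor-only sketch could look like the following. It assumes the open scene already has a LightingSettings asset assigned:

    using UnityEditor;
    using UnityEngine;

    public static class PrototypeLightingExample
    {
        [MenuItem("Tools/Use Fully Real-time Lighting")]
        static void DisableGlobalIllumination()
        {
            LightingSettings settings = Lightmapping.lightingSettings;
            settings.bakedGI = false;    // no lightmap/probe baking
            settings.realtimeGI = false; // no Enlighten precompute
        }
    }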

2. 3D Mobile strategy game

If you are targeting mobile devices, URP could be a great candidate to ensure solid performance for your game. In many cases it is possible to customize URP to suit your game’s specific needs, with help from a graphics programmer.

The Built-In Render Pipeline and URP both support Shadowmask Lighting Mode which makes it possible for you to bake shadows for static objects while still enabling dynamic objects to cast real-time shadows. If Shadowmasks are too expensive for your project, you can fall back to the cheapest Subtractive mode. Finally, the forward rendering path is probably the best option if you have a very small number of lights in your level(s), and if you’re targeting older hardware.

3. AAA corridor shooter (fixed time of day)

If you are aiming for AAA-quality visuals on PC and consoles for your linear first-person shooter, HDRP should be the preferred render pipeline. Again, with the help of graphics programmers, a custom SRP could also be developed.

If your levels contain many real-time shadow casting lights (e.g. destructible light props and moving lights), then using the Baked Global Illumination system with the Baked Indirect mode should ensure you get great looking indirect lighting from the Mixed directional light and the Baked lightsLight components whose Mode property is set to Baked. Unity pre-calculates the illumination from Baked Lights before runtime, and does not include them in any runtime lighting calculations. More info
See in Glossary
in static light props. If your levels consist of a larger proportion of fixed shadow casting lights, then an approach with Shadowmasks could be recommended because HDRP offers a great hybrid Shadowmask mode which gives you more control over the blend between real-time and baked shadows.

If you also plan to support the Nintendo Switch, then using URP would be recommended, so that you can support most gaming platforms on the market without having to go through the potentially tedious process of porting your project from HDRP to URP, or vice versa.

4. Battle Royale (day-night cycle)

If you plan to release a battle royale game for PC and consoles, that features large-scale environments and fully dynamic lighting, you should select HDRP, or extend it to tailor the rendering pipeline to your project. You could consider URP if you are not aiming for AAA visual fidelity and are targeting mobile devices or systems with lower specifications.

For this particular scenario, if you are using the Built-in Render Pipeline, activating both the Enlighten Realtime Global Illumination and a Baked Global Illumination system is not recommended, because the resulting overhead in terms of performance and scene management for an immense level could be problematic. Another argument against the use of both global illumination systems is the unpredictable nature of such large-scale multiplayer games: performance estimations are for instance more difficult than in a highly-scripted linear level.

Final words

The rendering landscape has changed radically in Unity over the past few years, thanks to the introduction of the Scriptable Render Pipelines. Therefore, keeping up with all these changes and their implications for the lighting pipeline can be exhausting.

Hopefully, this guide and its many illustrations have given you a better understanding of the capabilities of each Render Pipeline so that you can confidently start your projects in Unity with the appropriate rendering and lighting settings!

You can learn more about lighting in Unity and the render pipelines on the following pages:

How to get, set, and configure the active render pipeline
Using the Built-in Render Pipeline