Version: 2020.1
Believable visuals: dynamic lighting

Setting up the Rendering Pipeline and Lighting in Unity

This guide is an updated version of the following Unity blog post: Spotlight Team Best Practices: Setting up the Lighting Pipeline - Pierre Yves Donzallaz.


First, let’s go through the definitions of several important graphics rendering terms that you will encounter frequently in this article.

  • A render pipeline determines how the objects in your scene are displayed, in three main stages.
    • The first stage is culling; it lists the objects that need to be rendered, ideally only those visible to the camera (frustum culling) and not occluded by other objects (occlusion culling).
    • The second stage, rendering, is the drawing of these objects, with the correct lighting and some of their properties, into pixel-based buffers.
    • Finally, post-processing operations can be carried out on these buffers, for instance applying color grading, bloom and depth of field, to generate the final output frame that is sent to a display device.

These operations are repeated many times a second, depending on the frame rate.
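The three stages above can be sketched as a conceptual per-frame loop. This is illustrative Python, not Unity's actual renderer; every name in it (`render_frame`, `shade`, the dictionary fields) is hypothetical:

```python
def shade(obj, lights):
    # Toy lighting: sum of light intensities scaled by the object's albedo.
    return obj["albedo"] * sum(l["intensity"] for l in lights)

def render_frame(scene_objects, camera, lights):
    """Conceptual sketch of one frame: cull, render, post-process."""
    # 1. Culling: keep only the objects the camera's frustum test accepts
    #    (occlusion culling would further remove hidden objects).
    visible = [obj for obj in scene_objects if camera["frustum"](obj)]

    # 2. Rendering: draw each visible object, with lighting, into a pixel buffer.
    framebuffer = [(obj["name"], shade(obj, lights)) for obj in visible]

    # 3. Post-processing: operate on the finished buffer
    #    (here, a trivial clamp standing in for color grading, bloom, etc.).
    return [(name, min(color, 1.0)) for name, color in framebuffer]
```

A real engine performs these stages on the GPU with far more sophistication, but the ordering (cull, then draw, then post-process the buffers) is the same.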

  • A Shader is a generic name for a program, or a collection of programs, running on the Graphics Processing Unit (GPU). For instance, after the culling stage is completed, a Vertex Shader is used to transform the vertex coordinates of the visible objects from “object space” into a different space called “clip space”; these new coordinates are then used by the GPU to rasterize the scene, i.e. convert the vectorial representation of the scene into actual pixels. At a later stage, these pixels will be colored by pixel (or fragment) shaders; the pixel color will generally depend on the material properties of the respective surface and the surrounding lighting. Another common type of shader available on modern hardware is Compute Shaders: they allow programmers to exploit the considerable parallel processing power of GPUs for any kind of mathematical operations, such as light culling, particle physics, or volumetric simulation.
  • Direct lighting refers to lighting which originates from a self-emitting source of light, such as a light bulb, and isn’t the result of light bouncing off a surface. Depending on the size of the light source and its distance to the receiver, such lighting typically produces clear distinct shadows.
    • Direct lighting should not be confused with directional lighting, which is light emitted by an infinitely-distant light source (e.g. the computer-simulated sun). The noticeable properties of a directional light are the ability to cover the entire scene with parallel light rays, and the lack of distance falloff (or light decay); that is, the amount of lighting received does not decay as the distance to the light source increases.
    • In reality, sunlight, like any other light, falls off over distance according to the inverse-square law. Simply put, the amount of received light drops very quickly as the distance between the receiver and the light source increases. For example, the illuminance on Mercury is almost 7 times higher than on Earth, Mars receives nearly half of Earth’s sunshine, whereas Pluto enjoys a mere 0.06%. Nevertheless, for most real-time applications with a very limited altitude range, sunlight decay is insignificant. Therefore, a directional light is perfectly adequate to simulate sunlight in most Unity scenes, including large, planet-centric, open worlds.
  • Indirect lighting results from light bouncing off surfaces and being transmitted and scattered through a medium, such as the atmosphere or translucent materials. Under these conditions, occluders generally cast soft or indiscernible shadows.
  • Global illumination (GI) is a group of techniques that model both direct and indirect lighting to provide realistic lighting results. There are several methods for GI, such as baked/dynamic lightmaps, irradiance volumes, light propagation volumes, baked/dynamic light probes, voxel-based GI, and distance field-based GI. Out of the box, Unity supports baked/dynamic lightmaps and light probes.
  • A lightmapper is the underlying system that generates the data for the lightmaps and light probes by shooting light rays, calculating the light bounces, and applying the resulting lighting into textures. Different lightmappers will therefore often produce different lighting looks, as they might rely on different techniques to produce the lighting data.
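The inverse-square falloff mentioned under direct lighting is easy to check numerically; the planetary figures quoted above follow directly from it. The distances below are approximate mean orbital radii in astronomical units (illustrative Python, not Unity code):

```python
def relative_illuminance(distance_au):
    """Illuminance relative to Earth's (at 1 AU), per the inverse-square law."""
    return 1.0 / distance_au ** 2

# Approximate mean distances from the Sun, in astronomical units:
print(relative_illuminance(0.387))  # Mercury: almost 7x Earth's sunlight
print(relative_illuminance(1.524))  # Mars: nearly half of Earth's
print(relative_illuminance(39.5))   # Pluto: roughly 0.06%
```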


The following flowchart offers a high-level view of the entire lighting pipeline in Unity, from the point of view of a content creator.

You first select a render pipeline, then decide how the indirect lighting should be generated and pick a Global Illumination (GI) system accordingly. After making sure all the global lighting settings are tuned appropriately for your project, you can add Lights, Emissive Surfaces, Reflection Probes, Light Probes, and Light Probe Proxy Volumes (LPPVs). The detailed usage and features of all these lighting objects are beyond the scope of this page, so we recommend you read the Lighting section of the manual to learn how to use them properly in your project.


Until early 2018, only one render pipeline was available in Unity: the Built-In Render Pipeline. This render pipeline offers a choice of two rendering paths: forward and deferred.

  • When using the (multi-pass) forward rendering path, all objects in the scene are rendered sequentially, potentially in multiple passes, depending on the number of lights affecting each object; the rendering cost can therefore increase dramatically when objects are lit by multiple lights. This type of renderer commonly offers a wide variety of shaders and handles transparency easily.
  • When using the deferred rendering path, all the (opaque) geometries are first rendered into buffers that store information about their materials (color, specular, smoothness, etc.). In a later pass (hence “deferred”), each pixel is shaded sequentially: the rendering time will depend mainly on the number of lights affecting each pixel. The transparent objects, and certain objects with complex shaders, will still require additional forward rendering passes. Deferred rendering is usually recommended when dealing with scenes containing many dynamic lights, such as artificially lit interiors or projects with a combination of outdoor and indoor lighting.
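As a rough, illustrative cost model (the functions and numbers below are hypothetical, not profiler data): multi-pass forward work grows with objects × affecting lights, while deferred pays a fixed G-buffer pass and then resolves lighting per pixel:

```python
def forward_passes(objects, lights_per_object):
    """Multi-pass forward: each object may be re-drawn once per affecting light."""
    return sum(max(1, lights_per_object[o]) for o in range(objects))

def deferred_passes(objects):
    """Deferred: every opaque object is drawn once into the G-buffer;
    lighting is then resolved in a screen-space pass, independent of object count."""
    gbuffer_pass = objects
    lighting_pass = 1
    return gbuffer_pass + lighting_pass

# 100 objects, each lit by 4 lights:
print(forward_passes(100, [4] * 100))  # 400 draw passes
print(deferred_passes(100))            # 101 passes; lighting cost moves to per-pixel work
```

This is why deferred rendering tends to win once many dynamic lights overlap on screen, while forward remains attractive for scenes with few lights.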

In January 2018, Unity unveiled the Scriptable Render Pipeline (SRP), which allows you to customize the rendering loop via C# scripting. This is actually a minor revolution in the realm of game engines: users are finally able to personalize the culling of objects, their drawing, and the post-processing of the frame without having to use a low-level programming language like C++.

Unity currently provides two pre-built SRPs:

  • The High-Definition Render Pipeline (HDRP) is a hybrid deferred/forward tile/cluster renderer. It offers advanced rendering and shading features and is designed for PC and advanced console projects that require a high degree of visual fidelity.

A tile is a small 2-dimensional square pixel section of the frame, and a cluster is a 3-dimensional volume inside the camera frustum. Both the tile and cluster rendering techniques rely on the listing of the lights affecting every single tile and cluster, whose lighting can then be computed in one single pass with the corresponding list of known lights. Opaque objects will most likely be shaded using the tile system, whereas transparent ones will rely on the cluster system. The main advantage is faster processing of the lighting and the considerable reduction in bandwidth consumption compared to the Built-In Render Pipeline (deferred), which depends on much slower multi-pass light accumulation.

  • The Universal Render Pipeline (URP) is a fast single-pass forward renderer; it has been designed primarily for lower-end devices lacking support for compute shader technology, such as older smartphones, tablets and XR devices. However, URP can also deliver higher-quality graphics for midrange devices such as consoles and PC, sometimes for a lower performance cost than the Built-In Render Pipeline. The lights are culled per-object and allow for the lighting to be computed in one single pass, which results in reduced draw calls compared to the Built-In Render Pipeline. Finally, URP also offers a 2D Renderer, and a Deferred renderer is planned.
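The tile-based light culling described for HDRP above can be sketched as follows. This is a simplified 2D illustration in Python (hypothetical names, no depth information or clustering), not HDRP's actual implementation:

```python
def build_tile_light_lists(lights, tiles_x, tiles_y, tile_size):
    """Assign each light to the screen tiles its bounding circle overlaps."""
    tile_lights = {(tx, ty): [] for ty in range(tiles_y) for tx in range(tiles_x)}
    for light in lights:
        lx, ly, radius = light["x"], light["y"], light["radius"]
        # Range of tiles touched by the light's screen-space bounds:
        tx0 = max(0, int((lx - radius) // tile_size))
        tx1 = min(tiles_x - 1, int((lx + radius) // tile_size))
        ty0 = max(0, int((ly - radius) // tile_size))
        ty1 = min(tiles_y - 1, int((ly + radius) // tile_size))
        for ty in range(ty0, ty1 + 1):
            for tx in range(tx0, tx1 + 1):
                tile_lights[(tx, ty)].append(light["name"])
    return tile_lights

# Two lights on a 4x4 grid of 16-pixel tiles:
lights = [{"name": "lamp", "x": 8, "y": 8, "radius": 4},
          {"name": "torch", "x": 40, "y": 40, "radius": 20}]
tiles = build_tile_light_lists(lights, 4, 4, 16)
# Each tile is then shaded in a single pass using only its own light list.
```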

You can use the following decision chart to quickly find out which render pipeline you should select based on a few critical criteria.


You can download the latest versions of HDRP and URP via the Unity Package Manager (Window > Package Manager). The easiest way to get started with one of these SRPs is to create a new project with the Unity Hub and use one of the corresponding templates.

If you want to set up your project for HDRP, ensure you have the required package installed. Then use the HD Render Pipeline Wizard (Window > Render Pipeline > HD Render Pipeline Wizard) to set up your project in one click.


If you have some rendering knowledge, are familiar with C#, and need to fully tailor the renderer for your Project, you can experiment with the SRP concept to create your own Custom Scriptable Render Pipeline. The Universal Render Pipeline is especially easy to extend, due to its smaller shader library and the ability to inject, remove and swap rendering passes rapidly.


Porting your project’s materials from the Built-In Render Pipeline to HDRP or to URP is relatively easy in Unity, thanks to a 1-click material converter under Edit > Render Pipeline > Upgrade…. Note that it is a non-reversible action. Backing up your project beforehand is highly recommended!

Nevertheless, custom shaders will have to be ported by hand, so transitioning from the Built-In Render Pipeline to HDRP or URP during production might be time-consuming, depending on the number of custom shaders you would have to rewrite.

Additionally, because the HDRP is more physically correct than the Built-In Render Pipeline, especially regarding light attenuation and distribution, you should not expect your project to look identical after switching to HDRP.

Furthermore, HDRP and URP are not cross-compatible, as they do not share the same rendering features. Porting your project from HDRP to URP and vice versa is possible, but it is not a 1-click operation and will require manual rework of the lighting, the materials, and the shaders!


If you want to include indirect lighting in your Scene, you must use one of Unity’s two Global Illumination systems, or generate it using your own baking solution. The two systems available in Unity, under Window > Rendering > Lighting, are:

  1. Realtime Global Illumination: This system relies entirely on Enlighten, a third-party lighting middleware. During the precomputation in Unity, Enlighten goes through two lengthy stages, among others: Clustering and Light Transport. The first consists of simplifying the scene into a collection of surface patches called clusters; the second calculates the visibility between these clusters. This precomputed data is used at runtime to generate the indirect lighting interactively. The strength of Enlighten lies in its ability to update the lighting in real time, because the precomputed data describes the relationships between clusters rather than a fixed lighting result. However, as with other traditional lightmapping techniques, editing the static geometry in your scene will trigger a new precomputation. Enlighten is in the process of being removed from Unity, and a new solution is being researched.
    • HDRP does not support Realtime Global Illumination for new projects in Unity 2019.3 and beyond. Nonetheless, projects created prior to Unity 2019.3 can still be upgraded to 2019.3 or 2019 LTS.
    • URP has never supported Realtime Global Illumination using Enlighten.
    • The Built-in Render Pipeline will support Realtime Global Illumination using Enlighten until Unity 2020 LTS (end of 2020 or early 2021). This means critical bug fixes will continue for this version until the end of 2022 or early 2023.

To summarize, if you are starting a new project in Unity 2019.3 or later, Enlighten will not be available if you use URP or HDRP. If you choose the Built-In Render Pipeline, Enlighten will remain available until the end of 2020/early 2021.

  2. Baked Global Illumination: The lighting is baked into textures called lightmaps, and into Light Probes. The Baked GI system can use one of the following lightmappers:

The Progressive Lightmapper can prioritize the computation of the lighting for objects visible to the camera and greatly speed up the iteration on the lighting, at the cost of increasing the overall baking time for the entire scene. The Progressive Lightmapper uses the CPU to calculate the indirect lighting using path tracing. A new GPU Progressive Lightmapper is currently in preview, and will radically reduce the baking time for your scenes.

Because both Enlighten and the Progressive Lightmapper use different methods to produce the baked lighting, you should not expect the resulting lighting to match exactly when comparing them.

Have a look at the diagram below to decide which Global Illumination system is recommended for your project, as well as its main advantages and disadvantages.

Static vs. Dynamic

No matter which Global Illumination system you use, Unity will only consider objects that are marked as “Contribute GI” during the baking/precomputing of the lighting. Dynamic (i.e. non-static) objects have to rely on the Light Probes you placed throughout the scene to receive indirect lighting.

Because the baking/precomputing of the lighting is a relatively slow process, only large and complex assets with distinct lighting variations, such as concavity and self-shadowing, should be tagged as “Contribute GI”. Smaller and convex meshes that receive homogeneous lighting should not be marked as such, and they should, therefore, receive indirect lighting from the Light Probes which store a simpler approximation of the lighting. Larger dynamic objects can rely on LPPVs, in order to receive better localized indirect lighting. Limiting the number of objects tagged as “Contribute GI” in your scene is absolutely crucial to minimize baking times while maintaining an adequate lighting quality. You can learn more about this optimization process and the importance of Probe lighting in this tutorial.
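The tagging guideline above can be summarized as a toy heuristic. This is purely illustrative: the function, its threshold, and its parameters are hypothetical, and Unity makes no such decision automatically — the choice is yours in the editor:

```python
def should_contribute_gi(is_static, bounds_size, is_convex):
    """Illustrative rule of thumb (not a Unity API): tag only large, static,
    non-convex meshes as "Contribute GI"; probe-light everything else."""
    LARGE_ENOUGH = 5.0  # hypothetical world-space size threshold, in meters
    return is_static and not is_convex and bounds_size >= LARGE_ENOUGH

print(should_contribute_gi(True, 12.0, False))  # large concave building: lightmap it
print(should_contribute_gi(True, 0.5, True))    # small convex prop: use Light Probes
```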


In Unity, you can have both the Baked and Realtime GI systems active at the same time, which gives you access to all the lighting features. However, keep in mind that enabling both systems significantly increases baking times and runtime memory usage, because the two systems do not rely on the same data sets. Moreover, updating the indirect lighting interactively at runtime puts pressure on the CPU. Visually comparing the indirect lighting produced by the Baked and Realtime GI systems may also reveal discrepancies, because the two systems use different techniques to simulate indirect lighting and often operate at noticeably different resolutions.

Using both GI systems should therefore be restricted to high-end platforms and to projects with predictable costs and tightly controlled scenes. Because managing both systems adds significant complexity, this approach is only suitable for expert users with a deep understanding of all the lighting settings. Consequently, picking one of the two GI systems is usually the safer approach for most projects; using both systems at once is rarely recommended.

Light Modes

The Mode property of a Light component is a common source of confusion.

There are three Light Modes available in the Light Inspector:

  1. Baked: The direct and indirect lighting from these lights is baked into lightmaps, which can be a time-consuming process. There is no runtime cost to process these lights; however, applying the resulting lightmaps to the scene does have a minor cost.
  2. Realtime: The direct lighting and shadows from these lights are real-time and therefore not baked into lightmaps. Their runtime cost can be high, depending on the complexity of the scene, the number of shadow casting lights, the number of overlapping lights, etc. Furthermore, if you enable Realtime Global Illumination, further performance costs will be incurred to update the indirect lighting at runtime.
  3. Mixed: This is a hybrid mode that offers a mix of baked and real-time features, such as baked indirect lighting and real-time direct lighting. The behavior of all Mixed lights in your Scene and their performance impact depends on the Lighting Mode for that Scene.

It is very important to note that the mode of a light is only relevant if the Baked Global Illumination system is enabled. If you do not use any GI system or only use the Realtime GI system, then all Baked and Mixed lights will behave as though their Mode property was set to Realtime.
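The rule above can be expressed as a small helper. This is illustrative Python, not a Unity API; the function name and string values are hypothetical:

```python
def effective_light_behavior(light_mode, baked_gi_enabled):
    """Which behavior a Light's Mode actually produces: Baked and Mixed
    modes only take effect when the Baked GI system is enabled."""
    if not baked_gi_enabled and light_mode in ("Baked", "Mixed"):
        return "Realtime"
    return light_mode

print(effective_light_behavior("Mixed", baked_gi_enabled=False))  # behaves as Realtime
print(effective_light_behavior("Baked", baked_gi_enabled=True))   # behaves as Baked
```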


Lighting Modes

As you can see in the previous diagram, all Mixed Lights in a Scene have specific baked and real-time capabilities, depending on the Lighting Mode that you picked in the Lighting window.

There are three modes to choose from:

  1. Subtractive
  2. Baked Indirect
  3. Shadowmask

Shadowmask Lighting Mode has two quality settings:

  1. Shadowmask
  2. Distance Shadowmask

To use the Shadowmask Lighting Mode in HDRP, the Shadowmask feature must be enabled in the HDRP Asset assigned in the Graphics settings; it then has to be activated specifically for your camera(s) via the Frame Settings.

Render pipeline comparison table

The following table gives you a general overview of the features supported by each Render Pipeline in Unity 2019.3.



1. Prototype or quick previsualization

If you rely heavily on the Asset Store to build your prototype, the Built-In Render Pipeline could be the only suitable render pipeline, as most assets found on the Store are not fully compatible with HDRP and URP; nonetheless, asset compatibility will improve over time. If you are building all the assets from the ground up and already have a clear idea of your project’s requirements, then you could pick one of the two SRPs (i.e. URP or HDRP) or even create a custom one.

When you are in the early stage of (pre-)production and need a quick turnaround and maximum flexibility for the lighting, you might prefer a fully real-time approach that does not require any precomputation; in that case, you might want to turn off both the Baked and Realtime GI systems. To alleviate the lack of proper indirect lighting, you can enable Screen Space Ambient Occlusion: it can help ground objects in the scene by providing cheap real-time contact shadows.

2. 3D mobile strategy game

If you are targeting mobile devices, URP could be a great candidate to ensure solid performance for your strategy game. If the rendering pipeline needs to be customized to better suit your game, a graphics programmer will probably find extending URP straightforward. If you pick URP and use Baked Global Illumination, be aware that at the moment, the Shadowmask Mixed Lighting Mode is not supported.

Alternatively, if you decide to stick to the Built-In Render Pipeline because, for example, you rely on many assets from the Asset Store, all Mixed Lighting modes are supported. In this case, an approach with the Shadowmask Lighting Mode will provide baked shadows while still allowing dynamic objects to cast real-time shadows. If Shadowmasks are too expensive for your project, you can fall back to the cheapest Subtractive mode. Finally, the forward rendering path is probably the best option if you have a very small number of lights in your level(s), and if you’re targeting older hardware.

3. High-fidelity corridor shooter (fixed time of day)

If you are aiming for AAA-quality visuals on PC and consoles for your linear first-person shooter, HDRP should be the preferred render pipeline. Again, with the help of graphics programmers, a custom SRP could also be developed.

If your levels contain many real-time shadow casting lights (e.g. destructible light props and moving lights), then using the Baked GI system with the Baked Indirect mode should ensure you get great looking indirect lighting from the Mixed directional light and the Baked lights in static light props. If your levels consist of a larger proportion of fixed shadow casting lights, then an approach with Shadowmasks could be recommended because HDRP offers a great hybrid Shadowmask mode which gives you more control over the blend between real-time and baked shadows.

If you also plan to support the Nintendo Switch, then using URP would be recommended, so that you can support most gaming platforms on the market without having to go through the potentially tedious process of porting your project from HDRP to URP, or vice versa.

4. Battle Royale (day-night cycle)

If you plan to release a battle royale game for PC and consoles, that features large-scale environments and fully dynamic lighting, you should select HDRP, or extend it to tailor the rendering pipeline to your project. You could consider URP if you are not aiming for AAA visual fidelity and are targeting mobile devices or systems with lower specifications.

Because HDRP and URP do not support the Realtime Global Illumination system (Enlighten), the day-night cycle would have to be handled with the Baked Indirect mode and a custom script that, for instance, modulates the sun and indirect light intensity throughout the day.
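Such a script could, very roughly, derive intensities from the time of day. The sketch below is illustrative Python with hypothetical names and values, standing in for the Unity C# script you would actually write:

```python
import math

def sun_intensity(hour, max_intensity=1.0):
    """Toy day-night curve: full intensity at noon, zero at night."""
    # Sun elevation modeled as a cosine of the offset from noon (12:00).
    elevation = math.cos((hour - 12.0) / 24.0 * 2.0 * math.pi)
    return max(0.0, elevation) * max_intensity

def indirect_intensity(hour, indirect_scale=0.3):
    """Indirect contribution modulated in step with the sun, as suggested above."""
    return sun_intensity(hour) * indirect_scale

print(sun_intensity(12.0))  # noon: full intensity
print(sun_intensity(0.0))   # midnight: no direct sunlight
```

In a real project the same curve would drive the directional light's intensity, rotation, and color temperature each frame.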

For this particular scenario, if you are using the Built-in Render Pipeline, activating both the Realtime GI and the Baked GI systems is not recommended, because the resulting overhead in terms of performance and scene management for an immense level could be critical. Another argument against the use of both GI systems is the unpredictable nature of such large-scale multiplayer games: performance estimations are for instance more difficult than in a highly-scripted linear level.


The rendering landscape has changed radically in Unity over the past few years, thanks to the introduction of the Scriptable Render Pipelines. Therefore, keeping up with all these changes and their implications for the lighting pipeline can be exhausting.

Hopefully, this guide and its many illustrations have given you a better understanding of the capabilities of each Render Pipeline so that you can confidently start your projects in Unity with the appropriate rendering and lighting settings!

You can learn more about the lighting in Unity and the rendering pipelines with the following pages:

Believable visuals: dynamic lighting