Version: 2017.3

GraphicsSettings.lightsUseLinearIntensity


public static bool lightsUseLinearIntensity;

Description

If this is true, Light intensity is multiplied by linear color values. If it is false, gamma color values are used.

When lightsUseLinearIntensity is enabled, the light intensity is multiplied by the linear color value. This is physically correct, but it is not the default, even for 3D projects created with Unity 5.6 and newer: lightsUseLinearIntensity is set to false by default.

2D projects also have lightsUseLinearIntensity disabled by default. When it is disabled, the gamma color value is multiplied by the intensity. If you want to use lightsUseColorTemperature, you must enable lightsUseLinearIntensity to ensure physically correct output.
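For example, a minimal sketch of a startup script that switches both settings on; the class name LinearLightingSetup is illustrative and not part of the Unity API:

using UnityEngine;
using UnityEngine.Rendering;

public class LinearLightingSetup : MonoBehaviour
{
    void Awake()
    {
        // Multiply light intensity by the linear color value (physically correct).
        GraphicsSettings.lightsUseLinearIntensity = true;

        // Color temperature requires linear intensity for physically correct output.
        GraphicsSettings.lightsUseColorTemperature = true;
    }
}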

If you enable lightsUseLinearIntensity on an existing project, all your Lights will need to be tweaked to regain their original look.
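A hypothetical helper along these lines can list the Lights that need review after the switch. It makes no attempt at automatic conversion, and the name LightAudit is not part of the Unity API:

using UnityEngine;

public static class LightAudit
{
    // Log every Light in the loaded scenes so its intensity can be re-tuned by hand.
    public static void LogLightsToReview()
    {
        foreach (Light light in Object.FindObjectsOfType<Light>())
        {
            Debug.Log("Review intensity of '" + light.name + "' (current: " + light.intensity + ")", light);
        }
    }
}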

See Also: GraphicsSettings.lightsUseColorTemperature, Light.colorTemperature.
