Legacy Documentation: Version 2017.2





public static bool lightsUseLinearIntensity;


When this is true, Light intensity is multiplied by linear color values; when it is false, gamma color values are used.

Light intensity is multiplied by the linear color value when lightsUseLinearIntensity is enabled. This is physically correct, but it is not the default for 3D projects created with Unity 5.6 and newer: by default, lightsUseLinearIntensity is set to false.

2D projects also have lightsUseLinearIntensity disabled by default. When it is disabled, the gamma color value is multiplied by the intensity. If you want to use lightsUseColorTemperature, lightsUseLinearIntensity must be enabled to ensure physically correct output.

If you enable lightsUseLinearIntensity in an existing project, all of your Lights will need to be tweaked to regain their original look.
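For example, both settings can be enabled from a script at startup. This is a minimal sketch; the class name is illustrative, and you would typically attach it to an object in your first scene:

```
using UnityEngine;
using UnityEngine.Rendering;

public class EnableLinearLightIntensity : MonoBehaviour
{
    void Awake()
    {
        // Multiply Light intensity against linear color values
        // (physically correct behaviour).
        GraphicsSettings.lightsUseLinearIntensity = true;

        // Color temperature requires lightsUseLinearIntensity to be
        // enabled for physically correct output.
        GraphicsSettings.lightsUseColorTemperature = true;
    }
}
```

Note that toggling lightsUseLinearIntensity at runtime changes how every existing Light is evaluated, so intensities authored under the old setting will look different afterwards.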

See Also: GraphicsSettings.lightsUseColorTemperature, Light.colorTemperature.
