Version: 2020.2

Graphics Methods

What are mobile devices capable of, and how should you plan your game accordingly? If your game runs slowly and the Profiler indicates a rendering bottleneck, how do you know what to change to make your game look good but still run fast? This page is a general, non-technical overview of the methods. If you are looking for specifics, see the Rendering Optimizations page.

What you can reasonably expect to run on current consumer mobiles:

  • Lightmapped static geometry. But beware of:
    • Using a lot of alpha-test shaders
    • Bumpmapping, especially using built-in shaders.
    • High polygon count
  • Animated characters, even with fancy shaders! But beware of:
    • Massive crowds or high-poly characters
  • 2D games with sprites. But beware of:
    • Overdraw, or lots of layers drawn on top of each other.
  • Particle effects. But beware of:
    • High density on large particles (lots of particles drawn on top of each other - another overdraw situation).
    • Ridiculous numbers of particles, or particle colliders.
  • Physics. But beware of:
    • Mesh colliders.
    • Lots of active bodies.

What you CANNOT reasonably expect to run on current consumer mobiles:

  • Full-screen image effects like glow and depth of field.
  • Dynamic per-pixel lighting (multiple lights marked Important and not baked into the lightmap)
    • Every affected object is drawn an additional time for every dynamic light you use, and this gets slow quickly (see the sketch after this list for one way to keep the per-pixel light count down).
  • Real time shadows on everything
    • Unity has support for real time shadows on mobile platforms, but their use must be very judicious, and likely limited to higher-end devices.
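
If you do need a few dynamic lights, the usual approach is to cap how many of them are rendered per pixel. The following is a minimal sketch for the built-in render pipeline; the MobileLightBudget component name and the budget of one pixel light are illustrative assumptions, not settings taken from this guide.

using UnityEngine;

// Minimal sketch (built-in render pipeline): cap per-pixel lights and push
// everything else to cheaper per-vertex lighting. The values are examples only.
public class MobileLightBudget : MonoBehaviour
{
    public int maxPixelLights = 1;

    void Start()
    {
        // Lights beyond this count automatically fall back to per-vertex rendering.
        QualitySettings.pixelLightCount = maxPixelLights;

        // In this sketch, non-directional lights are forced to vertex lighting,
        // which has roughly the same effect as marking them Not Important
        // in the Inspector.
        foreach (Light l in FindObjectsOfType<Light>())
        {
            if (l.type != LightType.Directional)
                l.renderMode = LightRenderMode.ForceVertex;
        }
    }
}

The same cap can also be set without a script via the Pixel Light Count value in the Quality settings, and marking a light Not Important in the Inspector is equivalent to forcing it to vertex lighting here.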

Examples - How top-notch mobile games are made

Shadowgun

Shadowgun is an impressive example of what can be done on current mobile hardware. More specifically, it’s a good example of what cannot be done and of how to get around the limitations, especially because a small part of the game has been made publicly available in this blog post.

Here’s a basic rundown of things that Shadowgun does in order to keep performance up:

  • Dynamic lighting - barely used.
    • Blob shadows and lightmaps are used instead of any real shadows.
    • Lightprobes, instead of real lights, are used on their characters.
      • Muzzle flashes are added into the light probe data via script (see the light probe sketch after this list).
    • The only dynamic per-pixel lighting is an arbitrary light direction used to calculate a BRDF on the characters.
  • Bumpmapping - barely used.
    • Real bumpmapping only used on characters.
    • As much contrast and detail as possible is baked into the diffuse texture maps. Lighting information from bumpmaps is baked in.
    • Good examples are their statue texture and their shiny wall. No bumpmaps are used to render these; the specularity is faked by baking it into the texture. Lightmapping is combined with a vertex-lighting-based specular highlight to give these models a shiny look.
    • If you want to learn how to create textures like this one, check out the Rendering Optimizations page.
  • Dense particles - avoided.
    • UV-scrolling textures are used instead of dense particle effects (see the UV-scrolling sketch after this list).
  • Fog effects - avoided.
    • Their god rays are hand-modeled.
    • Single planes that fade in and out are used to achieve cinematic fog effects without actually rendering any fog.
      • This is faster because the planes are few and far between, and it means that fog doesn’t have to be calculated on every pixel and in every shader.
  • Glow - avoided.
    • Blended sprite planes are used to give the appearance of a glow on certain objects.
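
The muzzle flash trick mentioned above lends itself to a short script. The following is only a sketch of the general idea, not Shadowgun’s actual code: it samples the baked probe at the character’s position, adds a short-lived directional term pointing toward the muzzle, and hands the result to the renderer. The MuzzleFlashProbe name and all tuning values are assumptions for illustration.

using UnityEngine;
using UnityEngine.Rendering;

// Sketch: fake a muzzle flash by injecting extra light into the character's
// light probe data instead of spawning a real dynamic light.
public class MuzzleFlashProbe : MonoBehaviour
{
    public Renderer characterRenderer;  // renderer that should pick up the flash
    public Transform muzzle;            // where the flash comes from
    public Color flashColor = new Color(1f, 0.8f, 0.5f);
    public float flashIntensity = 2f;   // illustrative value
    public float decay = 8f;            // intensity lost per second

    float currentIntensity;
    readonly SphericalHarmonicsL2[] shBuffer = new SphericalHarmonicsL2[1];
    MaterialPropertyBlock block;

    void Start()
    {
        block = new MaterialPropertyBlock();
        // We supply the probe coefficients ourselves each frame.
        characterRenderer.lightProbeUsage = LightProbeUsage.CustomProvided;
    }

    public void Fire()
    {
        currentIntensity = flashIntensity;
    }

    void Update()
    {
        // Start from the baked lighting at the character's position.
        SphericalHarmonicsL2 probe;
        LightProbes.GetInterpolatedProbe(characterRenderer.transform.position,
                                         characterRenderer, out probe);

        if (currentIntensity > 0f)
        {
            // Approximate the flash as a directional light shining from the muzzle.
            Vector3 towardMuzzle =
                (muzzle.position - characterRenderer.transform.position).normalized;
            probe.AddDirectionalLight(towardMuzzle, flashColor, currentIntensity);
            currentIntensity -= decay * Time.deltaTime;
        }

        shBuffer[0] = probe;
        block.CopySHCoefficientArraysFrom(shBuffer);
        characterRenderer.SetPropertyBlock(block);
    }
}

Call Fire() whenever the weapon shoots; the flash then fades out over a few frames without the extra draw calls a real dynamic light would cause.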
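
The UV-scrolling trick is equally simple to sketch. A minimal, hypothetical UVScroller component like the one below offsets a material’s main texture over time, which reads as drifting smoke or haze for a tiny fraction of the cost of a dense particle system.

using UnityEngine;

// Sketch: scroll the main texture of this object's material to fake flowing
// smoke, steam, or energy instead of using a dense particle effect.
public class UVScroller : MonoBehaviour
{
    public Vector2 scrollSpeed = new Vector2(0.1f, 0.05f); // UV units per second

    Renderer rend;

    void Start()
    {
        rend = GetComponent<Renderer>();
    }

    void Update()
    {
        // A single scrolling surface causes far less overdraw than stacked particles.
        rend.material.mainTextureOffset = scrollSpeed * Time.time;
    }
}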

Sky Castle Demo

This demo was designed to show what Unity is capable of on high-end Android devices.

  • Dynamic lighting - not used.
    • Lightmaps only.
  • Bumpmapping - used.
    • The bricks are all bumpmapped, lit by directional lightmaps. This is where the “high-end devices” part comes into play.
  • Real time reflections - limited.
    • They carefully placed their real-time reflecting surfaces separately and in isolated areas, so that only one runs at a time, and the environment that needs to be rendered twice can be easily culled.

Bottom line - What this means for your game

The more you respect and understand the limitations of mobile devices, the better your game will look and the smoother it will run. If you want to make a high-class game for mobile, you will benefit from understanding Unity’s graphics pipeline and being able to write your own shaders. But if you want something you can use right away, Shadowgun’s shaders, available here, are a good place to start.

Don’t Simulate It, Bake It!

There is no question that games attempt to follow the laws of nature. The movement of every parabolic projectile and the color of every pixel of shiny chrome are derived from formulas first written to mimic observations of the real world. But a game is one part scientific simulation and one part painting. You can’t compete in the mobile market with physically accurate rendering; the hardware simply isn’t there yet. If you try to imitate the real world all the way, your game will end up limited, drab, and laggy.

You have to pick up your polygons and your blend modes like they’re paintbrushes.

The baked bumpmaps shown in Shadowgun are great examples of this. There are specular highlights already in the texture - the human eye doesn’t notice that they don’t actually line up with the reflected light and view directions - they are simply high-contrast details on the texture, completely faked, yet they end up looking great. This is a common cheating technique which has been used in many successful games. Compare the visor in the first Halo screenshot ever released with the visor from this release screenshot. It appears that the armor protrusions from the top of the helmet are reflected in the visor, but the reflection is actually baked into the visor texture. In League of Legends, a spell effect appears to have a pixel-light attached to it, but it actually is a blended plane with a texture that was probably generated by taking a screenshot of a pixel light shining on the ground.

What works well:

  • Lightmapped static geometry
    • Dramatic lighting and largely dynamic environments don’t mix. Pick one or the other.
  • Lightprobes for moving objects
    • Current mobile hardware is not really cut out for lots of dynamic lights, and real-time shadows are costly. Lightprobes are a really neat solution for complex game worlds with static lighting.
  • Specialized shaders and detailed, high-contrast textures
    • The shaders in Shadowgun minimize per-pixel calculations and exploit complex, high-quality textures. See the Rendering Optimizations page for information on how to make textures that look great even when the shader is simple.
  • Cartoon Graphics
    • Who says your game has to look like a photo? If you make lighting and atmosphere the responsibility of the texture artist, not the engine, you hardly even have to worry about optimizing rendering.

What does not work:

  • Glow and other post-processing effects
    • Approximate such effects when possible by using blended quads; check out the Shadowgun project for an example of this, and see the billboard sketch after this list.
  • Bumpmapping, especially with the built-in shaders
    • Use it sparingly, only on the most important characters or objects. Anything that can take up the whole screen probably shouldn’t be bumpmapped.
    • Instead of using bump maps, bake more detail and contrast into the diffuse texture. The effect from League of Legends is an interesting example of this being used successfully in the industry.
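
As a sketch of the blended-quad approach mentioned above, the hypothetical GlowBillboard component below keeps a quad facing the camera; give the quad a soft glow texture on an additive or alpha-blended material and parent it to the object that should appear to glow. The component name and setup are illustrative assumptions, not the Shadowgun implementation.

using UnityEngine;

// Sketch: a camera-facing quad with a blended "glow" texture stands in for a
// full-screen glow post-process on the few objects that need a halo.
public class GlowBillboard : MonoBehaviour
{
    void LateUpdate()
    {
        // Match the camera's rotation so the quad always faces the screen plane.
        Camera cam = Camera.main;
        if (cam != null)
            transform.rotation = cam.transform.rotation;
    }
}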

But how do I actually do it?

See the Rendering Optimizations page.
