
Sparse Textures

Sparse textures (also known as “tiled textures” or “mega-textures”) are textures that are too large to fit in graphics memory in their entirety. To handle them, Unity breaks the main texture down into smaller rectangular sections known as “tiles”. Individual tiles can then be loaded as necessary. For example, if the camera can only see a small area of a sparse texture, then only the tiles that are currently visible need to be in memory.

Aside from the tiling, a sparse texture behaves like any other texture in use: shaders can sample it without special modification, and it can have mipmaps, use all texture filtering modes, and so on. If a particular tile cannot be loaded for some reason, the result is undefined; some GPUs show a black area where the missing tile should be, but this behaviour is not standardised.
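For instance, a SparseTexture can be created, configured, and assigned to a material in the same way as a regular texture. The following minimal sketch assumes a 1024 × 1024 RGBA32 texture and a Renderer on the same GameObject; the size, format, and class name are illustrative assumptions, not requirements.

using UnityEngine;

public class SparseTextureSetupSketch : MonoBehaviour
{
    void Start()
    {
        // Illustrative size and format: 1024 x 1024 RGBA32 with a single mip level
        var sparse = new SparseTexture(1024, 1024, TextureFormat.RGBA32, 1, false);

        // Standard texture settings apply to sparse textures as well
        sparse.filterMode = FilterMode.Bilinear;
        sparse.wrapMode = TextureWrapMode.Clamp;

        // Assign it to a material exactly like any other texture
        GetComponent<Renderer>().sharedMaterial.mainTexture = sparse;
    }
}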

Not all hardware and platforms support sparse textures. For example, on DirectX systems they require DX11.2 (Windows 8.1) and a fairly recent GPU. On OpenGL they require ARB_sparse_texture extension support. Sparse textures only support non-compressed texture formats.
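Because support varies across devices, it is safest to query SystemInfo.supportsSparseTextures at runtime before creating a SparseTexture. The sketch below shows one possible check; the fallback of simply disabling the component is an assumption, and a real project might load a conventional texture instead.

using UnityEngine;

public class SparseTextureSupportCheck : MonoBehaviour
{
    void Awake()
    {
        if (!SystemInfo.supportsSparseTextures)
        {
            Debug.LogWarning("Sparse textures are not supported on this platform.");
            // Assumed fallback: disable this component so no sparse texture is created
            enabled = false;
        }
    }
}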

See the SparseTexture script reference page for further details about handling sparse textures from scripts.

Example

The following example loads texture data only for the tile directly under the camera and unloads the previously visible tile. The example only works if your hardware and platform support sparse textures. To use the example:

  1. Create a scene with a large plane GameObject. For more information, refer to Primitive and placeholder objects.
  2. Attach the script to the GameObject.
  3. Enter Play mode.
  4. Change the x and z position of the camera. The Scene view shows the tiles loading and unloading.
using UnityEngine;

public class SparseTextureExample : MonoBehaviour {

    private int tileCountX;
    private int tileCountY;
    private Bounds bounds;
    private SparseTexture texture;
    private Color32[] colorForSingleTile;
    private Vector2 lastTileVisible;

    void Awake()
    {
        if (!SystemInfo.supportsSparseTextures)
        {
            Debug.Log("Sparse textures are not supported on this platform.");
            // Disable the component so the rest of the script does not run without support
            enabled = false;
            return;
        }

        bounds = GetComponent<Renderer>().bounds;
        lastTileVisible = new Vector2(0f, 0f);

        // Destroy any texture left over from a previous run, then create the sparse texture
        DestroyImmediate(texture);
        texture = new SparseTexture(1024, 1024, TextureFormat.RGBA32, 1, false);

        // Set the texture on the material
        GetComponent<Renderer>().sharedMaterial.mainTexture = texture;

        // Get the number of tiles from the sparse texture
        tileCountX = (texture.width + texture.tileWidth - 1) / texture.tileWidth;
        tileCountY = (texture.height + texture.tileHeight - 1) / texture.tileHeight;
        Debug.Log("Tile size: " + texture.tileWidth + " x " + texture.tileHeight + " pixels");
        Debug.Log("Tile count: " + tileCountX + " x " + tileCountY);

        // Fill the color array for a single tile with red
        colorForSingleTile = new Color32[texture.tileWidth * texture.tileHeight];
        for(int i = 0; i < colorForSingleTile.Length; i++)
        {
            colorForSingleTile[i] = new Color32(255, 0, 0, 255);
        }
    }

    void LateUpdate()
    {
        // Get camera position
        Vector3 camPos = Camera.main.transform.position;

        // Get tile index x and y under the camera position
        float px = Mathf.Clamp(Mathf.InverseLerp(bounds.min.x, bounds.max.x, camPos.x), 0.0f, 0.9999f);
        float py = Mathf.Clamp(Mathf.InverseLerp(bounds.min.z, bounds.max.z, camPos.z), 0.0f, 0.9999f);
        int tileX = (int)(px * texture.width / texture.tileWidth);
        int tileY = (int)(py * texture.height / texture.tileHeight);

        // Update the tile with the red color
        // In a real project, you might load the texture from disk or generate it procedurally
        texture.UpdateTile(tileX, tileY, 0, colorForSingleTile);

        // Use the SparseTexture API to unload the texture of the previous tile under the camera
        Vector2 tileVisible = new Vector2(tileX, tileY);
        if(lastTileVisible != tileVisible)
        {           
            Debug.Log("Tile under camera: " + tileX + ", " + tileY);

            texture.UnloadTile((int)lastTileVisible.x, (int)lastTileVisible.y, 0);
            Debug.Log("Unloaded: " + (int)lastTileVisible.x + ", " + (int)lastTileVisible.y);

            lastTileVisible = tileVisible;
        }
    }

    void OnDisable ()
    {
        GetComponent<Renderer>().sharedMaterial.mainTexture = null;
        DestroyImmediate(texture);
    }
}