    Face tracking samples

Face tracking samples demonstrate AR Foundation face tracking features. You can open these samples in Unity from the Assets/Scenes/FaceTracking folder.

    Tip

To check which platforms support AR Foundation face tracking features, refer to the Face tracking platform support page.

    To understand each of the face tracking sample scenes, refer to the following sections:

• Face pose: Draws an axis at the detected face's pose.
• Face mesh: Instantiates and updates a mesh representing the detected face.
• Face regions (ARCore): Demonstrates ARCore face regions.
• Blend shapes (ARKit): Implements ARKit blend shapes.
• Eye lasers, eye poses, and fixation point (ARKit): Demonstrates eye and fixation point tracking on ARKit.
• Rear camera (ARKit): Uses face tracking while the world-facing (rear) camera is active.

    ARKit requirements

    Face tracking supports devices with Apple Neural Engine in iOS 14 and newer, and iPadOS 14 and newer. Devices with iOS 13 and earlier, and iPadOS 13 and earlier, require a TrueDepth camera for face tracking. Refer to Apple's Tracking and Visualizing Faces documentation for more information.

    Note

    To check whether a device has an Apple Neural Engine or a TrueDepth camera, refer to Apple's Tech Specs.

    Face pose scene

    The Face Pose scene is the simplest face tracking sample. This sample draws an axis at the detected face's pose.

    This sample uses the front-facing (selfie) camera.
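As a minimal sketch of reading face poses from the ARFaceManager's trackables collection (FacePoseLogger is a hypothetical name for illustration, not a script in the sample):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical illustration, not part of the sample: logs each tracked
// face's pose every frame. Attach to the GameObject that holds the
// ARFaceManager (the XR Origin in a typical scene).
[RequireComponent(typeof(ARFaceManager))]
public class FacePoseLogger : MonoBehaviour
{
    ARFaceManager m_FaceManager;

    void Awake()
    {
        m_FaceManager = GetComponent<ARFaceManager>();
    }

    void Update()
    {
        foreach (ARFace face in m_FaceManager.trackables)
        {
            // Each ARFace is a trackable whose Transform follows the
            // detected face, so position and rotation give the face pose.
            Debug.Log($"Face {face.trackableId}: " +
                      $"position {face.transform.position}, " +
                      $"rotation {face.transform.rotation.eulerAngles}");
        }
    }
}
```

In the scene itself, the axis visual comes from the prefab assigned to the ARFaceManager's Face Prefab field, which is instantiated for each detected face.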

    Face mesh scene

    The Face Mesh scene instantiates and updates a mesh representing the detected face.

Information about device support (for example, the number of faces that can be tracked simultaneously) is displayed on the screen.

    This sample uses the front-facing (selfie) camera.
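A minimal sketch of querying that support information follows, logging instead of using the sample's on-screen UI (FaceSupportInfo is a hypothetical name, not a script in the sample):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical illustration, not part of the sample: logs device support
// information instead of drawing it with on-screen UI.
[RequireComponent(typeof(ARFaceManager))]
public class FaceSupportInfo : MonoBehaviour
{
    void Start()
    {
        var manager = GetComponent<ARFaceManager>();

        // The subsystem descriptor is null when the platform provides
        // no face tracking implementation at all.
        if (manager.descriptor == null)
        {
            Debug.Log("Face tracking is not supported on this device.");
            return;
        }

        // supportedFaceCount reports how many faces the device can
        // track simultaneously.
        Debug.Log($"Faces trackable simultaneously: {manager.supportedFaceCount}");
    }
}
```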

    Face regions scene (ARCore)

The ARCore Face Regions scene demonstrates the ARCore face regions feature.

Face regions are an ARCore-specific feature that provides pose information for specific regions of the detected face, such as the left eyebrow. To learn more about face regions, refer to the ARCore Face tracking documentation.

    The face regions sample draws axes at each face region. Refer to ARCoreFaceRegionManager.cs to learn more about the sample code.
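As a minimal sketch in the same spirit, assuming the ARCore XR Plugin's ARCoreFaceSubsystem.GetRegionPoses API (FaceRegionLogger is a hypothetical name, not the sample's script):

```csharp
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARCore;

// Hypothetical illustration, not the sample script: logs the pose of
// each ARCore face region instead of drawing axes.
[RequireComponent(typeof(ARFaceManager))]
public class FaceRegionLogger : MonoBehaviour
{
    NativeArray<ARCoreFaceRegionData> m_Regions;

    void Update()
    {
        var manager = GetComponent<ARFaceManager>();

        // Face regions are exposed only by the ARCore face subsystem.
        var subsystem = manager.subsystem as ARCoreFaceSubsystem;
        if (subsystem == null)
            return;

        foreach (ARFace face in manager.trackables)
        {
            // GetRegionPoses fills (resizing if necessary) the NativeArray
            // with one pose per face region.
            subsystem.GetRegionPoses(face.trackableId, Allocator.Persistent, ref m_Regions);
            foreach (var regionData in m_Regions)
                Debug.Log($"{regionData.region}: {regionData.pose.position}");
        }
    }

    void OnDestroy()
    {
        if (m_Regions.IsCreated)
            m_Regions.Dispose();
    }
}
```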

    Face regions requirements

    This sample is available on ARCore only and uses the front-facing (selfie) camera.

    Blend shapes scene (ARKit)

    The ARKit Face Blend Shapes scene demonstrates Apple's Blend shapes (Apple developer documentation) feature.

Blend shapes are an ARKit-specific feature that provides information about various facial features on a scale of 0 to 1. For instance, wink and frown are two blend shapes.

In the blend shapes sample, blend shapes are used to puppet a cartoon face that is displayed over the detected face. Refer to ARKitBlendShapeVisualizer.cs to understand the sample code.
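A minimal sketch of reading the coefficients, assuming the Apple ARKit XR Plugin's ARKitFaceSubsystem.GetBlendShapeCoefficients API (BlendShapeLogger is a hypothetical name, not the sample's script):

```csharp
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARKit;

// Hypothetical illustration, not the sample script: logs each blend
// shape coefficient instead of animating a character. Attach to a face
// prefab so it runs once per detected face.
[RequireComponent(typeof(ARFace))]
public class BlendShapeLogger : MonoBehaviour
{
    ARFace m_Face;
    ARKitFaceSubsystem m_Subsystem;

    void Awake()
    {
        m_Face = GetComponent<ARFace>();
        var manager = FindObjectOfType<ARFaceManager>();
        // Blend shapes are exposed only by the ARKit face subsystem.
        m_Subsystem = manager != null ? manager.subsystem as ARKitFaceSubsystem : null;
    }

    void Update()
    {
        if (m_Subsystem == null)
            return;

        // Each coefficient is a 0..1 value for one facial feature,
        // such as a left-eye blink or a mouth frown.
        using (var coefficients = m_Subsystem.GetBlendShapeCoefficients(m_Face.trackableId, Allocator.Temp))
        {
            foreach (var c in coefficients)
                Debug.Log($"{c.blendShapeLocation}: {c.coefficient}");
        }
    }
}
```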

    Blend shape requirements

    This sample is available on ARKit only and uses the front-facing (selfie) camera.

    Eye lasers, eye poses, and fixation point scenes (ARKit)

    The Eye Lasers, Eye Poses, and Fixation Point scenes demonstrate eye and fixation point tracking.

    Eye tracking produces a pose (position and rotation) for each eye in the detected face. The fixation point is the point the face is looking at (fixated upon). The eye lasers sample uses the eye pose to draw laser beams emitted from the detected face.
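A minimal sketch of reading these values from an ARFace follows (EyePoseLogger is a hypothetical name, not the sample's script):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical illustration, not the sample script: logs the eye poses
// and fixation point. Attach to a face prefab instantiated by the
// ARFaceManager.
[RequireComponent(typeof(ARFace))]
public class EyePoseLogger : MonoBehaviour
{
    ARFace m_Face;

    void Awake()
    {
        m_Face = GetComponent<ARFace>();
    }

    void Update()
    {
        // leftEye, rightEye, and fixationPoint are Transforms kept up to
        // date by the subsystem; they are null when the device doesn't
        // support eye tracking.
        if (m_Face.leftEye == null)
            return;

        Debug.Log($"Left eye: {m_Face.leftEye.position}, " +
                  $"right eye: {m_Face.rightEye.position}, " +
                  $"fixation point: {m_Face.fixationPoint.position}");
    }
}
```

The eye lasers effect amounts to drawing a beam along each eye Transform's forward direction.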

    Requirements

    These samples use the front-facing (selfie) camera and require an iOS device with a TrueDepth camera.

    Rear camera face tracking scene (ARKit)

    The World Camera With User Facing Face Tracking scene implements ARKit support to use face tracking while the world-facing camera is active. You can open this sample in Unity from the Assets/Scenes/FaceTracking/WorldCameraWithUserFacingFaceTracking folder.

    iOS 13 adds support for face tracking while the world-facing (rear) camera is active. This means the user-facing (selfie) camera is used for face tracking, but the passthrough video uses the world-facing camera.

    To enable this mode in AR Foundation, you must:

• Enable an ARFaceManager.
• Set the ARSession Tracking mode to Position and Rotation or Don't Care.
• Set the ARCameraManager's Facing direction to World.

In this sample, tapping the screen toggles between the user-facing and world-facing cameras, as in the sketch below.
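A minimal sketch of the camera setup and tap-to-toggle behavior, assuming the ARCameraManager's requestedFacingDirection API (CameraFacingToggle is a hypothetical name, not the sample's script):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical illustration, not the sample script: requests the
// world-facing camera on start and toggles the facing direction on tap.
public class CameraFacingToggle : MonoBehaviour
{
    [SerializeField]
    ARCameraManager m_CameraManager;

    void Start()
    {
        // With face tracking enabled, requesting the world-facing camera
        // keeps face tracking on the user-facing camera while passthrough
        // video comes from the rear camera (where supported).
        m_CameraManager.requestedFacingDirection = CameraFacingDirection.World;
    }

    void Update()
    {
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            m_CameraManager.requestedFacingDirection =
                m_CameraManager.requestedFacingDirection == CameraFacingDirection.World
                    ? CameraFacingDirection.User
                    : CameraFacingDirection.World;
        }
    }
}
```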

The sample code in DisplayFaceInfo.OnEnable shows how to detect support for these face tracking features.

    When using the world-facing camera, a cube is displayed in front of the camera. The orientation of the cube is driven by the face in front of the user-facing camera.


    Apple and ARKit are trademarks of Apple Inc., registered in the U.S. and other countries and regions.
