    Face tracking

    This page is a supplement to the AR Foundation Face tracking manual. The following sections only contain information about APIs where ARKit exhibits unique platform-specific behavior.

    Important

    To use face tracking with ARKit, you must first enable face tracking in the XR Plug-in Management settings. Refer to Enable the Face tracking subsystem to understand how to enable face tracking for ARKit.

    Optional feature support

    ARKit implements the following optional features of AR Foundation's XRFaceSubsystem. The availability of features on specific devices depends on device hardware and software. Refer to Requirements for more information.

    Feature                          Descriptor property                   Supported
    Face pose                        supportsFacePose                      Yes
    Face mesh vertices and indices   supportsFaceMeshVerticesAndIndices    Yes
    Face mesh UVs                    supportsFaceMeshUVs                   Yes
    Face mesh normals                supportsFaceMeshNormals               No
    Eye tracking                     supportsEyeTracking                   Yes
    Note

    Refer to AR Foundation Face tracking platform support for more information on the optional features of the face subsystem.
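    You can query these descriptor properties at runtime before relying on an optional feature. The following is a hedged sketch, assuming the standard XR Management setup; the `FaceFeatureCheck` class name is illustrative:

    ```csharp
    using UnityEngine;
    using UnityEngine.XR.ARSubsystems;
    using UnityEngine.XR.Management;

    // Sketch: query the loaded XRFaceSubsystem's descriptor at runtime
    // to check which optional features this device's implementation reports.
    public class FaceFeatureCheck : MonoBehaviour
    {
        void Start()
        {
            var faceSubsystem = XRGeneralSettings.Instance?.Manager?.activeLoader
                ?.GetLoadedSubsystem<XRFaceSubsystem>();

            if (faceSubsystem == null)
                return; // Face tracking is not loaded or not enabled.

            var descriptor = faceSubsystem.subsystemDescriptor;
            Debug.Log($"Face pose supported: {descriptor.supportsFacePose}");
            Debug.Log($"Eye tracking supported: {descriptor.supportsEyeTracking}");
        }
    }
    ```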

    Session configuration

    Face tracking requires the use of the user-facing or "selfie" camera. It is the responsibility of your session's XRSessionSubsystem.configurationChooser to choose the camera facing direction. You can override the configuration chooser to meet your app's needs. For more information, refer to Configuration Chooser.

    The AR Foundation Samples GitHub repository contains a sample that shows how to use the ConfigurationChooser to choose between the user-facing and world-facing camera.
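    In practice, you influence the chosen configuration by setting the requested camera facing direction. The following is a minimal sketch, assuming an ARCameraManager reference is assigned in the Inspector; the `RequestSelfieCamera` class name is illustrative:

    ```csharp
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Sketch: request the user-facing ("selfie") camera so the session's
    // configuration chooser can select a face tracking configuration.
    public class RequestSelfieCamera : MonoBehaviour
    {
        // Assumed to be assigned in the Inspector.
        [SerializeField] ARCameraManager m_CameraManager;

        void OnEnable()
        {
            m_CameraManager.requestedFacingDirection = CameraFacingDirection.User;
        }
    }
    ```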

    Configuration chooser

    iOS devices support different combinations of features in different camera facing directions. If your scene contains several manager components that require the world-facing camera, AR Foundation's default configuration chooser might select the world-facing camera, even when the AR Face Manager component is also enabled in your scene. If you require greater control over the camera's facing direction, you can create your own ConfigurationChooser that prioritizes face tracking over other features.

    You can access an example of a custom ConfigurationChooser in the Rear Camera (ARKit) sample on the AR Foundation Samples GitHub repository. This example demonstrates how to use the user-facing camera for face tracking and the world-facing (rear) camera for passthrough video (iOS 13+).
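    A custom chooser subclasses ConfigurationChooser and overrides ChooseConfiguration. The following is a hedged sketch, not the sample's actual implementation: it prefers a configuration whose capabilities include the user-facing camera and face tracking, and otherwise falls back to the first available configuration.

    ```csharp
    using Unity.Collections;
    using UnityEngine.XR.ARSubsystems;

    // Sketch: a ConfigurationChooser that prioritizes face tracking by
    // preferring configurations that support the user-facing camera.
    class PreferFaceTrackingChooser : ConfigurationChooser
    {
        public override Configuration ChooseConfiguration(
            NativeSlice<ConfigurationDescriptor> descriptors, Feature requestedFeatures)
        {
            foreach (var descriptor in descriptors)
            {
                if (descriptor.capabilities.All(Feature.UserFacingCamera | Feature.FaceTracking))
                {
                    // Keep only the requested features this configuration can satisfy.
                    return new Configuration(
                        descriptor, requestedFeatures.Intersection(descriptor.capabilities));
                }
            }

            // Fall back to the first available configuration.
            return new Configuration(
                descriptors[0], requestedFeatures.Intersection(descriptors[0].capabilities));
        }
    }
    ```

    You would then assign an instance of this chooser to your session's XRSessionSubsystem.configurationChooser.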

    Blend shapes

    ARKit provides a set of blend shapes, each describing a different feature of a face. Each blend shape value ranges from 0 to 1. For example, one blend shape describes how open the mouth is.

    A blend shape represents an action at a location on a face. Each blend shape is defined by an ARKitBlendShapeLocation, which identifies the location of the face action, and an ARKitBlendShapeCoefficient, which describes the amount of action at that location. The ARKitBlendShapeCoefficient is a value between 0.0 and 1.0.

    You can learn more about blend shapes from the Blend shapes sample on the AR Foundation Samples GitHub repository. This sample uses blend shapes to puppet a cartoon face that is displayed over the detected face.
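    You can read blend shape coefficients through the ARKit-specific face subsystem. The following is a sketch, assuming an ARFaceManager and a tracked ARFace are assigned in the Inspector; the `LogJawOpen` class name is illustrative:

    ```csharp
    using Unity.Collections;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;
    using UnityEngine.XR.ARKit;

    // Sketch: read ARKit blend shape coefficients for a tracked face
    // and log the "jaw open" amount each frame.
    public class LogJawOpen : MonoBehaviour
    {
        [SerializeField] ARFaceManager m_FaceManager;
        [SerializeField] ARFace m_Face; // A tracked face, e.g. obtained from trackables.

        void Update()
        {
            // Blend shapes are ARKit-specific, so downcast the subsystem.
            if (m_FaceManager.subsystem is ARKitFaceSubsystem arkitFaceSubsystem)
            {
                using (var coefficients = arkitFaceSubsystem.GetBlendShapeCoefficients(
                    m_Face.trackableId, Allocator.Temp))
                {
                    foreach (var coefficient in coefficients)
                    {
                        if (coefficient.blendShapeLocation == ARKitBlendShapeLocation.JawOpen)
                            Debug.Log($"Jaw open: {coefficient.coefficient}");
                    }
                }
            }
        }
    }
    ```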

    Face visualizer samples

    The AR Foundation Samples GitHub repository contains ARKit-specific prefabs that you can use to visualize faces in your scene, as outlined in the following table. Refer to the AR Foundation AR Face manual for more information on how to use these prefabs.

    Prefab                   Description
    AR Eye Pose Visualizer   Visualize the location and direction of the eyes of a detected face.
    Eye Laser Visualizer     Use the eye pose to draw laser beams emitted from the detected face.
    Sloth Head               Use the face blend shapes provided by ARKit to animate a 3D character.

    Requirements

    Face tracking is supported on devices with an Apple Neural Engine running iOS 14 and newer or iPadOS 14 and newer. Devices running iOS 13 and earlier or iPadOS 13 and earlier require a TrueDepth camera for face tracking. Refer to Apple's Tracking and Visualizing Faces documentation for more information.

    Note

    To check whether a device has an Apple Neural Engine or a TrueDepth camera, refer to Apple's Tech Specs.


    Apple and ARKit are trademarks of Apple Inc., registered in the U.S. and other countries and regions.

    Copyright © 2025 Unity Technologies — Trademarks and terms of use