Face tracking
This page supplements the AR Foundation Face tracking manual. The following sections only contain information about APIs where Google's Android XR runtime exhibits platform-specific behavior.
Tip
When developing an AR app, refer to both the AR Foundation documentation and the documentation for the required packages of each platform you support.
Important
You must configure the appropriate Permissions before you can use face tracking features on Android XR.
Optional feature support
Android XR implements the following optional features of AR Foundation's XRFaceSubsystem:
| Feature | Descriptor Property | Supported |
| :--- | :--- | :--- |
| Face pose | supportsFacePose | |
| Face mesh vertices and indices | supportsFaceMeshVerticesAndIndices | |
| Face mesh UVs | supportsFaceMeshUVs | |
| Face mesh normals | supportsFaceMeshNormals | |
| Eye tracking | supportsEyeTracking | Yes |
| Blend Shapes | supportsBlendShapes | Yes |
Note
Refer to AR Foundation Face tracking platform support for more information on the optional features of the Face subsystem.
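At runtime, you can branch on these capabilities through the face subsystem descriptor. The following is a minimal sketch (the component name is illustrative) that assumes your scene contains an ARFaceManager and logs the descriptor properties from the table above:

using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative component for checking optional feature support at runtime.
public class FaceFeatureCheck : MonoBehaviour
{
    [SerializeField]
    ARFaceManager m_FaceManager;

    void Start()
    {
        // The descriptor is unavailable until the subsystem has been created.
        var descriptor = m_FaceManager.descriptor;
        if (descriptor == null)
            return;

        // On Android XR, expect eye tracking and blend shapes to report support.
        Debug.Log($"Eye tracking: {descriptor.supportsEyeTracking}");
        Debug.Log($"Blend shapes: {descriptor.supportsBlendShapes}");
        Debug.Log($"Face mesh vertices: {descriptor.supportsFaceMeshVerticesAndIndices}");
    }
}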
Face data
This platform exposes face data for the active user (the person wearing the headset). Currently, gaze and blend shape data is surfaced.
Gaze data is available within the ARFace class, while blend shape data must be queried separately using the TryGetBlendShapes API.
New blend shape data might be available each time the ARFace.updated event is triggered.
Tip
To keep your rendered face mesh up to date, update your mesh renderer each time the ARFace.updated event is triggered.
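For example, the following sketch (component and handler names are illustrative) subscribes to an ARFace's updated event and reads the eye transforms that carry gaze data. Attach it to your face prefab alongside the ARFace component:

using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative component; attach it to your face prefab alongside ARFace.
[RequireComponent(typeof(ARFace))]
public class FaceUpdateHandler : MonoBehaviour
{
    ARFace m_Face;

    void Awake() => m_Face = GetComponent<ARFace>();
    void OnEnable() => m_Face.updated += OnFaceUpdated;
    void OnDisable() => m_Face.updated -= OnFaceUpdated;

    void OnFaceUpdated(ARFaceUpdatedEventArgs args)
    {
        // Gaze data: ARFace exposes eye transforms when eye tracking is supported.
        var leftEye = args.face.leftEye;
        if (leftEye != null)
            Debug.Log($"Left eye pose: {leftEye.position}, {leftEye.rotation}");

        // This is also the moment to refresh a mesh renderer driven by face data
        // and to query fresh blend shape values via the provider's
        // TryGetBlendShapes API (signature defined by the Android XR package).
    }
}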
You can use the TryGetInwardRegionConfidences API to determine the accuracy of blend shape data per face region. A confidence value of at least 0.3
indicates acceptable blend shape data. To learn how Android XR groups blend shape locations into confidence regions, consult the Android XR Face Tracking documentation.
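As a sketch only: the exact signature of TryGetInwardRegionConfidences is defined by the Android XR provider package, so check its scripting API before use. The snippet below assumes the method is exposed on AndroidOpenXRFaceSubsystem and fills a NativeArray<float> with one confidence value per region:

using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
// Also add the Android XR provider namespace that declares AndroidOpenXRFaceSubsystem.

// Illustrative component name.
public class RegionConfidenceLogger : MonoBehaviour
{
    [SerializeField]
    ARFaceManager m_FaceManager;

    public void LogLowConfidenceRegions()
    {
        // Assumed signature for illustration; consult the provider package's
        // scripting API for the exact parameters.
        if (m_FaceManager.subsystem is AndroidOpenXRFaceSubsystem androidFaceSubsystem
            && androidFaceSubsystem.TryGetInwardRegionConfidences(Allocator.Temp, out NativeArray<float> confidences))
        {
            using (confidences)
            {
                for (var i = 0; i < confidences.Length; i++)
                {
                    // Values below 0.3 indicate unreliable blend shape data
                    // for the corresponding face region.
                    if (confidences[i] < 0.3f)
                        Debug.Log($"Region {i} is below the 0.3 confidence threshold.");
                }
            }
        }
    }
}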
To know which face is the inward avatar-eyes gaze object, cast your XRFaceSubsystem to AndroidOpenXRFaceSubsystem and read its inwardID property.
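For example, the following sketch (component name illustrative; it assumes inwardID is the TrackableId of the inward face) locates the matching ARFace:

using UnityEngine;
using UnityEngine.XR.ARFoundation;
// Also add the Android XR provider namespace that declares AndroidOpenXRFaceSubsystem.

// Illustrative component name.
public class InwardFaceFinder : MonoBehaviour
{
    [SerializeField]
    ARFaceManager m_FaceManager;

    public ARFace GetInwardFace()
    {
        // The subsystem exists only after the ARFaceManager has started.
        if (m_FaceManager.subsystem is AndroidOpenXRFaceSubsystem androidFaceSubsystem)
        {
            var inwardId = androidFaceSubsystem.inwardID;
            foreach (var face in m_FaceManager.trackables)
            {
                if (face.trackableId == inwardId)
                    return face;
            }
        }
        return null;
    }
}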
Permissions
AR Foundation's face tracking feature requires two Android system permissions on the Android XR runtime. Your user must grant your app the android.permission.EYE_TRACKING_COARSE
and android.permission.FACE_TRACKING
permissions before it can track face data.
To avoid permission-related errors at runtime, set up your scene with the AR Face Manager component disabled, then enable it only after the required permissions are granted.
The following example code demonstrates how to handle required permissions and enable the AR Face Manager component:
using UnityEngine;
using UnityEngine.XR.ARFoundation;
#if UNITY_ANDROID
using UnityEngine.Android;
#endif

// Illustrative component name; attach it to a GameObject in your AR scene.
public class FacePermissionRequester : MonoBehaviour
{
    const string k_EyeTrackingPermission = "android.permission.EYE_TRACKING_COARSE";
    const string k_FaceTrackingPermission = "android.permission.FACE_TRACKING";

    [SerializeField]
    ARFaceManager m_ARFaceManager;

#if UNITY_ANDROID
    void Start()
    {
        if (!Permission.HasUserAuthorizedPermission(k_EyeTrackingPermission)
            || !Permission.HasUserAuthorizedPermission(k_FaceTrackingPermission))
        {
            var callbacks = new PermissionCallbacks();
            callbacks.PermissionDenied += OnPermissionDenied;
            callbacks.PermissionGranted += OnPermissionGranted;
            Permission.RequestUserPermissions(
                new string[] { k_EyeTrackingPermission, k_FaceTrackingPermission }, callbacks);
        }
        else
        {
            // Enable the AR Face Manager component if both permissions are already granted.
            m_ARFaceManager.enabled = true;
        }
    }

    void OnPermissionDenied(string permission)
    {
        // Handle the denied permission, for example by disabling face tracking UI.
    }

    void OnPermissionGranted(string permission)
    {
        // This callback fires once per permission. Only enable the AR Face Manager
        // component once both permissions have been granted.
        if (Permission.HasUserAuthorizedPermission(k_EyeTrackingPermission)
            && Permission.HasUserAuthorizedPermission(k_FaceTrackingPermission))
        {
            m_ARFaceManager.enabled = true;
        }
    }
#endif // UNITY_ANDROID
}
For a code sample that involves multiple permissions in a single request, refer to the Permissions page.