Camera
This page is a supplement to the AR Foundation Camera manual. The following sections only contain information about APIs where ARKit exhibits unique platform-specific behavior.
Tip
When developing an AR app, refer to both the AR Foundation documentation and the documentation for the provider package of each platform you support.
Optional feature support
ARKit implements the following optional features of AR Foundation's XRCameraSubsystem:
| Feature | Descriptor property | Supported |
| :--- | :--- | :--- |
| Brightness | supportsAverageBrightness | No |
| Color temperature | supportsAverageColorTemperature | Yes |
| Color correction | supportsColorCorrection | No |
| Display matrix | supportsDisplayMatrix | Yes |
| Projection matrix | supportsProjectionMatrix | Yes |
| Timestamp | supportsTimestamp | Yes |
| Camera configuration | supportsCameraConfigurations | Yes |
| Camera image | supportsCameraImage | Yes |
| Average intensity in lumens | supportsAverageIntensityInLumens | Yes |
| Focus modes | supportsFocusModes | Yes |
| Face tracking ambient intensity light estimation | supportsFaceTrackingAmbientIntensityLightEstimation | Yes |
| Face tracking HDR light estimation | supportsFaceTrackingHDRLightEstimation | Yes |
| World tracking ambient intensity light estimation | supportsWorldTrackingAmbientIntensityLightEstimation | Yes |
| World tracking HDR light estimation | supportsWorldTrackingHDRLightEstimation | No |
| Camera grain | supportsCameraGrain | iOS 13+ |
| Image stabilization | supportsImageStabilization | No |
| EXIF data | supportsExifData | iOS 16+ |
Note
Refer to AR Foundation Camera platform support for more information on the optional features of the camera subsystem.
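You can check these descriptor properties at runtime before you attempt to use a feature. The following sketch assumes a serialized ARCameraManager reference named m_CameraManager (as in the other examples on this page) and logs a couple of the properties from the table above:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class CameraFeatureCheck : MonoBehaviour
{
    [SerializeField]
    ARCameraManager m_CameraManager;

    void Start()
    {
        // The descriptor describes the capabilities of the active camera subsystem.
        var descriptor = m_CameraManager.descriptor;
        if (descriptor == null)
            return;

        // Each "Descriptor property" in the table above is a bool on the descriptor.
        Debug.Log($"Camera grain supported: {descriptor.supportsCameraGrain}");
        Debug.Log($"EXIF data supported: {descriptor.supportsExifData}");
    }
}
```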
Light estimation
ARKit light estimation can only be enabled or disabled. The availability of either Ambient Intensity or Environmental HDR data is governed by the active tracking mode. See the following table for more details.

| Tracking configuration | Ambient intensity (lumens) | Color temperature | Main light direction | Main light intensity (lumens) | Ambient spherical harmonics |
| :--- | :--- | :--- | :--- | :--- | :--- |
| World Tracking | Yes | Yes | No | No | No |
| Face Tracking | Yes | Yes | Yes | Yes | Yes |
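To consume light estimation data, you can request a light estimation mode on the camera manager and read the per-frame values from the frameReceived event. This is a minimal sketch, assuming a serialized ARCameraManager reference named m_CameraManager; each value is nullable and only present when ARKit provides it:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class LightEstimationReader : MonoBehaviour
{
    [SerializeField]
    ARCameraManager m_CameraManager;

    void OnEnable()
    {
        // Request ambient intensity estimation. Which values arrive depends on
        // the active tracking configuration, as described in the table above.
        m_CameraManager.requestedLightEstimation = LightEstimation.AmbientIntensity;
        m_CameraManager.frameReceived += OnFrameReceived;
    }

    void OnDisable()
    {
        m_CameraManager.frameReceived -= OnFrameReceived;
    }

    void OnFrameReceived(ARCameraFrameEventArgs args)
    {
        if (args.lightEstimation.averageIntensityInLumens.HasValue)
            Debug.Log($"Ambient intensity: {args.lightEstimation.averageIntensityInLumens.Value} lumens");

        if (args.lightEstimation.averageColorTemperature.HasValue)
            Debug.Log($"Color temperature: {args.lightEstimation.averageColorTemperature.Value} K");
    }
}
```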
Camera configuration
XRCameraConfiguration contains an IntPtr field, nativeConfigurationHandle, which is a platform-specific handle. For ARKit, this handle is a pointer to the native ARVideoFormat Objective-C object.
Advanced camera hardware configuration
On supported devices with iOS 16 or newer, you can manually configure advanced camera hardware properties such as exposure. This is useful in situations where you want more control over the camera.
Note
In addition to iOS 16 or newer, advanced camera hardware configuration also requires a device with an ultra-wide camera. Most iOS devices from the iPhone 11 onward have an ultra-wide camera. For device-specific information, refer to Apple's Tech Specs.
To configure the camera's advanced hardware properties, you must first lock the camera for configuration using ARKitCameraSubsystem.TryGetLockedCamera. If this method returns true, you can configure the device's advanced camera hardware properties using the ARKitLockedCamera instance passed as its out argument.
The following code samples demonstrate how to configure advanced camera hardware properties:
Check support
The following example method checks whether the ARKitCameraSubsystem is available and whether it supports the advanced camera configuration feature. This method is used by the other code examples on this page.
```csharp
bool AdvancedConfigurationSupported(out ARKitCameraSubsystem subsystem)
{
    // Check whether the ARKit camera subsystem is available.
    subsystem = m_CameraManager.subsystem as ARKitCameraSubsystem;
    if (subsystem == null)
    {
        Debug.LogError("Advanced camera configuration requires ARKit.");
        return false;
    }

    // Check whether the device supports advanced camera configuration.
    if (!subsystem.advancedCameraConfigurationSupported)
    {
        Debug.LogError("Advanced camera configuration is not supported on this device.");
        return false;
    }

    return true;
}
```
Exposure
The following example method tries to lock the camera and, if successful, sets the exposure.
```csharp
void UpdateCameraExposure()
{
    if (!AdvancedConfigurationSupported(out ARKitCameraSubsystem subsystem))
        return;

    // Try to get a locked camera.
    if (!subsystem.TryGetLockedCamera(out var lockedCamera))
    {
        Debug.LogError("Unable to lock the camera for advanced camera configuration.");
        return;
    }

    // The using statement automatically disposes the locked camera.
    using (lockedCamera)
    {
        // Set the exposure duration (in seconds) and ISO.
        const double duration = 0.1;
        const float iso = 500f;
        lockedCamera.exposure = new ARKitExposure(duration, iso);
    }
}
```
White balance
The following example method tries to lock the camera and, if successful, sets the white balance.
```csharp
void UpdateCameraWhiteBalance()
{
    if (!AdvancedConfigurationSupported(out ARKitCameraSubsystem subsystem))
        return;

    // Try to get a locked camera.
    if (!subsystem.TryGetLockedCamera(out var lockedCamera))
    {
        Debug.LogError("Unable to lock the camera for advanced camera configuration.");
        return;
    }

    // The using statement automatically disposes the locked camera.
    using (lockedCamera)
    {
        // Set the white balance gains.
        const float blueGain = 2.0f;
        const float greenGain = 1.0f;
        const float redGain = 1.5f;
        lockedCamera.whiteBalance = new ARKitWhiteBalanceGains(blueGain, greenGain, redGain);
    }
}
```
Focus
The following example method tries to lock the camera and, if successful, sets the focus.
```csharp
void UpdateCameraFocus()
{
    if (!AdvancedConfigurationSupported(out ARKitCameraSubsystem subsystem))
        return;

    // Try to get a locked camera.
    if (!subsystem.TryGetLockedCamera(out var lockedCamera))
    {
        Debug.LogError("Unable to lock the camera for advanced camera configuration.");
        return;
    }

    // The using statement automatically disposes the locked camera.
    using (lockedCamera)
    {
        // Set the lens position.
        const float lensPosition = 2.0f;
        lockedCamera.focus = new ARKitFocus(lensPosition);
    }
}
```
High resolution CPU image
You can asynchronously capture a high resolution XRCpuImage (or simply, CPU Image) using ARKitCameraSubsystem.TryAcquireHighResolutionCpuImage on iOS 16 and newer.
The example below demonstrates a coroutine to set up and handle the asynchronous request:
```csharp
IEnumerator CaptureHighResolutionCpuImage()
{
    if (m_CameraManager.subsystem is not ARKitCameraSubsystem subsystem)
    {
        Debug.LogError("High resolution CPU image capture requires ARKit.");
        yield break;
    }

    // Yield return on the promise returned by the ARKitCameraSubsystem.
    var promise = subsystem.TryAcquireHighResolutionCpuImage();
    yield return promise;

    // If the promise was not successful, check your Console logs for more
    // information about the error.
    if (!promise.result.wasSuccessful)
        yield break;

    // If the promise was successful, handle the result.
    DoSomethingWithHighResolutionCpuImage(promise.result.highResolutionCpuImage);
    promise.result.highResolutionCpuImage.Dispose();
}
```
Whenever you successfully acquire a high resolution CPU image, you should Dispose it as soon as possible, because CPU images require native memory resources. If you retain too many high resolution images, ARKit may be unable to render new frames.
For a complete usage example, see the AR Foundation Samples repository.
Select a resolution
The exact resolution of the high resolution CPU image you receive depends on your camera manager's currentConfiguration. For the highest resolution capture, choose a configuration with a non-binned video format such as 4K resolution (3840x2160).
For more information on binned vs non-binned video formats, see Apple's Discover ARKit 6 video, which explains the ARKit camera architecture in greater detail.
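One way to influence the capture resolution is to enumerate the available configurations and select the largest one before requesting a high resolution image. This is a sketch, assuming a serialized ARCameraManager reference named m_CameraManager; whether a non-binned 4K format is available depends on the device:

```csharp
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class SelectHighResConfiguration : MonoBehaviour
{
    [SerializeField]
    ARCameraManager m_CameraManager;

    void SelectLargestResolution()
    {
        // GetConfigurations returns a NativeArray that must be disposed.
        using (var configurations = m_CameraManager.GetConfigurations(Allocator.Temp))
        {
            if (!configurations.IsCreated || configurations.Length == 0)
                return;

            // Find the configuration with the largest pixel count.
            var best = configurations[0];
            foreach (var configuration in configurations)
            {
                if (configuration.resolution.x * configuration.resolution.y >
                    best.resolution.x * best.resolution.y)
                {
                    best = configuration;
                }
            }

            // On devices that support it, the largest configuration is a
            // non-binned 4K (3840x2160) video format.
            m_CameraManager.currentConfiguration = best;
        }
    }
}
```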
EXIF data
You can access a camera frame's EXIF data on devices running iOS 16 or newer.
For more information, refer to the EXIF specification.
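As a rough sketch, EXIF data can be read from the latest camera frame. This assumes a serialized ARCameraManager reference named m_CameraManager; the accessors used here (TryGetLatestFrame, TryGetExifData) may differ across AR Foundation versions, so treat this as illustrative rather than definitive:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class ExifDataLogger : MonoBehaviour
{
    [SerializeField]
    ARCameraManager m_CameraManager;

    void Update()
    {
        // Camera parameters describing how the frame is displayed.
        var cameraParams = new XRCameraParams
        {
            zNear = 0.1f,
            zFar = 100f,
            screenWidth = Screen.width,
            screenHeight = Screen.height,
            screenOrientation = Screen.orientation
        };

        // Log the frame's EXIF data when available (iOS 16+).
        if (m_CameraManager.subsystem != null &&
            m_CameraManager.subsystem.TryGetLatestFrame(cameraParams, out XRCameraFrame frame) &&
            frame.TryGetExifData(out XRCameraFrameExifData exifData))
        {
            Debug.Log(exifData.ToString());
        }
    }
}
```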
Apple and ARKit are trademarks of Apple Inc., registered in the U.S. and other countries and regions.