# Camera Frame Access Sample

WARNING

The Camera Frame Access feature is marked as experimental because it does not yet fully support all public AR Foundation APIs, and ongoing optimizations on the package and Snapdragon Spaces Service side currently break backwards compatibility from release to release.

This sample demonstrates how to retrieve RGB camera frames and intrinsic properties for image processing. For basic information about camera frame access and what AR Foundation's AR Camera Manager component does, please refer to the Unity documentation. To use this feature, it has to be enabled in the OpenXR plugin settings located under Project Settings > XR Plug-in Management > OpenXR (> Android Tab). Additionally, this sample requires allowing 'unsafe' code in the Android Player settings located under Project Settings > Player > Android > Script Compilation.
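
The 'Allow unsafe code' option can also be toggled from an editor script rather than through the Player settings UI. The sketch below is purely illustrative and not part of the sample; the class name and menu path are hypothetical.

#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;

// Editor-only helper that enables 'Allow unsafe code' for the project.
// The menu path is illustrative and not part of the sample.
public static class AllowUnsafeCodeMenuItem {
    [MenuItem("Tools/Enable Unsafe Code")]
    private static void EnableUnsafeCode() {
        PlayerSettings.allowUnsafeCode = true;
        Debug.Log("Allow unsafe code is now: " + PlayerSettings.allowUnsafeCode);
    }
}
#endif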

# How the sample works

Adding the AR Camera Manager component to the AR Session Origin > AR Camera GameObject will enable the camera access subsystem. Upon starting, the subsystem will retrieve valid sensor configurations from the viewer device. If a valid YUV420 sensor configuration is found, the subsystem will select this configuration as the provider for CPU camera images. Conceptually, an AR Camera Manager represents a single camera and will not manage multiple sensors at the same time.
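A minimal setup sketch is shown below. It assumes the script lives in the scene alongside the AR Camera, caches the AR Camera Manager, and listens for new frames via its frameReceived event; the class and field names are illustrative and not part of the sample.

using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative setup: cache the ARCameraManager and react to new camera frames.
public class CameraAccessSetup : MonoBehaviour {
    [SerializeField] private ARCameraManager _cameraManager;

    private void OnEnable() {
        if (_cameraManager == null) {
            _cameraManager = FindObjectOfType<ARCameraManager>();
        }
        _cameraManager.frameReceived += OnFrameReceived;
    }

    private void OnDisable() {
        _cameraManager.frameReceived -= OnFrameReceived;
    }

    private void OnFrameReceived(ARCameraFrameEventArgs args) {
        // A new frame is available; CPU image acquisition can be driven from here.
    }
}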

The sample scene consists of two panels:

  • A camera feed panel displaying the latest CPU image from the device camera, with Pause and Resume buttons
  • A camera info panel enumerating the various properties of the device camera

(Image: Camera Sample GUI)

# Retrieving CPU images

The camera manager's TryAcquireLatestCpuImage function outputs an XRCpuImage object which represents a single, raw image from the selected device camera. The raw pixel data of this image can be extracted with XRCpuImage's Convert function, which returns a NativeArray<byte>.

IMPORTANT

XRCpuImage objects must be explicitly disposed of after conversion. To do this, use XRCpuImage's Dispose function. Failing to dispose of XRCpuImage objects will leak memory until the camera access subsystem is destroyed.

If allocating a NativeArray<byte> before conversion, this buffer must also be disposed of after the data has been copied or processed. To do this, use NativeArray<T>'s Dispose function. Failing to dispose of the NativeArray<byte> will leak memory until the camera access subsystem is destroyed.

For detailed information about how to use TryAcquireLatestCpuImage and XRCpuImage, please refer to the Unity documentation.
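
As a minimal illustration of these disposal rules, the hypothetical helper below acquires an image, converts it into a caller-allocated NativeArray<byte>, and releases both the image and the buffer in a finally block. Like the sample further down, it assumes a cached ARCameraManager plus the Unity.Collections and Unity.Collections.LowLevel.Unsafe namespaces; the method name is illustrative.

private unsafe void ProcessLatestFrame(ARCameraManager cameraManager) {
    if (!cameraManager.TryAcquireLatestCpuImage(out XRCpuImage image)) {
        return;
    }

    var conversionParams = new XRCpuImage.ConversionParams(image, TextureFormat.RGBA32);
    var buffer = new NativeArray<byte>(image.GetConvertedDataSize(conversionParams), Allocator.Temp);

    try {
        image.Convert(conversionParams, new IntPtr(buffer.GetUnsafePtr()), buffer.Length);
        // ... process the RGBA32 pixel data in 'buffer' here ...
    }
    finally {
        // Both the XRCpuImage and the NativeArray must be released to avoid leaks.
        image.Dispose();
        buffer.Dispose();
    }
}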

The sample code below first requests a CPU image from the AR Camera Manager. If successful, it converts the XRCpuImage's raw pixel data directly into the buffer returned by a managed Texture2D's GetRawTextureData<byte>, then uploads that buffer to the GPU with the Apply function. Finally, it assigns the texture to the target RawImage, making the new frame visible in the application's UI.

public RawImage CameraRawImage;

private ARCameraManager _cameraManager;
private Texture2D _cameraTexture;
private XRCpuImage _lastCpuImage;

public void UpdateCpuImage() {
    // Try to acquire the latest camera image on the CPU; bail out if none is available.
    if (!_cameraManager.TryAcquireLatestCpuImage(out _lastCpuImage)) {
        return;
    }

    UpdateCameraTexture(_lastCpuImage);
}

private unsafe void UpdateCameraTexture(XRCpuImage image) {
    var format = TextureFormat.RGBA32;

    // (Re)create the target texture if its size no longer matches the incoming image.
    if (_cameraTexture == null || _cameraTexture.width != image.width || _cameraTexture.height != image.height) {
        _cameraTexture = new Texture2D(image.width, image.height, format, false);
    }

    var conversionParams = new XRCpuImage.ConversionParams(image, format);
    var rawTextureData = _cameraTexture.GetRawTextureData<byte>();

    try {
        // Convert the raw image data directly into the texture's underlying buffer.
        image.Convert(conversionParams, new IntPtr(rawTextureData.GetUnsafePtr()), rawTextureData.Length);
    }
    finally {
        // The XRCpuImage must always be disposed of, even if the conversion throws.
        image.Dispose();
    }

    _cameraTexture.Apply();
    CameraRawImage.texture = _cameraTexture;
}
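
One simple way to drive UpdateCpuImage is once per rendered frame from Update, skipping frames while the feed is paused. The sketch below assumes the Pause and Resume buttons simply toggle a flag; it is an illustration, not the sample's exact wiring.

private bool _paused;

public void OnPauseClicked() {
    _paused = true;
}

public void OnResumeClicked() {
    _paused = false;
}

private void Update() {
    if (!_paused) {
        UpdateCpuImage();
    }
}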

The following texture formats are supported by the AR Camera Manager:

  • RGB24
  • RGBA32
  • BGRA32

# Retrieving sensor intrinsics

The camera manager's TryGetIntrinsics function outputs an XRCameraIntrinsics object which describes the physical characteristics of the selected sensor. For detailed information about XRCameraIntrinsics, please refer to the Unity documentation.

The sample code below retrieves the intrinsics of the selected sensor and displays them in the application UI.

public Text[] ResolutionTexts;
public Text[] FocalLengthTexts;
public Text[] PrincipalPointTexts;

private ARCameraManager _cameraManager;
private XRCameraIntrinsics _intrinsics;

private void UpdateCameraIntrinsics() {
    // Query the physical characteristics of the currently selected sensor.
    if (!_cameraManager.TryGetIntrinsics(out _intrinsics)) {
        Debug.Log("Failed to acquire camera intrinsics.");
        return;
    }

    ResolutionTexts[0].text = _intrinsics.resolution.x.ToString();
    ResolutionTexts[1].text = _intrinsics.resolution.y.ToString();
    FocalLengthTexts[0].text = _intrinsics.focalLength.x.ToString("#0.00");
    FocalLengthTexts[1].text = _intrinsics.focalLength.y.ToString("#0.00");
    PrincipalPointTexts[0].text = _intrinsics.principalPoint.x.ToString("#0.00");
    PrincipalPointTexts[1].text = _intrinsics.principalPoint.y.ToString("#0.00");
}
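
Because the intrinsics describe a standard pinhole camera model, they can also be used for simple image-processing math, for example mapping a point in the camera's coordinate frame to pixel coordinates with u = fx * X / Z + cx and v = fy * Y / Z + cy. The helper below is a hypothetical sketch based on the fields above, not part of the sample.

// Projects a point given in the camera's coordinate frame (+Z forward, in front of
// the camera) to pixel coordinates using the pinhole model.
private Vector2 ProjectToPixel(Vector3 cameraSpacePoint) {
    float u = _intrinsics.focalLength.x * (cameraSpacePoint.x / cameraSpacePoint.z) + _intrinsics.principalPoint.x;
    float v = _intrinsics.focalLength.y * (cameraSpacePoint.y / cameraSpacePoint.z) + _intrinsics.principalPoint.y;
    return new Vector2(u, v);
}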