# Scene Setup

WARNING

This guide assumes pre-existing knowledge of AR Foundation and OpenXR. For a more detailed overview, please refer to the official AR Foundation and OpenXR Unity documentation.

The following is a proposed sample scene hierarchy for running the project with AR Foundation and XR Interaction Toolkit support:

(Image: Scene Setup hierarchy)

# AR Foundation

To enable positional head tracking, the following objects are required in the scene:

  • AR Session
  • AR Session Origin
    • AR Camera

To create these game objects manually, right-click in the Hierarchy panel and select XR > AR Session or XR > AR Session Origin.
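For reference, the same hierarchy can also be assembled from script. The following is a minimal sketch assuming AR Foundation 4.x (where AR Session Origin is still used); the class name ARSceneBootstrap is hypothetical, and the menu items above remain the recommended way to create these objects.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical helper that recreates the AR Session / AR Session Origin / AR Camera
// hierarchy from script (sketch only; normally these are created via the XR menu).
public static class ARSceneBootstrap
{
    public static void CreateARHierarchy()
    {
        // AR Session: drives the AR lifecycle and tracking state.
        var sessionGO = new GameObject("AR Session");
        sessionGO.AddComponent<ARSession>();
        sessionGO.AddComponent<ARInputManager>();

        // AR Session Origin: maps trackables into Unity world space.
        var originGO = new GameObject("AR Session Origin");
        var sessionOrigin = originGO.AddComponent<ARSessionOrigin>();

        // AR Camera: child of the origin, renders the device camera feed.
        // The menu-created version also adds a Tracked Pose Driver for head
        // tracking, which is omitted here for brevity.
        var cameraGO = new GameObject("AR Camera");
        cameraGO.transform.SetParent(originGO.transform, false);
        var arCamera = cameraGO.AddComponent<Camera>();
        cameraGO.AddComponent<ARCameraManager>();
        cameraGO.AddComponent<ARCameraBackground>();

        // Let the session origin know which camera provides the device pose.
        sessionOrigin.camera = arCamera;
    }
}
```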

# XR Interaction Manager

At least one XR Interaction Manager is required per scene in order to establish a connection between the Interactors and Interactables.
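As a rough illustration, a script can verify that the scene contains an interaction manager. The helper below is a hypothetical sketch; in practice the XR Interaction Manager component is simply added to a game object in the Editor.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: make sure at least one XR Interaction Manager exists in the scene.
// Interactors and Interactables register with this manager so it can mediate
// hover and select events between them.
public static class InteractionManagerSetup
{
    public static XRInteractionManager GetOrCreate()
    {
        var manager = Object.FindObjectOfType<XRInteractionManager>();
        if (manager == null)
        {
            var go = new GameObject("XR Interaction Manager");
            manager = go.AddComponent<XRInteractionManager>();
        }
        return manager;
    }
}
```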

In order for actions to read input, they must first be enabled through the Input Action Manager. To add it manually, attach the Input Action Manager script and add the Input Action asset, located in the samples folder under Shared Assets/Input Actions, as an element.

(Image: XR Interaction Manager setup)
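Conceptually, the Input Action Manager simply enables the action maps in the assets assigned to it. The sketch below shows that idea with a hypothetical EnableInputActions component; the project itself uses the XR Interaction Toolkit's Input Action Manager component rather than this script.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Simplified sketch of what the Input Action Manager does: enable every action
// map in the referenced Input Action asset so its actions start reading input.
public class EnableInputActions : MonoBehaviour
{
    // Assign the sample's Input Action asset (e.g. the one under
    // Shared Assets/Input Actions) in the Inspector.
    [SerializeField] private InputActionAsset actionAsset;

    private void OnEnable()
    {
        if (actionAsset != null)
            actionAsset.Enable();   // enables all action maps and their actions
    }

    private void OnDisable()
    {
        if (actionAsset != null)
            actionAsset.Disable();
    }
}
```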

# Gaze Pointer

The Gaze Pointer consists of the following game objects:

(Image: Gaze Pointer hierarchy)

The GazeInteractor.cs script is located on the root game object. It handles raycasting against and interaction with UI objects in the scene, as well as the dwell timer duration and the "click" functionality of the pointer. A rough sketch of this dwell logic follows the note below.

Gaze Pointer Limitation

Currently the gaze pointer can only interact with UI objects in the scene, unlike the Pointer Controller, which can interact with both UI and 3D objects.
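For illustration only, the sketch below shows one way gaze-driven UI interaction with a dwell timer can be implemented on top of Unity's event system. It is not the project's GazeInteractor.cs; the class name, the default dwell duration, and the screen-center ray are all assumptions.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;

// Sketch of gaze-driven UI interaction with a dwell timer: after looking at the
// same UI element for dwellDuration seconds, a pointer-click event is sent to it.
public class SimpleGazeDwell : MonoBehaviour
{
    [SerializeField] private float dwellDuration = 1.5f; // seconds of steady gaze before "clicking"

    private GameObject currentTarget;
    private float dwellTimer;

    private void Update()
    {
        if (EventSystem.current == null)
            return;

        // Build a pointer event at the center of the screen (the head-gaze direction).
        var pointerData = new PointerEventData(EventSystem.current)
        {
            position = new Vector2(Screen.width * 0.5f, Screen.height * 0.5f)
        };

        // Raycast against all UI raycasters registered with the event system.
        var results = new List<RaycastResult>();
        EventSystem.current.RaycastAll(pointerData, results);
        GameObject target = results.Count > 0 ? results[0].gameObject : null;

        if (target != currentTarget)
        {
            // Gaze moved to a different element (or off the UI entirely): restart the timer.
            currentTarget = target;
            dwellTimer = 0f;
            return;
        }

        if (currentTarget == null)
            return;

        dwellTimer += Time.deltaTime;
        if (dwellTimer >= dwellDuration)
        {
            // Dwell complete: send a click to the element (or its nearest click handler).
            ExecuteEvents.ExecuteHierarchy(currentTarget, pointerData, ExecuteEvents.pointerClickHandler);
            dwellTimer = 0f;
        }
    }
}
```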

# Editor Camera Controller

Because device builds can be quite time-consuming, the Editor Camera Controller allows for quick testing within the Unity Editor itself and provides a keyboard shortcut for switching between the gaze pointer and the controller. The shortcut key is printed to the Editor console.

(Image: Editor Camera Controller script)
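A rough, hypothetical sketch of such an editor camera controller is shown below, assuming the new Input System package. The Tab key, the referenced pointer objects, and the class name are assumptions; the project's actual script and shortcut key may differ, so check the Editor console for the real key.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Hypothetical sketch of an in-editor camera controller: WASD to fly, hold the
// right mouse button to look around, and Tab to toggle between the gaze pointer
// and the pointer controller.
public class SimpleEditorCameraController : MonoBehaviour
{
    [SerializeField] private float moveSpeed = 2f;
    [SerializeField] private float lookSpeed = 0.2f;
    [SerializeField] private GameObject gazePointer;        // assumed reference
    [SerializeField] private GameObject pointerController;  // assumed reference

    private void Start()
    {
        // Mirror the described behaviour of printing the switch key to the console.
        Debug.Log("Editor camera: press Tab to switch between gaze pointer and controller.");
    }

    private void Update()
    {
        var keyboard = Keyboard.current;
        var mouse = Mouse.current;
        if (keyboard == null || mouse == null)
            return;

        // Toggle which pointer is active.
        if (keyboard.tabKey.wasPressedThisFrame && gazePointer != null && pointerController != null)
        {
            bool gazeActive = gazePointer.activeSelf;
            gazePointer.SetActive(!gazeActive);
            pointerController.SetActive(gazeActive);
        }

        // Simple fly-through movement.
        Vector3 move = Vector3.zero;
        if (keyboard.wKey.isPressed) move += transform.forward;
        if (keyboard.sKey.isPressed) move -= transform.forward;
        if (keyboard.dKey.isPressed) move += transform.right;
        if (keyboard.aKey.isPressed) move -= transform.right;
        transform.position += move * moveSpeed * Time.deltaTime;

        // Mouse look while the right button is held.
        if (mouse.rightButton.isPressed)
        {
            Vector2 delta = mouse.delta.ReadValue();
            transform.Rotate(Vector3.up, delta.x * lookSpeed, Space.World);
            transform.Rotate(Vector3.right, -delta.y * lookSpeed, Space.Self);
        }
    }
}
```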