# Scene Setup

> **WARNING**
>
> This guide assumes pre-existing knowledge of AR Foundation and OpenXR. For a more detailed overview, please visit the official AR Foundation and OpenXR Unity documentation.

A proposed sample scene hierarchy for running the project with AR Foundation and XR Interaction Toolkit support looks like this:

*Figure: Scene Setup hierarchy*

# AR Foundation

To enable positional head tracking, the following objects are required in the scene:

- AR Session
- AR Session Origin
  - AR Camera (tag this as "MainCamera")

To create these game objects manually, right-click in the Hierarchy panel and select XR > AR Session or XR > AR Session Origin.
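Before relying on head tracking, it can be useful to confirm that the device actually supports AR. The following is a minimal sketch using AR Foundation's session-state API; the component name and log messages are illustrative, not part of the samples:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative helper, not part of the samples: logs whether the
// device supports AR before the AR Session starts tracking.
public class ARSupportCheck : MonoBehaviour
{
    IEnumerator Start()
    {
        if (ARSession.state == ARSessionState.None ||
            ARSession.state == ARSessionState.CheckingAvailability)
        {
            // Asks the platform whether AR is supported on this device.
            yield return ARSession.CheckAvailability();
        }

        Debug.Log(ARSession.state == ARSessionState.Unsupported
            ? "AR is not supported on this device."
            : $"AR session state: {ARSession.state}");
    }
}
```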

# XR Interaction Manager

At least one XR Interaction Manager is required per scene to establish a connection between the Interactors and Interactables.

For actions to read input, they must first be enabled through the Input Action Manager. To add it manually, locate the Input Action Manager script and add the Input Action asset as an element; the asset is located in the samples path under Shared Assets/Input Actions.

*Figure: XR Interaction Manager setup*
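Under the hood, enabling the referenced actions amounts to calling Enable() on the asset. Below is a minimal sketch of that behavior, assuming Unity's Input System package; the component and field names are illustrative, not part of the samples:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Illustrative sketch of what the Input Action Manager does: enable all
// action maps in the referenced asset so their bindings start reporting input.
public class InputActionEnabler : MonoBehaviour
{
    // For example, the asset under Shared Assets/Input Actions.
    [SerializeField] InputActionAsset actionAsset;

    void OnEnable()
    {
        if (actionAsset != null)
            actionAsset.Enable();   // enables every action map in the asset
    }

    void OnDisable()
    {
        if (actionAsset != null)
            actionAsset.Disable();
    }
}
```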

# Gaze Pointer

The Gaze Pointer consists of the following game objects:

*Figure: Gaze Pointer hierarchy*

The GazeInteractor component is added to the root object. It manages raycasting and interaction with UI objects in the scene, as well as the timer duration and the "click" functionality of the pointer.

> **Gaze Pointer Limitation**
>
> Currently, the gaze pointer can only interact with UI objects in the scene, unlike the Pointer Controller, which can interact with both UI and 3D objects.
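For illustration, the dwell-based "click" behavior could be sketched as follows. This is not the actual GazeInteractor source: the class, its fields, and the physics-based raycast are assumptions (real UI interaction would typically go through a graphic raycaster), but it shows the timer-then-click pattern described above:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Illustrative dwell-based gaze pointer, not the actual GazeInteractor source.
public class GazeDwellExample : MonoBehaviour
{
    [SerializeField] float dwellSeconds = 2f;   // timer duration before a "click" fires
    [SerializeField] float maxDistance = 10f;

    GameObject _current;
    float _timer;

    void Update()
    {
        // Cast a ray from the camera/head forward; in this simplified variant
        // the UI elements are assumed to have colliders.
        if (Physics.Raycast(transform.position, transform.forward, out RaycastHit hit, maxDistance))
        {
            if (hit.collider.gameObject != _current)
            {
                // Gaze moved to a new object: restart the dwell timer.
                _current = hit.collider.gameObject;
                _timer = 0f;
            }

            _timer += Time.deltaTime;
            if (_timer >= dwellSeconds)
            {
                // Simulate a UI click once the dwell time has elapsed.
                ExecuteEvents.Execute(_current, new PointerEventData(EventSystem.current),
                    ExecuteEvents.pointerClickHandler);
                _timer = 0f;
            }
        }
        else
        {
            _current = null;
            _timer = 0f;
        }
    }
}
```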

# Controller Manager

The Controller Manager consists of the following game objects:

*Figure: Controller Manager hierarchy*

On the root game object, the XRControllerManager component receives the XRControllerProfile from the SampleController script. When the Host Controller is used, the DevicePointer game object is activated, containing the HostController mesh and input references. When a VR device with two controllers is used, the XRControllers game object is activated instead; it contains one game object each for the left and right controller, with a controller mesh and the controller-specific input references.

The SampleController sends the selected XRControllerProfile to the ControllerManager based on which InputDevice is being used. Please refer to the Unity documentation for the Microsoft Mixed Reality Motion Controller Profile or the Oculus Touch Controller Profile.
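For illustration, a profile switch like this could be driven by querying the connected XR input devices. This is a minimal sketch; the enum values mirror the profiles described above, but the class and method names are assumptions, not the actual sample scripts:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Illustrative enum and detector, not the actual sample scripts.
public enum XRControllerProfile { HostController, XRControllers }

public class ControllerProfileDetector : MonoBehaviour
{
    // Returns XRControllers when at least one hand-held controller is
    // connected, otherwise falls back to the host controller profile.
    public XRControllerProfile DetectProfile()
    {
        var devices = new List<InputDevice>();
        InputDevices.GetDevicesWithCharacteristics(
            InputDeviceCharacteristics.HeldInHand | InputDeviceCharacteristics.Controller,
            devices);

        return devices.Count > 0
            ? XRControllerProfile.XRControllers
            : XRControllerProfile.HostController;
    }
}
```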

# Input Cheat Sheet

Buttons used for input actions:

| Action | Host Controller | Right XR Controller | Left XR Controller |
| --- | --- | --- | --- |
| Select | Tap on Trackpad | Right Trigger Button | Left Trigger Button |
| Gaze/Pointer switch | Menu Button | None | Left Menu Button |
| Touchpad | Trackpad | Right Joystick | Left Joystick |
| Anchor Position Confirmation | Tap on Trackpad | Any Trigger Button | Any Trigger Button |
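For reference, these buttons can also be polled directly through Unity's XR input feature usages, independent of the Input Action asset. A minimal sketch, with an illustrative class name, that reads the right trigger (the "Select" action above):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Illustrative sketch: polls the right-hand trigger button ("Select" in the cheat sheet).
public class SelectButtonReader : MonoBehaviour
{
    void Update()
    {
        var devices = new List<InputDevice>();
        InputDevices.GetDevicesAtXRNode(XRNode.RightHand, devices);

        foreach (var device in devices)
        {
            // triggerButton maps to the trigger on most XR controllers.
            if (device.TryGetFeatureValue(CommonUsages.triggerButton, out bool pressed) && pressed)
                Debug.Log("Select (right trigger) pressed.");
        }
    }
}
```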

# Editor Camera Controller

Since device builds can be quite time consuming, the Editor Camera Controller allows for quick testing within the Unity editor itself and provides a keyboard shortcut to switch between the gaze pointer and the controller. The shortcut key is printed to the Editor console.

*Figure: Editor Camera Controller script*
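A rough sketch of what such an editor-only controller might look like is shown below. The class name, the chosen keys (Tab, WASD, right mouse button), and the log messages are assumptions for illustration, not the shipped script:

```csharp
using UnityEngine;

// Illustrative editor-only camera controller, not the shipped script:
// WASD moves the camera, holding the right mouse button looks around,
// and Tab (an assumed key) toggles between gaze pointer and controller.
public class EditorCameraControllerExample : MonoBehaviour
{
#if UNITY_EDITOR
    [SerializeField] float moveSpeed = 2f;
    [SerializeField] float lookSpeed = 90f;

    bool _usingGazePointer = true;

    void Start()
    {
        // Mirror the behavior described above: announce the switch key in the console.
        Debug.Log("Press Tab to switch between gaze pointer and controller.");
    }

    void Update()
    {
        // Simple fly-through movement for in-editor testing.
        var move = new Vector3(Input.GetAxis("Horizontal"), 0f, Input.GetAxis("Vertical"));
        transform.Translate(move * moveSpeed * Time.deltaTime, Space.Self);

        // Hold the right mouse button to look around.
        if (Input.GetMouseButton(1))
        {
            transform.Rotate(0f, Input.GetAxis("Mouse X") * lookSpeed * Time.deltaTime, 0f, Space.World);
            transform.Rotate(-Input.GetAxis("Mouse Y") * lookSpeed * Time.deltaTime, 0f, 0f, Space.Self);
        }

        // Toggle between the gaze pointer and the controller.
        if (Input.GetKeyDown(KeyCode.Tab))
        {
            _usingGazePointer = !_usingGazePointer;
            Debug.Log(_usingGazePointer ? "Switched to gaze pointer." : "Switched to controller.");
        }
    }
#endif
}
```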