# Interaction
## XR Interaction Manager
At least one XR Interaction Manager is required per scene to establish the connection between Interactors and Interactables.
Input Actions must be enabled through the Input Action Manager. To add them manually, locate the Input Action Manager script and add an Input Action asset as an element. These assets are located in the samples path under Shared Assets/Input Actions.
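The snippet below is a minimal sketch of that enabling step, assuming a single Input Action asset assigned in the Inspector; the class and field names are hypothetical, and the actual Input Action Manager component handles a list of assets rather than one.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Minimal sketch: enable an Input Action asset while this component is
// enabled, mirroring what the Input Action Manager does for each asset
// in its list. Class and field names are illustrative only.
public class EnableInputActionsSketch : MonoBehaviour
{
    // Assign one of the assets from Shared Assets/Input Actions here.
    [SerializeField] InputActionAsset actionAsset;

    void OnEnable() => actionAsset.Enable();

    void OnDisable() => actionAsset.Disable();
}
```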

## Gaze Pointer
The Gaze Pointer is composed of several game objects.

The GazeInteractor component is added to the root object. It manages raycasting and interaction with UI objects in the scene, as well as the pointer's timer duration and "click" functionality.
Gaze Pointer Limitation
Currently the gaze pointer can only interact with UI objects in the scene, unlike the Pointer Controller, which can interact with both UI and 3D objects.
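As an illustration of the dwell behavior described above, here is a hedged sketch of a gaze pointer that raycasts against the UI and fires a "click" once a timer elapses. The class, fields, and details are assumptions for this sketch, not the sample's actual GazeInteractor implementation; note that, matching the limitation above, it only hits UI elements registered with the EventSystem.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.EventSystems;

// Hypothetical sketch of a dwell-based gaze interactor: raycast from the
// center of the view through the EventSystem each frame, and "click" the
// UI element the user has looked at for longer than dwellSeconds.
public class DwellGazeInteractorSketch : MonoBehaviour
{
    [SerializeField] float dwellSeconds = 1.5f; // timer duration before a click fires

    GameObject currentTarget;
    float dwellTimer;

    void Update()
    {
        // Build pointer data for the center of the screen and raycast the UI.
        var eventData = new PointerEventData(EventSystem.current)
        {
            position = new Vector2(Screen.width / 2f, Screen.height / 2f)
        };
        var results = new List<RaycastResult>();
        EventSystem.current.RaycastAll(eventData, results);

        GameObject target = results.Count > 0 ? results[0].gameObject : null;

        if (target != currentTarget)
        {
            // Gaze moved to a new element: restart the dwell timer.
            currentTarget = target;
            dwellTimer = 0f;
            return;
        }

        if (currentTarget == null)
            return;

        dwellTimer += Time.deltaTime;
        if (dwellTimer >= dwellSeconds)
        {
            // Dwell completed: send a click to the focused UI element.
            ExecuteEvents.ExecuteHierarchy(currentTarget, eventData, ExecuteEvents.pointerClickHandler);
            dwellTimer = 0f;
        }
    }
}
```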
## Controller Manager
The Controller Manager is composed of several game objects.

On the root game object, the XRControllerManager component receives the XRControllerProfile from the SampleController script. If the Host Controller is being used, the DevicePointer game object is activated; it contains the HostController mesh and the input references. If a VR device with two controllers is being used, the XRControllers game object is activated instead; it contains one game object each for the left and right controller, with an Action Based Controller component referencing the corresponding controller prefab and its specific input references.
The SampleController sends the selected XRControllerProfile to the ControllerManager based on which InputDevice is being used. Refer to the Unity documentation for the Microsoft Mixed Reality Motion Controller Profile or the Oculus Touch Controller Profile.
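A minimal sketch of this profile switch might look as follows; the enum and member names are assumptions, not the sample's actual API, but they illustrate how the manager activates one pointer rig or the other.

```csharp
using UnityEngine;

// Hypothetical profile values; the sample's XRControllerProfile may differ.
public enum XRControllerProfile { HostController, XRControllers }

// Sketch of a manager that activates the rig matching the received profile.
public class XRControllerManagerSketch : MonoBehaviour
{
    [SerializeField] GameObject devicePointer;  // HostController mesh + input references
    [SerializeField] GameObject xrControllers;  // left/right Action Based Controller rigs

    // Called by the SampleController once the active InputDevice is known.
    public void SetControllerProfile(XRControllerProfile profile)
    {
        devicePointer.SetActive(profile == XRControllerProfile.HostController);
        xrControllers.SetActive(profile == XRControllerProfile.XRControllers);
    }
}
```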
## Input Cheat Sheet
Buttons used for input actions:

| Action | Host Controller | XR Controller Right | XR Controller Left |
| --- | --- | --- | --- |
| Select | Tap on Trackpad | Right Trigger Button | Left Trigger Button |
| Gaze/Pointer switch | Menu Button | None | Left Menu Button |
| Touchpad | Trackpad | Right Joystick | Left Joystick |
| Anchor Position Confirmation | Tap on Trackpad | Any Trigger Button | Any Trigger Button |
## Controller Haptics
Controller haptics are sent via the SampleController's SendHapticImpulse function to the XRControllerManager. When SendHapticImpulse is called, a haptic impulse is triggered on both the host controller and the XR Controllers on UI button presses and when scrolling.
WARNING
Haptic feedback is currently felt on both XR controllers, regardless of which one triggers an action.
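For reference, a haptic impulse can be sent with Unity's built-in XR input API as sketched below; the sample routes this through SampleController and XRControllerManager instead, so treat this only as an illustration of the underlying call. Sending to both hands, as shown, reproduces the behavior described in the warning.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Sketch of triggering a haptic impulse on both hands with UnityEngine.XR.
public static class HapticsSketch
{
    public static void SendImpulseToBothHands(float amplitude = 0.5f, float duration = 0.1f)
    {
        foreach (var node in new[] { XRNode.LeftHand, XRNode.RightHand })
        {
            var devices = new List<InputDevice>();
            InputDevices.GetDevicesAtXRNode(node, devices);

            foreach (var device in devices)
            {
                // Channel 0 is the default haptic channel on most controllers.
                if (device.TryGetHapticCapabilities(out var capabilities) && capabilities.supportsImpulse)
                    device.SendHapticImpulse(0u, amplitude, duration);
            }
        }
    }
}
```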
## Controller Animations for XR Controllers
Each XR Controller in XRControllers has a reference to its XR Controller prefab, which contains the controller mesh with the buttons' blend shapes.
Each prefab has an XRControllerInputAnimation.cs script attached. This script updates the blend shape weight values with the input values received from each button of the controller, creating the button animations.
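A hedged sketch of this pattern, for a single trigger button, might look like the following; the action binding, field names, and blend shape index are assumptions, not the sample's actual XRControllerInputAnimation.cs.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Sketch: read a button's value from the Input System each frame and map it
// onto the matching blend shape of the controller mesh.
public class ControllerButtonBlendShapeSketch : MonoBehaviour
{
    [SerializeField] SkinnedMeshRenderer controllerMesh;
    [SerializeField] InputActionProperty triggerAction;   // e.g. bound to the controller's trigger
    [SerializeField] int triggerBlendShapeIndex;          // index of the trigger blend shape on the mesh

    void OnEnable() => triggerAction.action.Enable();

    void OnDisable() => triggerAction.action.Disable();

    void Update()
    {
        // Trigger value is 0..1; blend shape weights are 0..100.
        float value = triggerAction.action.ReadValue<float>();
        controllerMesh.SetBlendShapeWeight(triggerBlendShapeIndex, value * 100f);
    }
}
```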
## Editor Camera Controller
Because building to a device can be time-consuming, the Editor Camera Controller allows quick testing inside the Unity Editor itself and provides a keyboard shortcut to switch between the gaze pointer and the controller. The switch key is printed to the Editor console.
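As a rough illustration only, a minimal editor camera controller of this kind could look like the sketch below; the key binding, movement scheme, and names are assumptions rather than the sample's actual implementation.

```csharp
using UnityEngine;

// Sketch of an in-Editor test camera: hold the right mouse button to look
// around, use WASD to move, and press the switch key (logged to the console)
// to toggle between the gaze pointer and the controller.
public class EditorCameraControllerSketch : MonoBehaviour
{
    [SerializeField] KeyCode switchKey = KeyCode.Tab; // assumed binding
    [SerializeField] float moveSpeed = 2f;
    [SerializeField] float lookSpeed = 120f;

    bool useGazePointer;

    void Start()
    {
        // Print the switch key to the console, as the sample does.
        Debug.Log($"Press {switchKey} to switch between gaze pointer and controller.");
    }

    void Update()
    {
        if (Input.GetKeyDown(switchKey))
        {
            useGazePointer = !useGazePointer;
            Debug.Log(useGazePointer ? "Gaze pointer active." : "Controller active.");
        }

        // Hold right mouse button to look around.
        if (Input.GetMouseButton(1))
        {
            transform.Rotate(Vector3.up, Input.GetAxis("Mouse X") * lookSpeed * Time.deltaTime, Space.World);
            transform.Rotate(Vector3.right, -Input.GetAxis("Mouse Y") * lookSpeed * Time.deltaTime, Space.Self);
        }

        // WASD / arrow-key movement relative to the camera's orientation.
        var move = new Vector3(Input.GetAxis("Horizontal"), 0f, Input.GetAxis("Vertical"));
        transform.Translate(move * moveSpeed * Time.deltaTime, Space.Self);
    }
}
```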
