
Interaction

Gaze Controller

Extending the user's view direction provides a simple way to trigger input.

A typical approach for implementing a gaze pointer is to attach the pointer to the camera so that it is always rendered at a fixed distance in front of the user. This can be complicated by the rendering order of other objects or effects in the world. A gaze pointer rendered this way can also lag slightly behind the user's movements and may appear slightly blurred around the edges.
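As a concrete illustration of this "attach to camera" approach, consider the following minimal Unity sketch. The class and field names (GazePointer, pointerDistance) are illustrative only and are not part of the Snapdragon Spaces API.

```csharp
using UnityEngine;

// Minimal sketch of the "attach to camera" gaze pointer described above.
public class GazePointer : MonoBehaviour
{
    [SerializeField] private Transform cameraTransform;      // e.g. the XR camera
    [SerializeField] private float pointerDistance = 2.0f;   // fixed distance in meters

    private void LateUpdate()
    {
        // Re-position the pointer in front of the user every frame.
        // Doing this in LateUpdate (after head tracking has updated the
        // camera) reduces, but does not eliminate, the lag mentioned above.
        transform.position = cameraTransform.position
                           + cameraTransform.forward * pointerDistance;
        transform.rotation = Quaternion.LookRotation(cameraTransform.forward);
    }
}
```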

By using Composition Layers, it is possible to render the gaze pointer with greater stability and visual fidelity, at the cost of some performance. The pointer can simply be rendered after everything else in the world and is then composited into the final rendered image just before it is submitted to the screen, so the pointer has crisp, clear edges and the image remains stable.

Companion Controller

FOR AR DEVICES ONLY

The Companion Controller (also known as the Host Controller) is a software controller that uses the smartphone as a 3DoF controller for input. The controller includes on-screen input that can be assigned to actions. The re-center button, however, does not report anything to the engine side; it simply resets the pose delivered to the application.
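To illustrate what a 3DoF pose looks like on the engine side, here is a hypothetical Unity sketch that reads the device rotation and pins the position to a fixed offset. The XRNode used here is an assumption; the Snapdragon Spaces plugin may expose the Companion Controller through a different node or its own components.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Hypothetical sketch: driving a controller transform from a 3DoF device.
// A 3DoF controller reports rotation only, so the position stays fixed.
public class ThreeDofControllerPose : MonoBehaviour
{
    [SerializeField] private Vector3 restingPosition = new Vector3(0.2f, -0.3f, 0.4f);

    private void Update()
    {
        InputDevice device = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (device.isValid &&
            device.TryGetFeatureValue(CommonUsages.deviceRotation, out Quaternion rotation))
        {
            // Rotation comes from the phone's sensors; position is pinned.
            transform.SetPositionAndRotation(restingPosition, rotation);
        }
    }
}
```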

XR Controllers

FOR VR/MR DEVICES ONLY

The Snapdragon Spaces plugins include a set of generic XR controllers that are used to visualize connected devices for VR headsets. Whether the XR controllers or the device pointer is visualized is controlled by a managing component on each engine side.

In Unity, the switch is managed by the Controller Manager component; in Unreal Engine, it is handled by the Pointer Controller blueprint.
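As a rough sketch of the kind of switching such a managing component performs, the following hypothetical Unity snippet toggles between controller visuals and a device pointer based on whether tracked controllers are present. It is not the actual Controller Manager implementation.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Illustrative switching logic: show XR controller models when tracked
// controllers are connected, otherwise fall back to the device pointer.
public class ControllerVisualizationSwitch : MonoBehaviour
{
    [SerializeField] private GameObject xrControllerVisuals;
    [SerializeField] private GameObject devicePointerVisuals;

    private void Update()
    {
        InputDevice right = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        InputDevice left = InputDevices.GetDeviceAtXRNode(XRNode.LeftHand);
        bool controllersConnected = right.isValid || left.isValid;

        xrControllerVisuals.SetActive(controllersConnected);
        devicePointerVisuals.SetActive(!controllersConnected);
    }
}
```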

In Unity's case, the controller implementation uses blend shapes to provide accurate visual feedback on the state of each button. Haptic feedback is also supported in Unity.
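The following Unity sketch illustrates the blend shape idea: the analog trigger value drives a blend shape weight so the visual trigger follows the physical one, and a short haptic pulse fires on a full press. The blend shape index is an assumption about the controller mesh, not a documented value.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch of blend-shape button feedback plus a haptic pulse on full press.
public class TriggerVisualFeedback : MonoBehaviour
{
    [SerializeField] private SkinnedMeshRenderer controllerMesh;
    [SerializeField] private int triggerBlendShapeIndex = 0; // assumed index

    private bool wasPressed;

    private void Update()
    {
        InputDevice device = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (!device.isValid) return;

        if (device.TryGetFeatureValue(CommonUsages.trigger, out float triggerValue))
        {
            // Blend shape weights run from 0 to 100.
            controllerMesh.SetBlendShapeWeight(triggerBlendShapeIndex, triggerValue * 100f);

            bool isPressed = triggerValue > 0.9f;
            if (isPressed && !wasPressed)
            {
                // Short haptic pulse (channel, amplitude 0..1, duration in seconds).
                device.SendHapticImpulse(0u, 0.5f, 0.1f);
            }
            wasPressed = isPressed;
        }
    }
}
```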

Hand Tracking

Hand tracking offers more natural input options: touching or grabbing UI elements directly (with or without virtual force feedback), steering a ray pointer to interact with UI at a distance, or responding to specific gesture commands. Custom-defined hand gestures are currently not supported.
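As a minimal illustration of gesture-style input, the sketch below approximates a pinch by checking the distance between the thumb and index fingertips. It assumes hand data is surfaced through Unity's XR Hands package (com.unity.xr.hands); the Snapdragon Spaces plugin may expose hand data through its own components instead.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Sketch: detect a pinch as thumb tip and index tip nearly touching.
public class PinchDetector : MonoBehaviour
{
    [SerializeField] private float pinchThreshold = 0.02f; // meters

    private XRHandSubsystem handSubsystem;

    private void Update()
    {
        if (handSubsystem == null)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count == 0) return;
            handSubsystem = subsystems[0];
        }

        XRHand hand = handSubsystem.rightHand;
        if (!hand.isTracked) return;

        if (hand.GetJoint(XRHandJointID.ThumbTip).TryGetPose(out Pose thumb) &&
            hand.GetJoint(XRHandJointID.IndexTip).TryGetPose(out Pose index) &&
            Vector3.Distance(thumb.position, index.position) < pinchThreshold)
        {
            Debug.Log("Pinch detected");
        }
    }
}
```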

For more details on hand tracking, including supported gestures and best practices, please refer to the Hand Tracking Design & User Experience guide.

Please refer to the sample documentation on how to integrate Hand Tracking: