# Controller blueprints
Each controller is implemented as a component, so that it can easily be extended or replaced in the actors.
The parent component class BP_ControllerComponent (located under Content > SnapdragonSpaces > Common > Core > Components) holds access to the gaze and pointer controllers. This blueprint also establishes the basis for managing the interaction and exposes some options that the developer can customize in all derived controllers.
- Auto Activate: Indicates whether the controller is enabled by default.
- Delay Start: Time that must elapse before interaction is enabled when the app starts or loads a new map.
- Tag Component: Name of the tag on the parent component (from the pawn character), used for later reference from the controller component implementation.
Functions to consider when interacting with the controller components, or when creating a custom child controller component, are:
- Start: This event should be called only once, to initialize the controller, and does not support being overridden. Preferably call it from Begin Play.
- Start_Implementation: This function contains the particular initialization of each child component.
- Set Default Controller: Configures whether the component should be activated after initialization. For this reason, it is only useful if called before the Start event.
- Enable: Enables the component.
- Disable: Disables the component.
- Is Enable: Returns whether the component is enabled.
- Press/Release Button: Manages the button interaction using the Widget Interaction Component.
- Is Over Interactable Widget: Returns whether the Widget Interaction Component is pointing at a widget.
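The lifecycle described above can be sketched in plain C++. This is not the actual Unreal implementation (which is a Blueprint); the class and function names simply mirror the Blueprint functions, and the Delay Start option is modeled with an explicit Tick for illustration:

```cpp
#include <string>
#include <utility>

// Plain C++ sketch of the BP_ControllerComponent lifecycle (hypothetical,
// for illustration only; the real implementation is a Blueprint).
class ControllerComponent {
public:
    ControllerComponent(bool bAutoActivate, float DelayStart, std::string TagComponent)
        : bDefaultController(bAutoActivate),
          DelaySeconds(DelayStart),
          Tag(std::move(TagComponent)) {}
    virtual ~ControllerComponent() = default;

    // Called only once; not meant to be overridden. Child-specific setup
    // goes into StartImplementation (mirrors Start_Implementation).
    void Start() {
        if (bStarted) return;
        bStarted = true;
        StartImplementation();
        if (bDefaultController && DelaySeconds <= 0.0f) bEnabled = true;
    }

    // Only meaningful before Start() runs (mirrors Set Default Controller).
    void SetDefaultController(bool bValue) {
        if (!bStarted) bDefaultController = bValue;
    }

    // Simulates time passing so the Delay Start option can take effect.
    void Tick(float DeltaSeconds) {
        if (!bStarted || bEnabled || !bDefaultController) return;
        Elapsed += DeltaSeconds;
        if (Elapsed >= DelaySeconds) bEnabled = true;
    }

    void Enable()  { bEnabled = true; }
    void Disable() { bEnabled = false; }
    bool IsEnabled() const { return bEnabled; }

protected:
    // Overridden by child controllers (gaze, pointer, hand tracking).
    virtual void StartImplementation() {}

private:
    bool bStarted = false;
    bool bEnabled = false;
    bool bDefaultController = false;
    float DelaySeconds = 0.0f;
    float Elapsed = 0.0f;
    std::string Tag;
};
```

A derived controller would override only `StartImplementation`, keeping the enable/disable and delay handling in the parent, which matches the extension model the blueprint is designed around.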
The key element of the controllers for an XR project is the Widget Interaction Component. All components and derivatives should only establish the interaction rules appropriate to the type of controller. For basic information about this component, please refer to the Unreal documentation.
# Gaze Controller
The gaze controller BP_GazeControllerComponent (located under Content > SnapdragonSpaces > Common > Core > Components) manages raycasting and interactions with widget actors in the scene, as well as the delayed interaction functionality and the "click" functionality of the pointer.
Gaze Pointer Limitation
Currently, the gaze pointer can only interact with widget actors in the scene, unlike the Pointer Controller, which can interact with both widgets and 3D actors.
The following options are customizable by the developer:
- Move Reticle Hit: If this boolean is enabled and the user looks at an interactive actor, the reticle moves to the interactive position of the actor.
- Timer Duration: How many seconds the user has to look at something to select it.
- Default Distance: When bMoveReticleToHit is active, this sets the distance at which the reticle is drawn.
- Vertical Bias: The vertical position of the reticle considering 0 the center of the screen, 1 the top of the screen, and -1 the bottom of the screen.
- Reticle Outer Ring: Material for the outer ring of the reticle; if it is NULL, the outer ring is assumed to be disabled.
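The dwell selection controlled by the Timer Duration option can be sketched in plain C++. This `GazeDwellTimer` helper is hypothetical (the real logic lives in the Blueprint), but it shows the core idea: looking at the same widget long enough fires a "click", and looking away resets the timer.

```cpp
#include <string>

// Hypothetical plain C++ sketch of gaze dwell selection (Timer Duration).
// Not the actual Blueprint code.
class GazeDwellTimer {
public:
    explicit GazeDwellTimer(float TimerDuration) : Duration(TimerDuration) {}

    // Call every frame with an id for the widget currently under the gaze
    // (empty string = nothing hit). Returns true on the frame the dwell
    // completes and a select should be fired.
    bool Tick(const std::string& HitWidget, float DeltaSeconds) {
        if (HitWidget.empty() || HitWidget != CurrentWidget) {
            CurrentWidget = HitWidget;   // target changed: restart the timer
            Elapsed = 0.0f;
            return false;
        }
        Elapsed += DeltaSeconds;
        if (Elapsed >= Duration) {
            Elapsed = 0.0f;              // fire once, then re-arm
            return true;
        }
        return false;
    }

    // Fraction in [0, 1]; a value like this could drive the outer ring's
    // "Percentage" material parameter to animate the reticle fill.
    float Progress() const {
        if (Duration <= 0.0f) return 1.0f;
        return Elapsed < Duration ? Elapsed / Duration : 1.0f;
    }

private:
    std::string CurrentWidget;
    float Duration = 0.0f;
    float Elapsed = 0.0f;
};
```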
Finally, the reticle, which the samples use to point into the world, is composed of two planes as Static Mesh Components: one with a static material for the inner ring (MI_Reticle_Inner) and the other with a dynamic material for the outer ring (MI_Reticle_Outer). The dynamic material defines a parameter (Percentage) to complete the visual effect of the reticle interaction. In addition, the Update Reticle Position function (in the component blueprint) uses the outer texture as the parent of the components related to the reticle, and uses it to move them to the interaction hits. The component distinguishes each texture using Component Tags (gaze, gaze_outerring, gaze_parent).
# Pointer Controller
The main behavior of the pointer controller is implemented in BP_PointerControllerComponent (located under Content > SnapdragonSpaces > Common > Core > Components). The basics of how it works are common to the other controllers, but the initialization also manages whether it is being used on a device that supports 3DoF or 6DoF. In the Motion Controller Component, controllers using the Microsoft interaction profile are automatically enabled as 3DoF controllers, and controllers using the Oculus interaction profile are automatically enabled as 6DoF controllers.
Please note that the selected hand must be Left when using 3DoF. For 6DoF, the selected hand can be Left or Right, depending on which hand the controller is designed for. In addition, the developer can also select which controller handles the interaction.
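The profile-based initialization rule can be sketched as a small mapping in plain C++. The function name and profile path strings below are illustrative assumptions, not part of the plugin's API:

```cpp
#include <string>

// Hypothetical sketch of the rule described above: the OpenXR interaction
// profile decides whether the pointer controller runs in 3DoF or 6DoF mode.
enum class ControllerDoF { ThreeDoF, SixDoF, Unknown };

ControllerDoF DofFromInteractionProfile(const std::string& Profile) {
    // Microsoft interaction profile -> 3DoF, Oculus -> 6DoF.
    if (Profile.find("microsoft") != std::string::npos) return ControllerDoF::ThreeDoF;
    if (Profile.find("oculus") != std::string::npos) return ControllerDoF::SixDoF;
    return ControllerDoF::Unknown;
}
```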
The Spawn Controller function spawns a child of the BP_PointerController class actor set in the Motion Controller Class variable. The BP_PointerController actor (located under Content > SnapdragonSpaces > Common > Placeable) is ready to manage each individual controller.
A Motion Controller Component is added as a child of the root component to manage each controller. The Visualization section of this component allows rendering a 3D model in the virtual world that follows the controller movement in the real world; this only requires enabling the Display Device Model option. If the developer wants to show the default mesh of each device depending on the profile, they have to select OpenXR in the Display Model Source option. Otherwise, the developer can select Custom and add the desired mesh in Custom Display Mesh. Finally, in the Motion Controller section, the developer should assign the type of input.
Only if the interaction is enabled, the Motion Controller Component must have attached: a Widget Interaction Component, to properly track the real-world movement of the controller, and a Static Mesh Component (called Laser Pointer Mesh in the sample) to visualize the pointer. Independently of interaction, if a 6DoF device is being used, the parent component needs to attach a Child Actor Component using BP_XRControllerRepresentation (located under Content > SnapdragonSpaces > Common > Placeable) to display the controller and its animations.
# Hand Tracking Controller
The hand tracking controller BP_HandTrackingControllerComponent (located under Content > SnapdragonSpaces > Common > Core > Components) is in charge of spawning and enabling the actors needed to use hand tracking distal interaction. Those actors handle the interaction automatically. This controller is only active if the Hand Tracking feature, located under Project Settings > Snapdragon Spaces plugin, is enabled. The controller allows interaction with both widget actors in the scene and actors that can be interacted with through hand tracking. More information about how hand tracking interaction works can be found in the Hand Tracking Sample.
Hand Tracking Controller Limitation
Currently, hand tracking interaction is not available in the Unreal Engine editor.
# Input Cheat Sheet
Buttons used for input actions:
| Input Action | Host Controller | Right XR Controller | Left XR Controller |
|---|---|---|---|
| Select | Tap on Trackpad | Right Trigger Button | Left Trigger Button |
| Gaze/Pointer switch | Menu Button | None | Left Menu Button |
| Anchor Position Confirmation | Tap on Trackpad | Any Trigger Button | Any Trigger Button |
The 3D Widgets located in the world must be created using the blueprint BP_3DWidget (located under Content > Snapdragon > Common > UI), with the UI defined in its Widget Component. In order to work with the gaze or pointer controller, a widget must be a child class of this class.