# Hand Tracking Sample

This sample demonstrates how to enable the Hand Tracking feature and use the data returned by the SpacesHandTracking extension. To use this feature, it has to be enabled in the OpenXR plugin settings located under Project Settings > Snapdragon Spaces plugin.

WARNING

Make sure the OpenXRHandTracking plugin is disabled so that the standard Hand Tracking does not override the Spaces Hand Tracking.

Disable OpenXRHandTrackingPlugin

# How the sample works

When the sample is opened, Hand Tracking starts as soon as the user's hands enter the field of view of the headset cameras. By default, the user sees a mesh representing each hand; if hand meshing is deactivated, spheres are placed on each joint of the hands instead. The scene mirror also shows a reflection of the player and the player's hands, and the UI displays which gesture each hand is currently making.

The following image shows the hand mesh and the hand joints represented by spheres.

Hand mesh and joints

# Hand Tracking Manager

The sample uses the BP_HandTrackingManager blueprint asset (located under Content > SnapdragonSpaces > Samples > HandTracking > Placeable), which centralizes retrieving the hand data and rendering the hands.

To enable and disable Hand Tracking, the Set Hand Tracking State method must be used. We call it in the Event Begin Play of the hand tracking manager to activate it, and in the Event End Play to deactivate it. This is done to prevent hand tracking from running in the background of the application.

To enable and disable Hand Meshing, we use Set Hand Mesh Status. The current state of this feature can be queried with Get Hand Mesh Status, which is used to manage the visibility of the hand meshes.
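A C++ sketch of this enable/disable flow follows. The class name USpacesHandTrackingFunctionLibrary, the AHandTrackingManager actor, and the exact signatures of Set Hand Tracking State, Set Hand Mesh Status and Get Hand Mesh Status are assumptions modeled on the Blueprint node names; only the overall flow mirrors the sample.

```cpp
// HYPOTHETICAL function library and signatures, modeled on the Blueprint nodes.
void AHandTrackingManager::BeginPlay()
{
    Super::BeginPlay();

    // Start hand tracking when the manager is spawned.
    USpacesHandTrackingFunctionLibrary::SetHandTrackingState(true);

    // Enable hand meshing and cache its state to drive mesh/joint visibility.
    USpacesHandTrackingFunctionLibrary::SetHandMeshStatus(true);
    bHandMeshEnabled = USpacesHandTrackingFunctionLibrary::GetHandMeshStatus();
}

void AHandTrackingManager::EndPlay(const EEndPlayReason::Type EndPlayReason)
{
    // Stop hand tracking so it does not keep running in the background.
    USpacesHandTrackingFunctionLibrary::SetHandTrackingState(false);
    Super::EndPlay(EndPlayReason);
}
```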

The two hands are represented as Motion Controllers, and their data can be obtained via Get Motion Controller Data. After getting the Motion Controller Data, we check that the hands are visible and then render their virtual representation, either as hand meshes or as hand joints.
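In C++, the equivalent query goes through UHeadMountedDisplayFunctionLibrary::GetMotionControllerData, which fills an FXRMotionControllerData struct containing the per-joint hand transforms. A minimal sketch follows; the surrounding AHandTrackingManager actor and its QueryHand method are illustrative names, not part of the sample.

```cpp
#include "HeadMountedDisplayFunctionLibrary.h"
#include "InputCoreTypes.h"

// Query the tracking data for one hand. bValid tells us whether the hand is
// currently tracked, and HandKeyPositions/HandKeyRotations contain the
// per-joint transforms used to render the virtual hand.
void AHandTrackingManager::QueryHand(EControllerHand Hand)
{
    FXRMotionControllerData MotionControllerData;
    UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(this, Hand, MotionControllerData);

    if (MotionControllerData.bValid && MotionControllerData.HandKeyPositions.Num() > 0)
    {
        // Render the hand, either as a mesh or as joint spheres (see below).
    }
}
```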

If Hand Meshing is disabled, Draw Joints is used to display the hand joints. We extract the corresponding information from Motion Controller Data, specifically the location and rotation of each hand joint, to correctly place the representation of each joint. For this representation, we use the actor BP_HandJointRepresentation.
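When joints are drawn instead of meshes, the per-joint positions and radii from FXRMotionControllerData can be used directly. The sketch below uses engine debug spheres instead of the sample's BP_HandJointRepresentation actor, purely to keep the placement logic self-contained.

```cpp
#include "HeadMountedDisplayTypes.h"
#include "DrawDebugHelpers.h"

// Place a sphere on every tracked hand joint. The sample positions
// BP_HandJointRepresentation actors instead; debug spheres are used here
// only to illustrate how the joint locations and radii are consumed.
void AHandTrackingManager::DrawJoints(const FXRMotionControllerData& MotionControllerData)
{
    for (int32 JointIndex = 0; JointIndex < MotionControllerData.HandKeyPositions.Num(); ++JointIndex)
    {
        const FVector JointLocation = MotionControllerData.HandKeyPositions[JointIndex];
        const float JointRadius = MotionControllerData.HandKeyRadii.IsValidIndex(JointIndex)
            ? MotionControllerData.HandKeyRadii[JointIndex]
            : 1.0f;

        DrawDebugSphere(GetWorld(), JointLocation, JointRadius, 8, FColor::Cyan, false, 0.0f);
    }
}
```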

Hand tracking manager

To render the hand meshes, the Render Hand Mesh method from the HGestures Blueprint Library is used. It takes the hand to render and the material to use, and it also requires passing, by reference, the actor that represents the hand mesh and a variable holding the original number of vertices.
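In C++ such a call might look roughly like the following. The UHGesturesBlueprintLibrary class name and the parameter layout are assumptions inferred from the Blueprint node described above (hand, material, plus the hand mesh actor and vertex count passed by reference), not the plugin's confirmed API.

```cpp
// HYPOTHETICAL signature, inferred from the Blueprint node described above.
// HandMeshActor and VertexCount are passed by reference so the same actor is
// reused and updated every frame instead of a new one being created.
UHGesturesBlueprintLibrary::RenderHandMesh(
    /*Hand*/          EControllerHand::Left,
    /*Material*/      HandMeshMaterial,
    /*HandMeshActor*/ LeftHandMeshActor,   // by reference: updated, not recreated
    /*VertexCount*/   LeftHandVertexCount  // by reference: original vertex count
);
```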

WARNING

In Blueprint nodes, Unreal Engine displays by-reference parameters as return values. It is mandatory to wire in the reference to the actor that represents the hand mesh as well as the variable holding the number of vertices; otherwise, multiple actors will be created.

Hand tracking manager

TIP

For a quick test, the hands can be rendered with the XRVisualization plugin by connecting the Motion Controller Data to the Render Motion Controller function.
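In C++ this quick test could look roughly like the sketch below, assuming the XRVisualization plugin is enabled in the project; the exact C++ signature of Render Motion Controller may differ between engine versions, and the DebugRenderHands method is an illustrative name.

```cpp
#include "HeadMountedDisplayFunctionLibrary.h"
#include "XRVisualizationFunctionLibrary.h"

// Quick-and-dirty hand visualization: feed the motion controller data
// straight into the XRVisualization plugin's debug renderer.
void AHandTrackingManager::DebugRenderHands()
{
    FXRMotionControllerData LeftHandData;
    UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(this, EControllerHand::Left, LeftHandData);
    UXRVisualizationFunctionLibrary::RenderMotionController(LeftHandData, /*bRight=*/ false);

    FXRMotionControllerData RightHandData;
    UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(this, EControllerHand::Right, RightHandData);
    UXRVisualizationFunctionLibrary::RenderMotionController(RightHandData, /*bRight=*/ true);
}
```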

# Mirror and Mirror Pawn

BP_Mirror and BP_Mirror_Pawn actors (located under Content > SnapdragonSpaces > Samples > HandTracking > Placeable) are used to create a virtual reflection representing the position and rotation of the player and the player's hands. The Mirror Pawn actor is in charge of positioning the mirrored representation of the player's body and hand joints, as can be seen in the following image.
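The mirroring itself boils down to reflecting each tracked position across the mirror plane. The sketch below shows only that point reflection, assuming the mirror actor's location lies on the plane and its forward vector is the plane normal; it illustrates the math rather than the sample's exact Blueprint logic.

```cpp
// Reflect a world-space point across the mirror plane defined by a point on
// the plane (e.g. the mirror's location) and the plane normal (e.g. the
// mirror's forward vector). Illustrative sketch, not the sample's Blueprint.
FVector MirrorPointAcrossPlane(const FVector& Point, const FVector& PlanePoint, const FVector& PlaneNormal)
{
    // Step back twice the signed distance to the plane along the normal
    // to land on the mirrored side.
    return Point - 2.0f * FVector::DotProduct(Point - PlanePoint, PlaneNormal) * PlaneNormal;
}
```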

Mirror

# Gesture Status

The user interface of the sample, WBP_HandTrackingSample (located under Content > SnapdragonSpaces > Samples > HandTracking > UI), is used to switch between Hand Meshing and Hand Joints visualization and to display the hand gestures recognized in each frame. For this, the functions of the HGestures Blueprint Library class are used: every frame we check whether both hands are being tracked and which gestures they are making, via the Is Hand Tracked and Get XRHand Gesture Data methods (see the sketch after the parameter list below).

Getting hand gestures

The gesture data is composed of the following parameters:

  • Type: Enum value indicating which gesture was detected for the hand. It is one of: { UNKNOWN, OPEN_HAND, GRAB, PINCH, POINT, VICTORY, METAL, ERROR }.
  • GestureRatio: Float value between 0 and 1, indicating how strongly the gesture is applied.
  • FlipRatio: Float value between -1 and 1, indicating whether the hand gesture is detected from the back (-1), from the front (1), or in between.
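As a rough C++ illustration of how these fields might be consumed each frame: the type and function names below (FSpacesHandGestureData, ESpacesHandGestureType, IsHandTracked, GetXRHandGestureData, UHandTrackingWidget) are assumptions modeled on the Blueprint nodes and the parameter list above, not confirmed plugin API.

```cpp
// HYPOTHETICAL types and calls, modeled on the Blueprint nodes and the
// gesture parameters listed above.
struct FSpacesHandGestureData
{
    ESpacesHandGestureType Type; // UNKNOWN, OPEN_HAND, GRAB, PINCH, POINT, VICTORY, METAL, ERROR
    float GestureRatio;          // 0..1, how strongly the gesture is applied
    float FlipRatio;             // -1 (back) .. 1 (front)
};

void UHandTrackingWidget::UpdateGestureLabels()
{
    // Only query gesture data while the hand is actually tracked.
    if (UHGesturesBlueprintLibrary::IsHandTracked(EControllerHand::Left))
    {
        const FSpacesHandGestureData Gesture =
            UHGesturesBlueprintLibrary::GetXRHandGestureData(EControllerHand::Left);
        // Update the UI with Gesture.Type, Gesture.GestureRatio and Gesture.FlipRatio.
    }
}
```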

Here is the result with the UI showing the gesture type and ratios detected for the left hand.

Pinch gesture