# Extended Hand Tracking Sample

This sample scene demonstrates a variety of interactions (proximal, distal, one-handed, two-handed) in a demo context, using the Hand Tracking feature. For more information about how Hand Tracking works, please refer to the Hand Tracking Integration section.

# Build the sample scenes

First, make sure Hand Tracking is enabled in the OpenXR project settings.

Then, import the samples from the QCHT Unity Interactions package.

Now, add an AR Session and an AR Session Origin to the QCHT Sample - Menu scene, as explained here. On the AR Session Origin, add the Spaces Hand Manager component to enable Hand Tracking. For more information about Hand Tracking scene setup, please refer to this page.

Spaces Hand Manager

Once that is done, you will find all the scenes in Assets > Samples > QCHT Unity Interactions > 3.2.10 > QCHT Interactions Samples.

To try those samples in editor mode, open the QCHT Sample - Menu scene, available in Assets > Samples > QCHT Unity Interactions > 3.2.10 > QCHT Interactions Samples > Menu > Scenes. To interact in editor simulation mode, please refer to this page.

Important

Before testing Hand Tracking samples, it is necessary to add all scenes to the Build Settings.

# Sample description

This sample is split into four scenes, plus the QCHT Sample - Menu main scene, which combines all of them.

The main menu allows you to switch between scenes, interactions, and hand avatars. In an AR context, we recommend not displaying an avatar.

# Proximal

# Simple interaction

To interact proximally with a 3D object without snapping (like the blue cube), import a QCHTAvatar into your scene with the Simple interaction type. Then, add a Box Collider (set as a trigger) and an Interactable script to your game object.
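The component setup above can be sketched as a small Unity script. This is only an illustration: the `Interactable` script comes from the QCHT samples, and its exact type name and namespace depend on your package version, so it is left as a commented placeholder here.

```csharp
using UnityEngine;
// The QCHT Interactable script's namespace depends on your package version.

public class ProximalCubeSetup : MonoBehaviour
{
    void Awake()
    {
        // A trigger collider lets the hand avatar overlap the object
        // without physically colliding with it.
        var box = gameObject.AddComponent<BoxCollider>();
        box.isTrigger = true;

        // Add the QCHT Interactable script here; its exact type name and
        // namespace may differ in your package version:
        // gameObject.AddComponent<Interactable>();
    }
}
```

In practice, you would normally add these components in the Inspector rather than at runtime; the script only mirrors the steps described above.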

# Snapping

When the user interacts with the pink cube (by doing a pinch or a grab, for example), the hand avatar snaps onto it, as preset in the Hand Pose Editor.

Proximal interaction with snapping requires a QCHT Avatar with the VFF interaction type. The snappable object relies mainly on the GrabPoint system and the HandPose generator.

# Distal

This teapot becomes interactive when the user targets it with the raycast coming from their hand. By pinching, the user can move, rotate, and resize it thanks to the Control Box component.

# UI Elements

# Virtual Force Feedback

The buzzer lets the user press its button. When the button is pressed, the virtual hand reacts to the object's physics.

The QCHT Avatar prefab allows natural interaction with proximal elements by reacting to the physics system. The physics object needs a Collider (non-trigger), a Rigidbody, and an Interactable component.
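The three requirements above can be sketched as follows. This is a hedged illustration: the `Interactable` component is QCHT-specific and is left as a commented placeholder, and disabling gravity on the buzzer is an assumption, not something the sample mandates.

```csharp
using UnityEngine;
// The QCHT Interactable script's namespace depends on your package version.

public class BuzzerSetup : MonoBehaviour
{
    void Awake()
    {
        // Non-trigger collider so the virtual hand physically collides
        // with the object instead of passing through it.
        var col = gameObject.AddComponent<BoxCollider>();
        col.isTrigger = false;

        // Rigidbody so the object participates in the physics simulation.
        var rb = gameObject.AddComponent<Rigidbody>();
        rb.useGravity = false; // assumption: a button-like object should not fall

        // Add the QCHT Interactable component here; its exact type name and
        // namespace may differ in your package version:
        // gameObject.AddComponent<Interactable>();
    }
}
```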

# Distal elements

For UI elements, manipulation can be performed with the raycast system. There are many elements to interact with, such as radio buttons, checkboxes, sliders, scroll views, and buttons. To manipulate an element, target it and pinch to select it.

UI elements receive these events and translate them into standard Unity events, linking the user's action to the system's reaction. Raycasted elements respond to all Unity callbacks. Please refer to the Distal Interaction section to learn more about the raycast system.

# Draw

To enable drawing, import a QCHTAvatar into your scene and refer to the Drawing Manager (for the game logic), available in Assets > Samples > QCHT Unity Interactions > 3.2.10 > QCHT Interactions Samples > Draw > Scenes.

The Pinch gesture starts the drawing action and the Open-Hand gesture stops it.
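The gesture logic above can be sketched as a small state machine. The `IsPinching()` and `IsOpenHand()` queries below are hypothetical placeholders: the real QCHT package may expose gestures through events or an enum instead, so consult the Drawing Manager in the sample for the actual API.

```csharp
using UnityEngine;

public class DrawController : MonoBehaviour
{
    bool isDrawing;

    // Hypothetical gesture queries: replace these with the QCHT gesture API
    // (the package may expose gestures via events or an enum instead).
    bool IsPinching() => false;
    bool IsOpenHand() => false;

    void Update()
    {
        if (!isDrawing && IsPinching())
        {
            isDrawing = true;   // Pinch starts the drawing action
            // begin a new stroke here
        }
        else if (isDrawing && IsOpenHand())
        {
            isDrawing = false;  // Open-Hand stops it
            // finish the current stroke here
        }
    }
}
```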