# Image Tracking Sample
This sample demonstrates how to detect and augment image targets found in the real world.
For basic information about custom trackable object updates and what Unreal Engine's
AR Trackable Notify component does, please refer to the Unreal Engine documentation.
Image tracking has to be enabled in the OpenXR plugin settings located under
Project Settings > Snapdragon Spaces plugin.
# How the sample works
By default, when the sample is running and recognizes an image, it generates an augmentation over the physical target. The sample currently identifies only one image at a time and displays its world location in the UI panel included in the map.
# Image AR Manager
BP_ImageTrackingManager blueprint asset (located under
Content > SnapdragonSpaces > Samples > ImageTracking > Placeable) handles creating and destroying
BP_AugmentedImage objects through an event system.
It binds events coming from the AR Trackable Notify component to react to AR trackable image changes. When the system detects images, it invokes the On Add/Update/Remove Tracked Image events.
In the sample blueprint,
Toggle AR Capture must be set to ON to start detection, and to OFF to stop detecting targets and destroy all generated AR images.
Additionally, Scene Understanding must be set as the capture type of that node.
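The same toggle can be sketched in C++ using the engine's `UARBlueprintLibrary::ToggleARCapture` function, which is what the Blueprint node wraps. This is an illustrative sketch for an Unreal Engine project with the AR modules enabled; it cannot be compiled standalone.

```cpp
// Sketch of the sample's Toggle AR Capture Blueprint node in C++.
// Requires an Unreal Engine project with the AugmentedReality module.
#include "ARBlueprintLibrary.h"

void StartImageDetection()
{
    // ON starts detection; Scene Understanding is the capture type the sample uses.
    UARBlueprintLibrary::ToggleARCapture(true, EARCaptureType::SceneUnderstanding);
}

void StopImageDetection()
{
    // OFF stops detection; the sample then destroys all generated AR images.
    UARBlueprintLibrary::ToggleARCapture(false, EARCaptureType::SceneUnderstanding);
}
```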
# Image AR Session Config
The system uses the
D_SpacesSessionConfig_ImageTracking asset (located under
Content > SnapdragonSpaces > Samples > ImageTracking > Core) to detect the images. This asset is a data asset derived from the AR Session Config class.
The session config provides three fields: one for defining the image size, one for specifying the maximum number of images to track simultaneously, and one for referencing the candidate images to be tracked.
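Of the three fields, the candidate image array is part of the engine's base `UARSessionConfig` class and could also be populated in C++ via `AddCandidateImage`; the image-size and maximum-simultaneous-images fields are Snapdragon Spaces specific and are normally set on the data asset in the editor. A minimal, engine-dependent sketch:

```cpp
// Illustrative sketch: appending candidate images to an AR session config.
// AddCandidateImage belongs to the engine's UARSessionConfig base class.
#include "ARSessionConfig.h"
#include "ARTypes.h"

void AddCandidates(UARSessionConfig* Config, const TArray<UARCandidateImage*>& Candidates)
{
    for (UARCandidateImage* Candidate : Candidates)
    {
        // Appends the candidate to the array the XR system tracks against.
        Config->AddCandidateImage(Candidate);
    }
}
```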
# AR Candidate Images
Unreal Engine uses a dedicated asset type called AR Candidate Image to create the references to the images that the XR system should track. Developers can add as many
AR Candidate Images as desired and assign them to the array indicated in the
AR Session Config.
To create an
AR Candidate Image, the image to be tracked first has to be imported as a texture asset into the project's Content folder. The created texture asset must have UserInterface2D (RGBA) set in its compression settings, and it is recommended to turn off mip maps.
You can find the reference images used in the Image Targets for Testing section.
The next step is to create the
AR Candidate Image asset, whose Candidate Texture field references the created texture asset. Each
AR Candidate Image should have a unique identifier set in the Friendly Name field, because identical names in separate candidates used in the same
AR Session Config will cause a hash code collision.
The last step is to define the physical size of the image in centimetres through the Width/Height fields. Correct measurements are essential for accurate pose estimation and the subsequent placement of the augmentation. These values are filled in automatically, preserving the image's proportions according to the orientation defined in the Orientation field. Unfortunately, Unreal Engine currently has the orientation inverted, so the developer must use Landscape for portrait images and Portrait for landscape images.
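The same candidate can also be created at runtime with the engine's `UARCandidateImage::CreateNewARCandidateImage` helper. In this sketch the friendly name and the 18 cm width are hypothetical; only the 26 cm height comes from the sample below.

```cpp
// Hypothetical runtime equivalent of authoring the AR Candidate Image asset.
#include "ARTypes.h"

UARCandidateImage* MakeCandidate(UTexture2D* Texture)
{
    // The Friendly Name must be unique within a session config, otherwise
    // identical names cause a hash code collision.
    // Note the inverted-orientation caveat: Landscape is passed here even
    // for a portrait image target.
    return UARCandidateImage::CreateNewARCandidateImage(
        Texture,
        TEXT("SpaceTown"),          // hypothetical unique identifier
        /*InPhysicalWidth=*/18.0f,  // hypothetical width in cm
        /*InPhysicalHeight=*/26.0f, // sample target height in cm
        EARCandidateImageOrientation::Landscape);
}
```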
The sample uses the
D_ARCandidateImage_SpaceTown blueprint asset (located under
Content > SnapdragonSpaces > Samples > ImageTracking > Placeable). The image target measures 26 cm in height (when printed on DIN A4 or US Letter paper). The
BP_AugmentedImage blueprint asset (located under
Content > SnapdragonSpaces > Samples > ImageTracking > Placeable) renders the candidate image texture over the physical image target upon recognition and tracking.
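An augmentation like BP_AugmentedImage ultimately reads the tracked image's pose from the AR session. A sketch of how that query could look in C++, using the engine's `UARBlueprintLibrary::GetAllGeometries` and `UARTrackedGeometry::GetLocalToWorldTransform` (engine-dependent, not standalone-compilable):

```cpp
// Sketch: querying tracked images and reading the pose used to place
// an augmentation over the physical target.
#include "ARBlueprintLibrary.h"
#include "ARTrackable.h"

void LogTrackedImageLocations()
{
    // All trackables known to the running AR session.
    TArray<UARTrackedGeometry*> Geometries = UARBlueprintLibrary::GetAllGeometries();
    for (UARTrackedGeometry* Geometry : Geometries)
    {
        if (const UARTrackedImage* Image = Cast<UARTrackedImage>(Geometry))
        {
            // World-space location of the recognized image target.
            const FVector Location = Image->GetLocalToWorldTransform().GetLocation();
            UE_LOG(LogTemp, Log, TEXT("Tracked image at %s"), *Location.ToString());
        }
    }
}
```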