Image Tracking Sample
This sample demonstrates how to detect and augment image targets found in the real world.
For basic information about custom trackable object updates and what Unreal Engine's AR Trackable Notify
component does, please refer to the Unreal Engine documentation.
In order to use this feature, it must be enabled in the OpenXR plugin settings located under Project Settings > Snapdragon Spaces plugin.
How the sample works
By default, when the sample is running and recognizes an image, it renders a gizmo over the physical target. The sample identifies only one image at a time and displays its world location in the UI panel included in the map.
Image AR Manager
The BP_ImageTrackingManager blueprint asset (located under SnapdragonSpacesSamples Content > SnapdragonSpaces > Samples > ImageTracking > Placeable) handles creating and destroying BP_Gizmo_AugmentedImage actors through an event system.
It binds events coming from the AR Trackable Notify component to react to AR trackable image changes. When the system detects images, it invokes the On Add/Update/Remove Tracked Image events.
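The same binding can be written in C++ against Unreal's UARTrackableNotifyComponent; a minimal sketch, with an illustrative manager class standing in for the blueprint:

```cpp
// Minimal sketch of binding the AR Trackable Notify events in C++.
// AImageTrackingManager and the handler names are illustrative, not the
// sample's actual code; the handlers must be declared as UFUNCTION() in the
// class header for AddDynamic to work.
#include "ARTrackableNotifyComponent.h"
#include "ARTrackable.h"

void AImageTrackingManager::BeginPlay()
{
    Super::BeginPlay();

    if (UARTrackableNotifyComponent* Notify =
            FindComponentByClass<UARTrackableNotifyComponent>())
    {
        Notify->OnAddTrackedImage.AddDynamic(this, &AImageTrackingManager::HandleImageAdded);
        Notify->OnUpdateTrackedImage.AddDynamic(this, &AImageTrackingManager::HandleImageUpdated);
        Notify->OnRemoveTrackedImage.AddDynamic(this, &AImageTrackingManager::HandleImageRemoved);
    }
}

void AImageTrackingManager::HandleImageAdded(UARTrackedImage* Image)
{
    // The sample spawns a BP_Gizmo_AugmentedImage actor here, placed at
    // Image->GetLocalToWorldTransform().
}

void AImageTrackingManager::HandleImageUpdated(UARTrackedImage* Image)
{
    // Move the spawned gizmo to the updated transform.
}

void AImageTrackingManager::HandleImageRemoved(UARTrackedImage* Image)
{
    // Destroy the gizmo associated with this image.
}
```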
In the sample blueprint, Toggle AR Capture has to be set to ON to start detection, and to OFF to stop detecting targets and destroy all generated AR images. Toggle Spaces Feature can be used as an alternative way to enable the feature. Additionally, Scene Understanding must be set as the capture type of that node.
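In C++, the Toggle AR Capture node corresponds to UARBlueprintLibrary::ToggleARCapture, which takes the capture type as its second argument; for example:

```cpp
#include "ARBlueprintLibrary.h"

// Start or stop detecting image targets. Scene Understanding is the capture
// type required by the Snapdragon Spaces plugin for image tracking.
void SetImageDetectionEnabled(bool bEnable)
{
    UARBlueprintLibrary::ToggleARCapture(bEnable, EARCaptureType::SceneUnderstanding);
}
```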
Image AR Session Config
The system uses the D_SpacesSessionConfig_ImageTracking asset (located under SnapdragonSpacesSamples Content > SnapdragonSpaces > Samples > ImageTracking > Core) to detect the images. This asset is a data asset derived from the SpacesSessionConfig class.
The session config file provides three fields: a field for defining the image size, a field for specifying the maximum number of simultaneous images that should be tracked, and a field for referencing candidate images to be tracked.
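For illustration, the candidate list and the simultaneous-tracking limit can be read back through the standard UARSessionConfig accessors; the asset path below is inferred from the sample's Content location and may differ in your project:

```cpp
#include "ARSessionConfig.h"

void LogImageTrackingConfig()
{
    // Assumed asset path, derived from the sample's Content folder layout.
    UARSessionConfig* Config = LoadObject<UARSessionConfig>(nullptr,
        TEXT("/SnapdragonSpacesSamples/SnapdragonSpaces/Samples/ImageTracking/Core/D_SpacesSessionConfig_ImageTracking.D_SpacesSessionConfig_ImageTracking"));

    if (Config != nullptr)
    {
        UE_LOG(LogTemp, Log, TEXT("%d candidate images, tracking up to %d simultaneously"),
            Config->GetCandidateImageList().Num(),
            Config->GetMaxNumSimultaneousImagesTracked());
    }
}
```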
The creation of the image trackers happens in an asynchronous thread to avoid freezing issues when the number of images to track is very high. For this reason, image tracking can sometimes start with a delay. Listen to the On Spaces Image Tracking Is Ready delegate to find out when images are ready to be tracked.
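A purely hypothetical C++ sketch of listening for that notification; the delegate's owner and exact signature are assumptions based on the Blueprint node name, not verified plugin API:

```cpp
// Hypothetical sketch: OnSpacesImageTrackingIsReady is assumed from the
// Blueprint node name; its owning type and signature are not verified here.
void AImageTrackingManager::BindTrackerReady()
{
    // Bind before enabling capture so the ready notification is not missed.
    OnSpacesImageTrackingIsReady.AddDynamic(
        this, &AImageTrackingManager::HandleImageTrackingReady);
}

void AImageTrackingManager::HandleImageTrackingReady()
{
    UE_LOG(LogTemp, Log, TEXT("Image trackers created; targets can now be detected."));
}
```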
AR Candidate Images
Unreal Engine uses a dedicated asset type called AR Candidate Image to create the references to the images that the XR system should track. Developers can add as many AR Candidate Images as desired and assign them to the array indicated in the AR Session Config.
To create an AR Candidate Image, the image to be tracked first has to be imported as a texture asset into the project's Content folder. The created texture asset must have UserInterface2D (RGBA) set in its compression settings, and it is recommended to turn off mip maps.
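When importing many targets, these settings can also be applied from editor-only code; a minimal sketch with an illustrative helper name (TC_EditorIcon is the enum value behind the UserInterface2D (RGBA) display name):

```cpp
#include "Engine/Texture2D.h"

// Illustrative editor-only helper that applies the recommended settings to
// an imported image-target texture.
void ApplyImageTargetTextureSettings(UTexture2D* Texture)
{
    // "UserInterface2D (RGBA)" is the display name of TC_EditorIcon.
    Texture->CompressionSettings = TC_EditorIcon;
#if WITH_EDITORONLY_DATA
    // Mip maps are recommended off for image targets.
    Texture->MipGenSettings = TMGS_NoMipmaps;
#endif
    Texture->UpdateResource();
}
```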
You can find the reference images used in the Image Targets for Testing section.
The next step is to create the AR Candidate Image asset, whose Candidate Texture field references the created texture asset. Each AR Candidate Image should have a unique identifier that can be set in the Friendly Name field. Otherwise, identical names in separate candidates used in the same AR Session Config will cause a hash code collision.
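To catch such collisions early, the candidate list can be validated with the standard UARCandidateImage accessors; the helper below is illustrative:

```cpp
#include "ARSessionConfig.h"
#include "ARTypes.h"

// Illustrative helper: returns false if two candidates in the config share
// a Friendly Name (identical names collide once hashed).
bool HasUniqueFriendlyNames(const UARSessionConfig* Config)
{
    TSet<FString> SeenNames;
    for (const UARCandidateImage* Candidate : Config->GetCandidateImageList())
    {
        bool bAlreadyInSet = false;
        SeenNames.Add(Candidate->GetFriendlyName(), &bAlreadyInSet);
        if (bAlreadyInSet)
        {
            return false;
        }
    }
    return true;
}
```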
The last step is to define the physical size of the image in centimetres through the Width/Height fields. Correct measurements are essential for accurate pose estimation and the subsequent placement of an augmentation. This data is filled in automatically, following the proportions of the image and the orientation defined in the Orientation field. Unfortunately, Unreal Engine currently has the orientation inverted, so the developer must use Landscape for portrait images and Portrait for landscape images.
The Snapdragon Spaces plugin offers the possibility to choose between different tracking modes if the AR Candidate Image asset's parent is Spaces AR Candidate Image.
Dynamic mode: Updates the position of tracked images each frame, and works on both moving and static targets. If the tracked image cannot be found, no location or pose is reported. This is the default mode.
Adaptive mode: Periodically updates the position of static images if they have moved slightly (roughly once every 5 frames). This strikes a balance between power consumption and accuracy for static images.
Static mode: Useful for tracking images that are known to be static. Images tracked in this mode are fixed in position when first detected, and never updated. This leads to less power consumption and greater performance, but the tracked image's position will not be updated if it suffers from any drift.
The tracking mode can be changed while the application is running, without stopping or restarting the AR Session, using the following nodes (see the sketch after this list):
Set Image Target Tracking Mode by Friendly Name
Set Image Target Tracking Mode by Candidate Image
Set Image Targets Tracking Mode by Friendly Name
Set Image Targets Tracking Mode by Candidate Image
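A hypothetical C++ sketch of such a call; the library class, enum, and friendly name below are assumptions derived from the node names above, not verified plugin API:

```cpp
// Hypothetical sketch only: USpacesImageTrackingLibrary,
// ESpacesImageTrackingMode, and the friendly name "SpaceTown" are all
// assumptions based on the Blueprint node names.
void SwitchTargetToStaticMode()
{
    USpacesImageTrackingLibrary::SetImageTargetTrackingModeByFriendlyName(
        TEXT("SpaceTown"), ESpacesImageTrackingMode::Static);
}
```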
SetImageTrackedModeByID has been deprecated in version 0.15.0.
The sample uses the D_SpacesARCandidateImage_SpaceTown blueprint asset (located under SnapdragonSpacesSamples Content > SnapdragonSpaces > Samples > ImageTracking > Placeable). The image target measures 26 cm in height (when printed on DIN A4 or US Letter paper). The BP_Gizmo_AugmentedImage blueprint asset (located under SnapdragonSpacesSamples Content > SnapdragonSpaces > Samples > ImageTracking > Placeable) renders a gizmo over the physical image target, indicating its orientation upon recognition and tracking.