Augmented Reality (AR) requires software to track the correspondence between the world we live in and the virtual space we have created. The result is an AR experience. To make this experience as realistic as possible, the AR software has to track our surroundings and find points of reference. These points can be created by tracking an image, a surface or an object.
World tracking gives the device awareness of its environment by detecting flat surfaces around the user. Apple calls this technology World Tracking, Google calls it Motion Tracking. Both are part of their respective AR frameworks (ARKit and ARCore), and both share the fundamental approach of detecting surfaces, or planes. Early versions detected only horizontal surfaces; as these frameworks advance, newer versions are capable of detecting additional surface orientations.
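To illustrate how plane detection is enabled natively, here is a minimal ARKit sketch. It assumes a `sceneView` (an `ARSCNView`) has already been set up elsewhere in the app; since ARKit 1.5, vertical planes can be requested alongside horizontal ones:

```swift
import ARKit

// A minimal sketch: enabling plane detection with ARKit's world tracking.
let configuration = ARWorldTrackingConfiguration()

// Request both horizontal and vertical plane detection.
configuration.planeDetection = [.horizontal, .vertical]

// sceneView is assumed to be an ARSCNView created elsewhere;
// running the configuration starts (or restarts) the AR session.
sceneView.session.run(configuration)
```

ARCore exposes the equivalent capability through its `Config.PlaneFindingMode` setting on Android.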
World tracking is particularly good for larger scenes that include a large part of the user’s environment. It does, however, require a relatively new, ARKit- or ARCore-compatible device.
Image tracking works independently of ARKit/ARCore and is therefore also feasible for older devices. It especially excels in smaller scenes that are built around a marker: a 2D image that is used as a point of reference. Its advantage is that it works not only on horizontal surfaces, but also on vertical surfaces, and everything in between.
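For comparison, ARKit also offers a native image tracking mode (which, unlike framework-independent marker tracking, does require an ARKit-capable device). A minimal sketch, assuming the marker image lives in an asset catalog group named "AR Resources" and a `sceneView` set up elsewhere:

```swift
import ARKit

// Sketch: tracking a 2D marker image with ARKit's native image tracking.
// The reference image is assumed to be in an asset group named "AR Resources".
guard let referenceImages = ARReferenceImage.referenceImages(
        inGroupNamed: "AR Resources", bundle: nil) else {
    fatalError("Missing AR resource group")
}

let configuration = ARImageTrackingConfiguration()
configuration.trackingImages = referenceImages
configuration.maximumNumberOfTrackedImages = 1

// sceneView is assumed to be an ARSCNView created elsewhere.
sceneView.session.run(configuration)
```

Once the marker is recognized, the session reports an `ARImageAnchor` whose transform anchors the digital content to the image, on any surface orientation.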
Similar to image tracking, object tracking recognizes arbitrary real-world 3D objects as points of reference for the digital content. The object needs to be scanned beforehand so that the software is capable of recognizing it.
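In ARKit, such a pre-scanned object is stored as an `.arobject` file and loaded as a reference object. A minimal sketch, assuming the scans live in an asset group named "Scans" (a hypothetical name) and a `sceneView` set up elsewhere:

```swift
import ARKit

// Sketch: detecting a previously scanned 3D object with ARKit.
// The .arobject files are assumed to be in an asset group named "Scans".
guard let referenceObjects = ARReferenceObject.referenceObjects(
        inGroupNamed: "Scans", bundle: nil) else {
    fatalError("Missing scanned reference objects")
}

let configuration = ARWorldTrackingConfiguration()
configuration.detectionObjects = referenceObjects

// When the physical object is recognized, the session adds an ARObjectAnchor.
sceneView.session.run(configuration)
```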
Onirix is capable of performing image and world tracking, with object tracking coming later this year. Our mobile apps switch automatically between the different tracking modes, depending on the capabilities of the device and the environment.
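The general pattern behind such capability-based switching can be sketched with ARKit's `isSupported` checks. This is an illustrative sketch only, not Onirix's actual implementation, and it assumes a `sceneView` set up elsewhere:

```swift
import ARKit

// Sketch: picking a tracking mode based on what the device supports,
// falling back from world tracking to image tracking.
let configuration: ARConfiguration
if ARWorldTrackingConfiguration.isSupported {
    configuration = ARWorldTrackingConfiguration()
} else {
    configuration = ARImageTrackingConfiguration()
}

sceneView.session.run(configuration)
```

A framework-independent image tracker, like the one described above, would serve as the final fallback on devices without ARKit/ARCore support at all.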