In an earlier article, we looked at different use cases for AR and how it makes sense as a sales/marketing or training/learning tool.
This article explores a newer development in Augmented Reality: object tracking.
To the uninitiated, especially those who have never seen AR in action, object tracking (or OT, as we will call it for the rest of this post) is, at least visually, the equivalent of an image stabilizer on your television that stops the picture from flickering up and down. It makes the rendering smoother and easier on the eyes.
Object tracking doesn't merely process the trigger as a 2D object over which the AR effect must be overlaid, as happens in applications without OT. Instead, it treats the trigger as a 3D object and therefore accounts for its length, width and depth. This means the same trigger can fire the render from multiple angles, unlike a non-OT application, in which the trigger must face the camera more or less head-on.
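The difference can be sketched in a few lines of code. This is purely an illustration, not a real AR API: the function names and angle tolerances below are invented, and the only point being modelled is that a flat 2D marker stops being recognizable once the camera tilts too far off head-on, while a 3D object target tolerates a much wider range of viewpoints.

```python
# Illustrative sketch only; these tolerance values are invented
# for the example, not taken from any real AR engine.
MAX_ANGLE_2D = 30.0   # degrees off head-on before a flat marker fails
MAX_ANGLE_3D = 120.0  # a 3D object target tolerates far steeper viewpoints

def trigger_fires(view_angle_deg: float, object_tracking: bool) -> bool:
    """Return True if the AR render would fire at this viewing angle."""
    limit = MAX_ANGLE_3D if object_tracking else MAX_ANGLE_2D
    return abs(view_angle_deg) <= limit

# A camera tilted 75 degrees away from head-on:
print(trigger_fires(75, object_tracking=False))  # 2D marker: False
print(trigger_fires(75, object_tracking=True))   # 3D object target: True
```

In a real engine the "limit" is not a single angle but the outcome of matching camera input against a stored 3D model of the object; the sketch only captures the practical effect the user sees.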
Object tracking marks the transition of AR technology from a cool tool with limited utility to an even cooler one with a wider range of applications. For instance, as anyone who has worked on real estate solutions built on non-OT AR knows, a paper-based 2D image trigger has certain limitations. For one, the image must always face the camera. For another, the triggering image must be clear enough, and the scanning camera capable enough, for the trigger to be recognized at all.
OT eliminates this problem by, as mentioned earlier, considering all three dimensions of the trigger. While a non-OT AR render fails when the camera is tilted too steeply with respect to the trigger, OT-AR succeeds because it can recognize the object from a wider range of angles. Thus, in the same case as above, OT-AR provides a more stable output that remains visible from more angles when the trigger is scanned.
The application of OT-enabled AR spans industries and purposes.
In the automotive or medical industries, information can be expressed in a way that's easier to understand, thereby reducing comprehension time. In a medical context, a difference of a few seconds could be the difference between life and death. Sensors attached to the various parts of the body, whether that body is an automobile's or a human's, can feed information, and these information panels can be displayed in real time against the corresponding parts. In future, with wearable technology becoming more capable, it may become possible for a diagnostician to simply sweep his or her goggles over the vehicle or patient and immediately connect key indicators to key systems.
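The sensor-to-panel idea above can be sketched simply. Everything here is hypothetical: the part names, readings and panel format are invented for illustration, and a real system would receive readings from live sensors and hand the panel text to the AR renderer, which OT would then pin to the tracked part.

```python
# Hypothetical sensor readings keyed by tracked part (invented data).
readings = {
    "engine": {"temperature_c": 104, "status": "WARN"},
    "battery": {"charge_pct": 62, "status": "OK"},
}

def info_panel(part: str) -> str:
    """Build the text an AR overlay would pin next to the tracked part."""
    data = readings[part]
    fields = ", ".join(f"{k}: {v}" for k, v in data.items() if k != "status")
    return f"[{part.upper()}] {fields} ({data['status']})"

for part in readings:
    print(info_panel(part))
```

The design choice worth noting is the mapping itself: because OT knows where each physical part is in 3D space, each panel can be anchored to its part rather than floating over a flat marker.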
As far as training and learning systems are concerned, OT's impact may be felt to a lesser extent, but that does not make it a wasted effort. For instance, a child could learn more about an ant colony, one of the most complicated structures nature has come up with, through OT-AR. As he or she moves around the marker, the view rotates accordingly on all axes. With proper sectional charting, the child might very well understand not only the physics but also the subtler science of community engineering that governs an ant colony.
And when nature itself can be replicated digitally to a fair extent, can man-made artefacts be any more difficult? Employees can be trained on assembling, installing, maintaining and repairing complex machines with AR, and the job only gets easier with object tracking: employees can approach such a problem from various angles and perspectives without having to pray for the AR render to hold. Industrial training bodies estimate that such virtual training saves about 120 work hours, the equivalent of 15 working days, per employee compared to traditional training methods.
Object tracking, much like AR itself, once seemed to be the stuff of dreams. But it is here now, and it is as much a part of reality as anything else. With tools like Vuforia and Unity offering object-tracking functionality, the work is no longer too complex for the average programmer, and it means the AR experience for the user is getting better day by day.