Augmented Reality: Virtually Useful
Glitchy gimmicks will give way to viable tools
As cloud computing was in 2007, augmented reality (AR) is a real and valuable but seemingly nebulous technology: buzzworthy enough to capture the attention of even nontechnical audiences, yet nascent enough that discussing it means first attempting to define it.
What Is It?
The University of Washington’s Human Interface Technology Lab defines augmented reality as “the use of transparent HMDs (head-mounted displays) to overlay computer-generated images onto the physical environment.” While the image of a pilot with a map and runway projected inside a helmet is a helpful example, defining augmented reality by the hardware it happens to employ is unnecessarily limiting. Some of the most popular AR applications run on smartphones.
MIT’s broader definition—the combination of the real and the virtual to assist users in their environments—adds the vital criterion that AR must be more than mere projection: it’s interactive.
An early, widely cited definition comes from a 1997 paper by researcher Ronald T. Azuma, who said AR systems have these three characteristics: They combine real and virtual; they’re interactive in real time; and they’re registered in 3-D.