weThink

What we're buzzing about.

Helping Consumers See the World Differently

Dan Shust, Chief Technology Officer | Mar. 6, 2012

Information may soon be filtered automatically and delivered right in front of people’s eyes, making the need to search smartphones for answers unnecessary. Google glasses, which may be on sale by year’s end, are rumored to use a built-in camera, image recognition, and motion and GPS sensors to monitor the world and overlay relevant information in real time.
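To make that rumored pipeline concrete, here is a minimal, hypothetical sketch of the loop such glasses might run: a camera frame goes through image recognition, the results are filtered for relevance by location, and what survives is floated onto the display. Every name below (Recognition, recognize, overlay, process_frame) is an illustrative stand-in, not part of any announced Google API.

```python
from dataclasses import dataclass

@dataclass
class Recognition:
    label: str             # e.g. "Mazda sedan", "restaurant"
    detail: str            # e.g. "on sale at a dealer"
    distance_miles: float  # how far away the recognized thing is

def recognize(frame: bytes) -> list[Recognition]:
    """Stand-in for an image-recognition service applied to a camera frame."""
    return [
        Recognition("Mazda sedan", "on sale at a dealer", 1.1),
        Recognition("restaurant", "daily specials posted", 0.2),
    ]

def overlay(items: list[Recognition]) -> None:
    """Stand-in for floating data on the heads-up display."""
    for item in items:
        print(f"[HUD] {item.label}: {item.detail} ({item.distance_miles} mi)")

def process_frame(frame: bytes, max_distance_miles: float = 2.0) -> None:
    # Recognize what the camera sees, keep only what is nearby and relevant,
    # then overlay the result in the wearer's field of view.
    relevant = [r for r in recognize(frame) if r.distance_miles <= max_distance_miles]
    overlay(relevant)

process_frame(b"raw-camera-frame")  # a real device would repeat this many times per second
```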

This could change the way people gather and use knowledge about the world around them. Glasses with an augmented reality (AR) heads-up display (HUD) put data even closer than the tips of a consumer’s fingers. They put it right before a person’s eyes, enabling a world where nearly all information is transparent and easily accessible. Imagine this scenario:

Michaela puts on her glasses and steps outside. Her AR overlay shows the restaurant she wants is three blocks ahead, near the park. A car zips past. Michaela nods. Her visual filter identifies the car as a Mazda, on sale at a dealer 1.1 miles away. She shakes her head and walks until a store window catches her eye. She stops and her glasses play a video that merges with the window display. She tilts her head to +1 the store. Just a block from the restaurant, Michaela opens an online menu, browses daily specials and sends a coupon to her phone—all without ever looking down.
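A scenario like Michaela’s implies a simple mapping from head gestures to AR actions: nod for more detail, shake to dismiss, tilt to +1. The sketch below is purely hypothetical; the gesture names and actions are ours, not a documented interface for any real device.

```python
# Hypothetical mapping of head gestures to AR actions from the scenario above.
GESTURE_ACTIONS = {
    "nod": "show_details",       # nodding asks the overlay for more information
    "shake": "dismiss_overlay",  # shaking the head clears the current suggestion
    "tilt": "plus_one",          # tilting the head +1s the store in view
}

def handle_gesture(gesture: str) -> str:
    # Unrecognized gestures fall through to a no-op rather than a guess.
    return GESTURE_ACTIONS.get(gesture, "no_op")

for g in ("nod", "shake", "tilt", "wink"):
    print(g, "->", handle_gesture(g))
```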

With technology like this, it won’t be long before floating data overlays are just a part of the way people see the world. Information will get filtered and displayed right before their eyes, and they’ll never have to stop and pull out a smartphone. In a world filled with smart glasses, navigation will be easy, search will be automatic, and learning will be organic. Plus, deeper engagement with one’s environment, including signs, products and displays, will be virtually effortless (as long as people watch where they’re walking).

While these particular glasses haven’t arrived yet, the use of image recognition to launch AR and real-world data layers via smartphones is possible now and growing quickly. In fact, the OS World Market Forecast suggests that AR application downloads will grow from just one million in 2009 to 900 billion by 2016.

Companies have an enormous opportunity to enhance consumer views and deepen engagement. From displaying navigation points and local deals to offering interactive digital models and messaging, smartphone-accessible AR offers exciting ways to create engaging experiences. And those experiences will be even more interesting and seamless when the world is full of smart glasses from Google, Vuzix and Lumus. How will customers see the world in 2016? Better yet, how many will see that world through iGlasses instead of iPhones?

