OK Glass? Google gives an augmented peek at a transparent future

You’re looking at me?
Image: AP Photo/Seth Wenig

Until this week, we’ve mainly had the initial show-and-tell promo, runway shots and some random viewings of street tests to learn more about Glass, Google’s hotly anticipated augmented reality spectacles (formerly known as Project Glass). The company is now opening limited-time applications for Glass to the public: units will be awarded, at $1,500 apiece, to devotees who tweet “bold,” “creative” ideas for their use with the hashtag #ifihadglass, and will be handed over at special “pick-up events” in New York, San Francisco or Los Angeles.

This is where the fun begins: when Glass is out in the world. Doubtless Sergey, Larry and company will place some restrictions on how their new tool can be used, but we can expect a diverse rush of new uses for a device that puts the power of Google’s planetary databases atop the human cortex. Already, some unexpected, and perhaps unintended, uses are being suggested in the Twittersphere:


https://twitter.com/JStud_/status/304276717260918785
https://twitter.com/Gunsonfire/status/304291583421997057
https://twitter.com/misstillytilly/status/304290409541144577

Alongside the open applications for a pair of the specs, Google has released an updated video showing new functionality, including a new verbal command, “OK Glass…,” which will undoubtedly join “Siri…” in the lexicon of soon-to-be-annoying out-of-context mumblings coming from behind you on the subway or in the grocery store. (This type of person has already been given a new nickname.) The video, a collection of highly experiential, visually compelling moments depicted as “everyday” for the tech-savvy consumer, also shows the expanded visual search and photographic capabilities the Glass team has been rumored to be perfecting: the ability to “look and shoot” at most anything, or to call up a picture no one else can see.

Where Twitter’s new video app Vine cuts short shared videos of the mundane at a merciful six seconds, Glass is likely to give us longer, perhaps more diverse, and even narrated first-person views of the world that GoPro fiends and Russian dashcams only hint at. Expect a lot of scenic hiking videos, but also know that we will see a few bar fights, car crashes, and a hefty helping of Glass-enabled amateur porn, the first refuge of most new media technologies. A feature film shot with Glass is surely not many years off.

Privacy issues will undoubtedly leap to the fore, as they have with search, cameraphones, and social media, all of which are direct forebears of Glass. Some of the sadder aspects of this intersection were explored last year by Eran May-raz and Daniel Lazo in their short film “Sight,” which gives us a taste of enhanced, augmented senses put to more “human” uses, reminiscent of Black Mirror, the British dystopian cult hit TV series now in its second season. Already, apps like SceneTap, which let smartphone users see the demographic breakdown (including gender and age) of crowds in nearby bars, give us a hint of what augmented “sensing” can look like, and in particular where some developers will take these tools. Glass puts this sort of capability into a walking heads-up display. And with the type of visual search and object recognition technology already embedded in Google’s search apps, where there’s data, there’s correlation, making real-time search and identification, complete with the subject’s trail of digital exhaust in the search returns, an ambulatory, first-person reality.

This isn’t to say that Glass itself is inherently creepy, any more than the Web or Google has been. But if past experience is any indicator of future performance, there will be unexpected and unintended consequences from this leap in form. And don’t forget: Google releases no product or service that doesn’t also feed its own databases with knowledge about the world around us (and sell ads based on that knowledge). When we get “eyes,” so do those databases.