The fuzzy outlines of mobile computing’s next phase are finally resolving into a clear image of a landscape populated by smart glasses.
Of course, not everyone can see it yet. The fact that smart glasses come in so many different varieties—and that they are sometimes confused with virtual reality (VR) headsets—has only made it harder for some people to grasp these devices as a new mobile platform.
But as these devices gradually expand beyond early adopters and begin to populate the faces of casual mainstream users, the usefulness of smart glasses will become more apparent.
Most smart glasses fall into a few categories:
- frames equipped with speakers, microphones, and cameras;
- frames that offer a flat, two-dimensional image display on the lens (Google Glass, for example);
- frames and headsets that deliver a three-dimensional image and spatial audio.
The last two examples are generally referred to as augmented reality (AR) computing interfaces. And their sales are expected to take off next year.
Powered by competing cloud computing layers tied to wireless networks, mobile computing will soon no longer be exemplified by a person looking down at their smartphone.
There are already a number of smart glasses and AR advances in play in China, South Korea, and Israel, to name a few. But as they wait for lens display and battery technology to catch up to their ambitions, most tech companies have largely focused on simpler, smartphone-assistive models.
Here’s a look at the different devices already available for sale to the general public in the US (a key market for indicating mainstream viability), along with what to expect from the most anticipated potential new entrants into the category.
Amazon’s first plunge into hardware began with the Kindle ebook reader in 2007 and has since expanded to everything from smart speakers to interactive video displays. Like the Kindle, Amazon’s Echo Frames ($249), which were released in 2019, connect the user to the company’s vast array of cloud-based e-commerce content and services.
In this case, the innovation is that Echo Frames allow the wearer to interact with Amazon’s digital assistant Alexa using voice commands. Equipped with microphones and open-ear headphones (essentially, speakers that most others nearby can’t hear), the slim, normal-looking frames can be fitted with prescription lenses and let the wearer listen to music and podcasts, take calls, and send messages by voice. In case the user’s environment is too noisy for voice interactions, the frames also have a touch panel on the arm that can be used to control functions.
So far, Echo Frames come in only one frame shape and just a few colors, along with a sunglasses variant. Offerings from audio giant Bose (Bose Frames) and gaming’s Razer (Razer Anzu) may offer slightly better designs and are competitive in some feature categories, but the tight integration with Amazon’s voice assistant and various cloud services gives Echo Frames the clear advantage in this category.
Coming into focus: Amazon hasn’t released a meaningful update to its Echo Frames product in a while, but based on its consistent updates to other Echo hardware products, there’s a good chance an update is coming within the next year.
Ray-Ban Stories offer the first true glimpse of what a world full of high-fashion smart glasses will look like
The Ray-Ban Stories frames ($300) are by far the most fashionable smart glasses on the market because, along with a couple of other design options, they use the classic Wayfarer frame made famous by film icons like James Dean and Tom Cruise. The device, which is the result of a partnership between Meta’s Facebook division and Ray-Ban (part of the EssilorLuxottica group), allows the user to take still photos and videos (up to one minute) via a barely perceptible front-mounted camera.
Like Amazon’s Echo Frames, Ray-Ban Stories connect to the user’s smartphone through Bluetooth, where a dedicated app offers control over privacy features, photo and video sharing, and even what kind of voice the Facebook Assistant (triggered by the wearer’s voice) uses to respond to the wearer.
If there’s one vital lesson the smart glasses industry learned from Google Glass’s 2013 release and subsequent fall from grace, it’s that people want to know when they are being recorded or photographed by smart glasses. To that end, Ray-Ban Stories features a small LED indicator light that turns on when the user records video or takes a photo.
The LED notification feature was pioneered by Snap through its Spectacles smart glasses. But embedding such a feature into a pair of classic Wayfarer frames is part of why the Ray-Ban device may have the best chance of any competitor at going fully mainstream, as the frames are almost completely indistinguishable from a pair of normal Wayfarers.
Coming into focus: Meta has a fully AR-capable device in development called Project Aria that could eventually be released as a successor to its Ray-Ban Stories product.
Although most social media users know Snap for its Snapchat app, the company is also notable for being among the first to bring a fully mainstream-friendly pair of (somewhat) smart glasses to market. The first generation of the Spectacles camera glasses, introduced in 2016, initially had all the buzz of a new Apple product, with sales kiosks popping up around the US. But eventually, the excitement died down.
Even in Spectacles’ buzzy first year, the device only sold around 150,000 units. Since then, it’s been fairly rare to see a pair of Spectacles on the street. But that hasn’t stopped Snap from continuing to improve the device and release new frame styles and features. Spectacles 3 ($380), which was launched in 2019, was the first version of the device to combine the frame’s video capture ability with 3D depth capture and post-recording AR features. They weren’t quite AR smart glasses, but they were getting close.
It wasn’t until 2021 that Snap released its first truly AR-capable smart glasses, loosely referred to as New Spectacles. Sporting a design similar to Balenciaga’s LED frames made popular by Kanye West, the New Spectacles not only display fully 3D animated content, but they also offer hand tracking, which allows the wearer to interact with and control virtual content in the same way they would with larger, high-end AR headsets like Microsoft’s HoloLens 2 and the Magic Leap 2.
Unfortunately, the device isn’t for sale, as the company continues to work on polishing the product’s various features. But a select group of Snapchat third-party developers have been allowed to use New Spectacles to develop apps for what could be a wide release in the future.
Coming into focus: Although Snapchat has 332 million users, that number is far below the billions of people using Amazon and Facebook software. So while Snap was the earliest of the three to enter the smart glasses race, and delivers some of the most advanced computer vision research and applications, the smaller scale of its platform will make competing with larger platforms difficult as smart glasses adoption takes off.
All of this activity on the consumer smart glasses front is why so many analysts and tech industry watchers are so focused on the increasingly credible rumors that Apple has a smart glasses device on the way. While Amazon falls short on high-end style, and Facebook struggles with the public’s trust around personal data, Apple has generally earned positive marks on both, along with the massive scale of roughly half a billion weekly App Store users.
The other Big Tech brand poised to enter the fray is Google, which bought Canadian smart glasses startup North in 2020. Prior to that deal, North introduced its own pair of stylishly designed smart glasses with audio features, as well as 2D text and notification content displayed to the wearer.
Since the acquisition, Google has ramped up hiring for its AR hardware and software units, sparking speculation that North’s Focals smart glasses may reemerge as a Google product, harnessing all the features of the tech giant’s Search and Maps services. The device, which is reportedly code-named Project Iris, could be available in 2024, not long after Apple is forecast to launch its AR wearable.
What this all means is that the often vague concept of the metaverse will come to fruition in 2023, when the AR-wearable face wars truly heat up.