Today, we tap and swipe to interact with our devices. But Google envisions a Minority Report-like future where people can control their phones, watches, and homes without touching them.
On the final day (May 20) of its I/O developer conference, Google showed off improvements to Soli, a radar-based technology that makes it possible to interact with devices using new types of gestures.
Soli was born in Advanced Technology and Projects (ATAP), an experimental unit within Google, and the company first introduced it at last year's developer conference. Since then, Google has worked to reduce the footprint, power consumption, and computing resources required by Soli's development kit so it can operate inside a smartwatch. "The way I think about it, if you can make something run on a smartwatch, you can make it run anywhere you want," said ATAP technical lead Ivan Poupyrev.
On stage, Google demonstrated two examples of how people could control their devices using Soli. On a smartwatch, they could turn a "virtual dial" (a gesture that mimics adjusting an analog watch crown) to switch between apps, or move a hand closer to the watch to display more detailed information from notifications. (The video below is cued up to the smartwatch demo.)
Since Soli’s radar sensors can detect gestures up to 15 meters away, the technology can also be used in the connected home, as seen in the video below. Showing off a prototype speaker made with partner Harman, which owns the JBL speaker brand, Google demonstrated skipping tracks using the virtual dial and turning the speaker off by waving a hand quickly.
Still, that doesn't mean the touch screen is going anywhere. "What it does is offer a third dimension of interaction, which complements and enhances our interaction with touch screen and voice input," said Poupyrev.