Facebook thinks the most useful digital assistant is the one that can read minds

I think therefore I type.

Facebook wants digital assistants like Siri and Cortana to shut up already. Every tech giant has an entrant in the digital-assistant market, and early returns are positive. People all over are asking Alexa to play them a song, Google Assistant to set a reminder, and surely someone has asked Bixby—Samsung’s new voice assistant—something. But it turns out that no one wants to PDA (personal digital assist) in public. The Facebook solution: give the assistant direct access to our minds.

At last year’s Facebook developer conference, Regina Dugan, head of Facebook’s moonshot division, Building 8, recycled a well-known—in BCI (brain-computer interface) circles, anyway—video of a woman in a Stanford lab moving a digital cursor with her mind. She did this to show that the bones of the technology that can give us mental control over our computers are already in place. She then explained that Facebook wants to take that technology and create a product that lets us type with our minds. But Dugan left a lot of questions unanswered. First among them: why would Facebook want its users to mind-type? And only slightly less important: how do they intend to make that happen?

Mark Chevillet, who heads the project at Building 8, addressed those questions at last month’s ApplySci Wearable Tech Conference in Boston. For reasons of privacy or propriety, he explained, people are loath to ask Siri anything out in the open—use at work and in public accounts for only 19% and 3%, respectively. “One aversion signal to the otherwise beneficial aspects of these voice assistants is that you have to speak out loud, you have to communicate through this unconstrained channel,” said Chevillet. “The value proposition is essentially, can we give you the speed and flexibility that you’ve come to understand from voice interfaces, but with the privacy that you’ve come to expect with text?” In other words, a digital assistant that can literally listen to your thoughts—anywhere, at any time, and privately.

As to how they plan to do it, Chevillet showed the same video of a woman moving a cursor with her mind, then acknowledged that moving a cursor—up, down, left, right, and “click”—is a far easier brain-to-digital translation than deciphering the complicated signals that turn thoughts into spoken or typed words. Then he pointed, as proof of concept, to work by Christian Herff, a computer scientist at the University of Bremen, that showed limited but promising results in translating thought to type.

When Herff’s subjects were constrained to a vocabulary of just 10 words, and were told to speak them out loud, the computer was able to read and translate the brain signals accurately around 75% of the time. When the dictionary was expanded to 100 options, the computer’s accuracy dropped to just under half—still far better than the 1% a random guess would achieve, but a long way from usable.

But Facebook’s goal is 100 words per minute, straight out of a person’s brain, silently. When that goal was relayed to Herff in an interview, he was surprised: “Wow, that’s even faster than we talk audibly. Even decoding audible speech from brain signals is at a very, very early stage.”

To make it more complicated, for Facebook’s idea to be marketable at all, the interface has to be non-invasive. But Herff’s study made use of a neural implant, like the one used on the Stanford subject, and the best results came from the most electrodes on the most parts of the brain. “No one wants their head drilled open to communicate on Facebook,” Herff said. “On the other hand, in all of the sensor technology I know of, there is no signal that provides good enough spatial resolution (can read a small enough area) or temporal resolution (can do it fast enough) to even remotely make this possible.” He laid out the strengths and shortcomings of each possible interface in a review article published last September in the journal Frontiers in Neuroscience.

Chevillet isn’t ignoring these obstacles. Building 8 works in two-year cycles, he said, and this project is just one year into the “is-it-even-possible?” phase. As for an actual, consumer-ready product, he cautioned that it may be 10 or more years out. That said, Facebook’s skunkworks has partnered with research departments at 17 universities, including Stanford, Harvard, and MIT, to help develop the technology. And with all the resources CEO Mark Zuckerberg and Dugan want to throw behind the effort, even Herff isn’t ready to limit what they’re capable of. “They are very secretive,” he said. “They just might have something up their sleeves.”


Correction: Regina Dugan, not Cheryl Dugan, is the head of Building 8.