Computers stole your job; now they know your pain. Using a combination of facial-recognition software and machine-learning algorithms, researchers have trained computers to be dramatically better than humans at reading pained facial expressions. And they’re working on new programs to help clue you in to what your friend, coworker, or client is feeling.
In a study published Friday (paywall) in the journal Current Biology, researchers asked 170 subjects whether the expressions of pain shown on faces in a series of videos were real or faked. The humans’ collective empathetic ability turned out to be about as good as a coin flip: they read the expressions correctly only 50% of the time. Even after the researchers trained subjects to spot the subtle, involuntary muscle movements that experts use to tell when an emotion is being faked, the subjects were right only 55% of the time.
Given the same footage, a computer program could tell which people were in real pain 85% of the time. “This is one of the first examples of computers being better than people at a perceptual process,” researcher Marian Bartlett told Wired.
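The study itself is paywalled and the article doesn’t describe Bartlett’s pipeline, but the general recipe in this line of work is to extract facial-action features from each video and train a binary classifier on them. Here is a minimal sketch of that idea; the feature dimensions, the random stand-in data, and the choice of an SVM are all illustrative assumptions, not the study’s actual method.

```python
# Minimal sketch (assumed, not the study's actual pipeline):
# classify real vs. faked pain from per-video facial-action features.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical data: one row per video, columns are facial "action unit"
# intensities averaged over frames (e.g., brow lowerer, lid tightener).
# Real features would come from tracking actual faces; random numbers
# here will score near chance, which is the point of a placeholder.
rng = np.random.default_rng(0)
X = rng.normal(size=(170, 20))      # 170 videos, 20 action-unit features
y = rng.integers(0, 2, size=170)    # 1 = genuine pain, 0 = faked

# An RBF-kernel SVM is a common choice for small, dense feature sets
# like this; the study may well have used a different classifier.
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(model, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.0%}")
```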
Bartlett and others are looking to capitalize on this kind of research. She is a founder of Emotient, a company that has released an app for decoding facial expressions. The program uses a device’s onboard camera to capture people’s facial movements, down to involuntary microexpressions, and decodes them using a tree of options. The current prototypes are geared toward helping people in marketing and sales understand their clients’ reactions, but future versions could help people with autism spectrum disorders read the emotional context of their interactions.
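A “tree of options” suggests a decision-tree-style classifier that routes facial measurements to an emotion label. Below is a toy illustration of that idea only; the feature names, labels, and training data are invented, and Emotient’s proprietary model is surely far more sophisticated.

```python
# Toy illustration of decoding expressions with a decision tree.
# Feature names and emotion labels are invented for this sketch.
from sklearn.tree import DecisionTreeClassifier, export_text

# Each row: [brow_raise, lip_corner_pull, nose_wrinkle] on a 0-1 scale.
X = [
    [0.9, 0.1, 0.0],   # raised brows, neutral mouth
    [0.1, 0.9, 0.0],   # strong smile
    [0.0, 0.1, 0.8],   # wrinkled nose
    [0.8, 0.8, 0.0],   # brows plus smile
]
y = ["surprise", "joy", "disgust", "joy"]

tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
print(export_text(tree, feature_names=["brow_raise", "lip_corner_pull", "nose_wrinkle"]))
print(tree.predict([[0.05, 0.95, 0.0]]))  # -> ['joy']
```

One appeal of a tree for this job is interpretability: each branch is a readable threshold on a single facial measurement, which makes the decoding easy to audit.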
A presentation from the company explains how the app works: https://player.vimeo.com/video/71060463
Emotient isn’t the only company that sees business potential here. Affectiva has a similar product, called Affdex, which analyzes people’s reactions through their webcams as they watch videos. Marketers could use this information to calibrate the emotional impact of their sales pitches. To alleviate privacy concerns, the company says, viewers would have to opt in before being analyzed.
The biggest challenge is taking these technologies out of the lab. Sensing emotion in a controlled video is one thing, but in the wild, emotion trackers will have to deal with crowds, shifting light conditions, and varying contextual cues. (Not to mention the awkwardness of carrying on a conversation with someone you’re viewing, and computer-analyzing, through a phone camera.) Emotient and Affectiva are both working on viable consumer models, and Emotient already has an app in beta for Google Glass.
Emotion-sensing fits into a broader push for more seamless human-computer interaction, including Nuance’s Dragon personal assistant, Intel’s RealSense camera, and Facebook’s drive for better photo identification. It’s a little alarming, perhaps, to think that your computer might someday read you better than your own family does. But honestly, who do you actually spend more time with?