The next frontier in television: TVs that watch you back

“We have a recommendation for you.”
Image: Reuters/Alexandre Meneghini

Imagine a world where your television understands your viewing tastes better than you do. Though you might tell yourself you’re enjoying the German arthouse film on the screen, your TV wouldn’t be fooled. Able to monitor and analyze facial movements, it would register your boredom and recommend something it knows you enjoy.

This sci-fi-esque possibility may be technically achievable, thanks to software developed by an emotion-tracking startup called Affectiva. Researchers used viewers’ home webcams to monitor the facial movements of more than 1,200 people as they watched advertisements for sweets, pet supplies, and groceries. The resulting model accurately predicted whether viewers enjoyed each video, according to a study published this month in the peer-reviewed journal IEEE Transactions on Affective Computing.
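To make the idea concrete, here is a toy sketch of how such a system might score "enjoyment" from facial-expression data. This is purely illustrative and not Affectiva's actual model: the feature names, thresholds, and the smile-intensity input are all invented for demonstration.

```python
# Illustrative sketch only -- not Affectiva's method. Assumes some upstream
# computer-vision step has already produced a per-frame "smile intensity"
# score between 0.0 and 1.0 for a viewer watching a video.

def enjoyment_features(smile_scores):
    """Summarize a time series of per-frame smile intensities."""
    peak = max(smile_scores)
    mean = sum(smile_scores) / len(smile_scores)
    return peak, mean

def predict_enjoyed(smile_scores, peak_threshold=0.6, mean_threshold=0.25):
    """Toy rule: flag enjoyment when smiling is both strong at some
    point (peak) and sustained on average (mean). Thresholds are
    arbitrary placeholders, not values from the study."""
    peak, mean = enjoyment_features(smile_scores)
    return peak >= peak_threshold and mean >= mean_threshold

# A viewer who smiles broadly midway through an ad vs. one who stays flat:
engaged = [0.1, 0.2, 0.7, 0.8, 0.4, 0.3]
bored = [0.05, 0.1, 0.1, 0.05, 0.0, 0.1]
print(predict_enjoyed(engaged))  # True
print(predict_enjoyed(bored))    # False
```

A real system would replace this hand-tuned rule with a classifier trained on thousands of labeled viewing sessions, as the study's researchers did, but the pipeline shape (per-frame expression scores in, an enjoyment judgment out) is the same.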

The system could potentially be harnessed by streaming services like Netflix, says Daniel McDuff, Affectiva’s principal scientist. “You could imagine suggesting TV programs or movies that people could watch,” he told New Scientist. In other words, the days of maintaining you watch “Jersey Shore” ironically may be numbered.

Though this possibility might sound unnerving, consumers may be willing to let their TVs gather emotional data in return for accurate recommendations of new viewing material, says Peter McOwan, a professor of computer science at Queen Mary University of London who was not involved in the study.

“Private companies have already amassed significant information about individuals,” he tells Quartz via email. “Information on emotions would just be another data point to use, information we give over for the services we want when we tick the box ‘to accept terms and conditions.’”

However, the technology is not foolproof, warns McOwan. Emotional reactions are shaped by how an individual’s day went, or by personal context the software cannot read. And of course, facial expressions don’t always give perfect insight into emotions. “If we could read facial expressions accurately to discover the inner truth then we wouldn’t have con men, magicians, poker players and possibly some politicians,” says McOwan.

The legal implications could be thorny as well. If face-scanning technology becomes widespread, new legislation may be needed to regulate when people’s faces are monitored and how that data is shared and used.

Then again, those consumers concerned about face-scanning TVs should have a fairly straightforward recourse. “If you think it’s an invasion of privacy,” says McOwan, “then put sticky tape over your webcam or press the off button.”