
Scientists rigged together a GoPro and heart-rate monitor to measure your racial bias

A demonstrator faces a line of police in front of the Chicago Police Department. (Reuters/Jim Young) "There's something in the air."
By Lila MacLellan

Quartz at Work reporter


Studies into implicit bias and body language have told us that the body is always truthful, even when the person in charge of it isn’t. A doctor may say that she does not treat patients differently based on race, for example, but bodily clues may indicate otherwise.

The ability to “read” those bodily clues, however, has always relied on subjective analysis of interactions. Usually, an expert will watch a video and make claims about what a gesture, pose, or gaze is communicating.

Now a team of computer scientists and psychologists in Italy has developed a seemingly more objective way to read body language, which it has tested in a study of hidden racial bias.

Using a GoPro camera, a Microsoft Kinect motion detector, and a heart-rate monitor, the researchers measured body cues during conversations between two speakers. Specifically, they looked at the distance between the subjects; upper-, middle-, and lower-body movement; and the amount of silence during conversations.

These interactions occurred between students aged 20 to 25 at the University of Modena and Reggio Emilia, where the study was conducted. All of the 32 participants first took an Implicit Association Test to uncover hidden biases about race. (Unlike an explicit test, which asks a person for their thoughts on race, the IAT measures how long a person takes to match specific concepts with particular identities.)
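The IAT's core idea is that bias shows up as a latency gap: people respond more slowly when asked to pair concepts they implicitly consider "incompatible." A simplified sketch of that scoring logic (this is illustrative only, not the study's actual scoring code; the response times below are invented):

```python
# Simplified illustration of IAT-style scoring: bias is inferred from
# how much slower a person is at "incompatible" concept/identity
# pairings than at "compatible" ones, scaled by response variability.
from statistics import mean, stdev

def iat_d_score(compatible_ms, incompatible_ms):
    """Latency difference between blocks, in pooled-standard-deviation units."""
    pooled = compatible_ms + incompatible_ms
    return (mean(incompatible_ms) - mean(compatible_ms)) / stdev(pooled)

# Invented response times (milliseconds) for one hypothetical participant
compatible = [620, 650, 640, 600, 630]
incompatible = [820, 870, 840, 900, 860]

# A larger positive score means a bigger slowdown on incompatible pairings
score = iat_d_score(compatible, incompatible)
```

This mirrors the spirit of the standard IAT D-score: the measure is a relative slowdown, not an absolute reaction time, which is why it can surface associations a person would not explicitly report.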

In the Italian study, each subject held the same conversations, first paired with a white student and then with a black student, both of whom had been given instructions by the scientists. Once the pairs of students were introduced, they were asked to talk about a specified frivolous topic for three minutes. Then they would switch to something that was more associated with race, like immigration policy, for another three minutes. Their interactions were filmed, and their kinetic movements were mapped while their heart rates were measured.

The scientists then fed the data into a computer algorithm designed to find correlations between answers students gave in the IAT and their nonverbal behavior. For example, they found students who held stronger racial prejudices stood further away from their black conversation partners, used their hands to communicate less often (tending to freeze instead), and spoke more often, presumably to fill gaps in the conversation. People who did not score highly on the IAT were more at ease with occasional silence, according to the data.
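The pipeline described above amounts to a supervised classifier: nonverbal features in, IAT label out. A minimal sketch of that idea, assuming made-up features (interpersonal distance, gesture rate, silence fraction) and toy data; the study's real features, labels, and algorithm are not published in this article:

```python
# Hypothetical sketch (not the authors' code): predicting a binary
# "high IAT score" label from nonverbal conversation features using
# a simple logistic-regression classifier trained by gradient descent.
import math

def train_logistic(X, y, lr=0.1, epochs=2000):
    """Fit weights w and bias b by per-sample gradient descent on log loss."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # predicted probability of label 1
            err = p - yi
            for j in range(len(w)):
                w[j] -= lr * err * xi[j]
            b -= lr * err
    return w, b

def predict(w, b, xi):
    z = sum(wj * xj for wj, xj in zip(w, xi)) + b
    return 1 if 1.0 / (1.0 + math.exp(-z)) >= 0.5 else 0

# Toy data: [distance_m, gestures_per_min, silence_fraction]
# Values are invented to mimic the reported pattern: higher-bias
# speakers stood farther away, gestured less, and tolerated less silence.
X = [[1.2, 4.0, 0.05], [1.3, 3.5, 0.04], [1.1, 4.5, 0.06],  # label 1: high IAT
     [0.7, 9.0, 0.20], [0.6, 8.5, 0.25], [0.8, 9.5, 0.18]]  # label 0: low IAT
y = [1, 1, 1, 0, 0, 0]

w, b = train_logistic(X, y)
accuracy = sum(predict(w, b, xi) == yi for xi, yi in zip(X, y)) / len(y)
```

On real data the evaluation would use held-out subjects rather than training accuracy; the sketch only shows the shape of the feature-to-label mapping the researchers' algorithm learns.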

The scientists presented these results at the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing, recently held in Germany. When the algorithm was tested retrospectively against the collected data, it correctly predicted which students had scored highly on the IAT 82% of the time. The researchers also acknowledged that their system requires more controls to isolate whether the subjects were reacting to race or to something else, like a partner's attractiveness or what was actually said in the conversations.

The team is currently testing the technique to identify implicit biases toward people with HIV. In practice, the authors say the system has many possible applications. For example, a border control agent could one day use the approach to spot people who may be dangerous, and schools could use it to identify children with anxiety.
