A simple mental trick can help you figure out who’s telling a lie

Does your unconscious mind see any deception?
Image: Reuters/Jonathan Ernst

Last week, when Senator Martin Heinrich questioned ex-FBI director James Comey during the Senate Intelligence Committee hearing, the New Mexico Democrat suggested that understanding what happened in private discussions between Comey and US president Donald Trump comes down ultimately to which man one chooses to believe. “Do you want to say anything,” he asked Comey, “as to why we should believe you?”

For a brief moment, as his question hung in the air, it was tempting to put aside the reports, the politics, the personal histories of the two men in question, one’s private fantasies about how the investigation could end, and check your instincts: Did this person just lie to us?

It wouldn’t be the most ridiculous approach, according to research in forensic psychology, social psychology, and neuroscience. Several experiments have demonstrated that our gut instincts seem to be better than our reasoning mind at discerning truth-tellers from liars—a trait scientists have also discovered in non-human primates.

For example, in a 2014 study published in Psychological Science, Leanne ten Brinke, then a forensic psychologist at the University of California, Berkeley’s business school, found through a series of experiments that undergraduate students were significantly more likely to guess accurately who was hiding a mock theft and who wasn’t when they made automatic decisions (for instance, using an Implicit Association Test) than when they judged the same individuals through deliberate, direct decisions.

These choices—linking suspects’ faces as they flashed on a screen with words like “untruthful” and “dishonest” or “truthful” and “genuine”—were made instantly; the subjects did not have enough time to explicitly make note of hand gestures or duration of eye contact, or other body language-based methods that have also been studied for telling patterns. “These results provide strong evidence for the idea that although humans cannot consciously discriminate liars from truth tellers,” the authors explained in the study, “they do have a sense, on some less-conscious level, of when someone is lying.”

In other words, although we know from previous studies that, in the absence of hard evidence, a person’s odds of making the right judgment call about whether someone is telling the truth are about as good as chance, that may be true only when we’re consciously thinking about it. If you can train yourself to listen for an inner signal, it will probably point you in the right direction.

More evidence to this effect comes from a 2009 study from the University of Texas at El Paso, published in the Journal of Experimental Psychology, which found that students who were engaged in concurrent tasks, and therefore enduring a heavier cognitive load, were better able to spot people who were telling fibs. For those subjects, being too distracted to fully analyze their target was actually beneficial.

Scientists have also uncovered physiological reactions to deception that hint at this intuitive human capacity. Through a number of studies involving fMRI scans, neuroscientists have detected activity in the amygdala, the part of the brain associated with fear and the emotional reaction to a threat, when someone is judging a person’s intentions to be misleading. And in a study that appeared in Frontiers in Psychology in 2015, and included 191 subjects, researchers discovered that a person’s finger temperature declined over time as they watched a three-minute video of a liar telling a story, whether or not they were instructed to be on guard for deceit. The findings were intriguing, but also mixed: The subjects’ finger temperatures increased as they viewed a truth-teller, as would be expected, but only when they were first told they’d be asked to distinguish between who was being honest and who wasn’t.

It’s possible that humans have an innate ability to sniff out fabrications and recognize honesty because lying and deceiving is so much a part of how we’ve evolved. Researchers believe humans learned to make up fake stories shortly after we began to use language, as National Geographic reports.

Some smart individuals realized that to secure resources or evade an enemy, just making up a fabulous story would be easier than instigating a violent attack. But as much as learning to deceive is a natural part of how we come to understand and survive in our highly social communities (children begin telling white lies out of compassion at around age 7, for example), humans have also benefitted from trusting people and cooperating.

That’s what makes it impossible for us to spot a liar with consistent accuracy. What’s more, as ten Brinke also reports in her 2014 study, it remains unclear whether the evolutionary arms race has favored an implicit skill for knowing when someone is being inauthentic (a possible threat), or being genuine (a possible ally), or both.

The scientists do not offer any practical suggestions for how to put their study results to good use. But according to Malcolm Gladwell, the journalist and author who examined what he called “rapid cognition” in his book Blink: The Power of Thinking Without Thinking, it could be a matter of taking seriously the snap judgements our brains constantly make from thin slices of data. “I think it’s time we paid more attention to those fleeting moments,” he says in a Q & A published on his site. “I think that if we did, it would change the way wars are fought, the kind of products we see on the shelves, the kinds of movies that get made, the way police officers are trained, the way couples are counseled, the way job interviews are conducted and on and on–and if you combine all those little changes together you end up with a different and happier world.”

But Gladwell also acknowledges that acting on hunches can sometimes lead us astray. The many ways that our intuited understanding of a situation can be manipulated or fall victim to cognitive biases are examined even further in Daniel Kahneman’s Thinking, Fast and Slow, the landmark book in which the psychologist explores in detail the dual processes that help us make decisions.

In his testimony, Comey didn’t suggest a gut check, of course, but in answer to the question of whether his word should be believed over Trump’s, he delivered one of the most memorable lines from the testimony. “My mother raised me not to say things like this about myself so I’m not gonna,” he told Heinrich, adding, “I think people should look at the whole body of my testimony.”

Science suggests that in making that assessment, we should give our gut responses some weight, too.