Benjamin Franklin once quipped: “There are three things extremely hard: steel, a diamond, and to know oneself.” Every decision we make, from pinpointing the source of a faint sound to choosing a new job, comes with a degree of confidence that we have made the right call. If confidence is sufficiently low, we might change our minds and reverse our decision. Now scientists are using these choice reversals to study the first inklings of self-knowledge. Changes of mind, it turns out, reflect a precisely tuned process for monitoring our stream of thoughts.
There are two schools of thought on what it means to know oneself, and on how changes of mind play a role. One suggestion is that changes of mind occur because we continue to weigh evidence after a choice has been made, a process called "post-decision evidence accumulation." An alternative idea is that the brain actively corrects its mistakes by engaging additional mechanisms after settling on a course of action; consistent with this view, people with damage to the frontal regions of the brain may be unable to "self-monitor" and catch the errors they have made.
A recent pair of studies, each asking participants to form a rapid series of judgments about what they saw on a computer screen, illuminates how we monitor our internal thoughts and how self-correction occurs.
In one of the studies, researchers from Cambridge University, Columbia University, and New York University asked volunteers to decide whether a patch of flickering dots was drifting to the left or to the right by moving a handle in the corresponding direction; simultaneously, volunteers could indicate confidence in their decision by moving the handle up or down. As soon as participants moved the handle, the dots in question vanished.
Most of the time, the volunteers moved directly to their chosen target—upper right, lower right, upper left, or lower left. But every so often, volunteers changed their choice of direction and level of confidence mid-move. What was happening? By comparing these patterns of behavior to the predictions of a computer model, the researchers found evidence that we change our minds from the bottom up. Even when the dots were no longer visible on the screen, subjects continued to accumulate information in their neural pipeline, leading them to revise their decision, their confidence, or both.
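The bottom-up account can be illustrated with a toy simulation, in the spirit of drift-diffusion models of decision-making: noisy evidence accumulates to a threshold, a choice is made, and evidence still in the processing pipeline keeps arriving afterward, occasionally pushing the total past zero and reversing the choice. This sketch is not the researchers' actual model, and all parameter values here are illustrative assumptions.

```python
import random

def simulate_trial(drift=0.2, noise=1.0, threshold=2.0,
                   post_steps=30, dt=0.1, seed=None):
    """One toy trial: decide 'left' vs 'right' from noisy evidence,
    then keep accumulating pipeline evidence after the choice."""
    rng = random.Random(seed)
    evidence = 0.0
    # Accumulate noisy evidence until a decision threshold is crossed.
    while abs(evidence) < threshold:
        evidence += drift * dt + rng.gauss(0, noise) * dt ** 0.5
    initial_choice = "right" if evidence > 0 else "left"
    # Evidence already in the pipeline continues to arrive even
    # after the stimulus has vanished and the choice is made.
    for _ in range(post_steps):
        evidence += drift * dt + rng.gauss(0, noise) * dt ** 0.5
    final_choice = "right" if evidence > 0 else "left"
    changed_mind = final_choice != initial_choice
    # The distance from zero serves as a crude proxy for confidence.
    return initial_choice, final_choice, changed_mind, abs(evidence)

# Over many trials, a minority of choices reverse after the decision,
# and reversals tend to correct initial errors more often than not.
results = [simulate_trial(seed=i) for i in range(2000)]
reversals = sum(r[2] for r in results)
```

Because the drift here is positive (favoring "right"), most final choices land on the correct side, while the residual pipeline evidence produces occasional changes of mind in both decision and confidence.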
A second team, from Trinity College Dublin and Leiden University in the Netherlands, has suggested another mechanism for how this works: from the top down. Here, volunteers were outfitted with electroencephalography (EEG) caps to measure the brain’s electrical activity, and then asked to press a button each time a color word such as “red” appeared on a computer screen. However, they were instructed not to press the button if the same word appeared twice in a row, or if the meaning of the word and the font color matched (for example, “red” written in red text). This is a difficult task to perform quickly, and mistakes were made on 43% of the trials on average. To assess self-monitoring, the volunteers pressed a separate button if they noticed they’d made an error.
So what did the researchers find? A signal from the centroparietal region of the brain, known for integrating sensory information, ramped up to a threshold level after a choice was made, indicating that the same mechanism we use for perceiving external events is also engaged when reflecting on internal decisions. Intriguingly, the researchers also detected theta waves in the frontal cortex whenever participants noticed their errors. This finding suggests that a “quick and dirty” error signal in the frontal cortex can trigger the continued accumulation of evidence to work out whether a change of mind is warranted.
Both studies confirm the importance of evidence accumulated after a decision has been made, but diverge on the source of this evidence. The Cambridge group suggests that an incoming stream of evidence is continually accumulated both before and after a choice has been made. By contrast, the Trinity College group suggests that top-down signals—information that feeds back to influence earlier stages of processing—provide an additional input to enable changes of mind.
Differences in the design of the two studies might limit the extent to which we can compare their findings, suggesting that future research should combine the approaches for more definitive results. For instance, it would be of interest to monitor EEG during experiments like those performed at Cambridge to establish the relative contributions of bottom-up and top-down influences on changes of mind.
Psychologists have long been interested in metacognition—the ability to reflect on and evaluate our own thoughts and behavior. The neural basis of metacognition is likely to be complex and multi-faceted, but the studies in question reveal that simple decision-making provides a good starting point. More than 200 years after Franklin’s death, we now know that steel and diamonds are constructed from simpler atomic and molecular building blocks. By studying the dynamics of simple decisions, we might eventually uncover the components of his third hard substance, self-knowledge.