News

We Learn Faster When We Aren’t Told What Choices to Make

In a perfect world, we would learn from success and failure alike. Both hold instructive lessons and provide needed reality checks that may safeguard our decisions from bad information or biased advice.

But, alas, our brain doesn’t work this way. Unlike an impartial outcome-weighing machine an engineer might design, it learns more from some experiences than others. A few of these biases may already sound familiar: A positivity bias causes us to weigh rewards more heavily than punishments. And a confirmation bias makes us take to heart outcomes that confirm what we thought was true to begin with but discount those that show we were wrong. A new study, however, peels away these biases to find a role for choice at their core.

A bias related to the choices we make explains all the others, says Stefano Palminteri of the French National Institute for Health and Medical Research (INSERM), who conducted a study published in Nature Human Behaviour in August that examines this tendency. “In a sense we have been perfecting our understanding of this bias,” he says.

Using disarmingly simple tasks, Palminteri’s team found choice had a clear influence on decision-making. Participants in the study observed two symbols on a screen and then selected one with the press of a key to learn, through trial and error, which image gave the most points. At the end of the experiment, the subjects cashed in their points for money. By careful design, the results ruled out competing interpretations. For example, when freely choosing between the two options, people learned more quickly from the symbols associated with greater reward than those associated with punishment, which removed points. Though that finding resembled a positivity bias, this interpretation was ruled out by trials that demonstrated participants could also learn from negative outcomes. In trials that showed the outcomes for both symbols after a choice was made, subjects learned more from their chosen symbol when it gave a higher reward and when the unchosen one would deduct a point. That is, in this free-choice situation, they learned well from obtained gains and avoided losses.

That result looked like a confirmation bias, with people embracing outcomes—positive or negative—that confirmed they were right. But there was more to it. The experiments also included “forced choice” trials in which the computer told participants which option to select. Here, though the subjects still pressed keys to make the instructed choices, confirmation bias disappeared, with both positive and negative outcomes weighted equally during learning.

This impartiality might seem optimal, yet the learning rates were slower in the forced-choice situation than in the free-choice one. It is as though the participants were less invested in the outcomes, showing ambivalence about learning from them, somewhat like a child woodenly practicing scales on the piano to please a parent.
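The free- versus forced-choice asymmetry maps naturally onto a simple reinforcement-learning rule. Below is a minimal sketch, not the authors' actual model or code, of a value update whose step size depends on whether an outcome confirms the agent's own choice; the three learning-rate values are illustrative assumptions, not parameters fitted in the study.

```python
# Illustrative learning rates (assumed, not from the paper).
ALPHA_CONFIRM = 0.3     # outcome confirms the free choice (chosen option
                        # pays off, or the rejected one would have lost)
ALPHA_DISCONFIRM = 0.1  # outcome contradicts the free choice
ALPHA_FORCED = 0.2      # forced choice: one symmetric rate, no bias

def update_value(value, outcome, chosen, forced=False):
    """Update the estimated value of one option after seeing its outcome.

    value   -- current estimate of the option's value
    outcome -- observed payoff (+1 for reward, -1 for punishment)
    chosen  -- True if the agent picked this option
    forced  -- True if the computer dictated the choice
    """
    error = outcome - value  # prediction error
    if forced:
        rate = ALPHA_FORCED
    else:
        # A "confirming" outcome is good news about the chosen option
        # or bad news about the rejected one.
        confirming = (chosen and error > 0) or (not chosen and error < 0)
        rate = ALPHA_CONFIRM if confirming else ALPHA_DISCONFIRM
    return value + rate * error
```

Under this rule, the same winning outcome moves the value estimate farther when the option was freely chosen (rate 0.3) than when it was forced (rate 0.2), and a disconfirming outcome moves it least of all (rate 0.1), mirroring the faster, biased learning observed under free choice.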

Because the confirmation bias arose only during the free-choice situations, the authors dubbed it “choice-confirmation bias.” The tendency persisted in both poor and rich conditions, when rewards were scant or abundant. “Our human subjects were not capable of adjusting the bias as a function of the environment,” Palminteri says. “It seems to be hardwired.”

This observation means the brain is primed to learn with a bias that is pegged to our freely chosen actions. Choice tips the balance of learning: for the same action and outcome, the brain learns differently and more quickly from free choices than forced ones. This skew may seem like a cognitive flaw, but in computer models, Palminteri’s team found that choice-confirmation bias offered an advantage: it produced stabler learning over a wide range of simulated conditions than unbiased learning did. So even if this tendency occasionally results in bad decisions or beliefs, in the long run, choice-confirmation bias may sensitize the brain to learn from the outcomes of chosen actions—which likely represent what is most important to a given person.
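The modeling claim can be illustrated with a toy simulation. The sketch below is not a re-creation of the authors' simulations; the reward probabilities (0.7 versus 0.3), the 10 percent exploration rate, and the learning-rate values are all assumptions chosen for illustration.

```python
import random

random.seed(0)

def simulate(alpha_confirm, alpha_disconfirm, trials=500, runs=200):
    """Average fraction of trials on which a learner picks the truly
    better of two options in a noisy two-choice task."""
    correct = 0
    for _ in range(runs):
        q = [0.0, 0.0]  # value estimates for the two symbols
        for _ in range(trials):
            if random.random() < 0.1:        # occasional exploration
                choice = random.randrange(2)
            else:                            # otherwise exploit the best
                choice = 0 if q[0] >= q[1] else 1
            p_win = 0.7 if choice == 0 else 0.3  # option 0 is better
            outcome = 1 if random.random() < p_win else -1
            error = outcome - q[choice]
            # A confirming outcome (good news about the chosen option)
            # gets the larger learning rate.
            rate = alpha_confirm if error > 0 else alpha_disconfirm
            q[choice] += rate * error
            correct += choice == 0
    return correct / (trials * runs)

biased = simulate(0.3, 0.1)    # choice-confirmation-biased learner
unbiased = simulate(0.2, 0.2)  # impartial learner
```

Comparing the two calls shows how often each learner settles on the better option; the asymmetric rates inflate the value of the option the agent keeps choosing, which is the kind of self-reinforcing preference the study's fuller simulations found to be stabler across a wide range of conditions.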

“The paper shows that this bias isn’t necessarily irrational but actually a useful mechanism for teaching us about the world,” says Philip Corlett of Yale University, who was not involved in the study. He studies the origins of delusional thinking and agrees that an individual’s perception of control in a situation can shift their interpretation of the events around them. “Feeling as though you are the architect of the outcomes you experience is powerful and certainly would lead you to strengthen beliefs about those contingencies much more strongly,” he says.

The role for choice found here suggests that our sense of control in a situation influences how we learn—or do not learn—from our experiences. This insight could also help explain delusional thinking, in which false beliefs remain impenetrable to contrary evidence. An outsize feeling of control may contribute to an unflagging adherence to an erroneous belief.

Delusions can be a hallmark of psychosis, in which they may involve extreme beliefs about alien abduction or being a god. Milder delusionlike thinking also touches otherwise healthy people, such as a sports fan with a superstition about wearing a lucky shirt to ensure a team’s win. More harmfully, the current coronavirus pandemic has wrought some delusions of its own, such as one that holds that mask wearing causes sickness.

So a false belief remains fixed, and any outcomes that contradict it are not accepted by the brain. If choice is the point of reference that governs our learning style (with or without confirmation bias), then maybe something about choice or an inflated sense of control pushes people toward delusions. Perhaps individuals with delusions are choosing to have particular experiences that support a false belief and choosing to interpret information in a way that supports it. This possibility has not been tested. Open questions for future research, however, include how beliefs are updated in a person with delusions and whether that process differs when choices are forced or made freely. To help individuals with delusions, the current findings suggest, it may be more effective to examine their sense of control and choices than to try to convince them with contradictory evidence, an approach that has repeatedly failed to work.

Another question raised by this research is: What might influence a person’s sense of control? It may be an inherent feature of an individual’s personality. Or it could be more pliable, as suggested by a recent study of people in the military in Belgium published in Nature Communications. The paper reported a greater sense of control among senior cadets, who are further along in their officer training and give orders, compared to privates, who obey them. The latter individuals’ sense of control, also called agency, was equally diminished in both free-choice and forced-choice situations. “They don’t experience agency, even when they’re free to choose what to do, which should not be the case,” says study leader Emilie Caspar of the Free University of Brussels (ULB).

Whether a diluted feeling of control affected those subjects’ learning was not studied, and current work is examining whether this mindset follows participants beyond a military setting. But if a person’s sense of control influences the strength of their choice-confirmation bias, it is interesting to consider the impact of 2020—a year battered by the pandemic and economic and political uncertainty—on an individual’s cognition.

“There’s this general sense that the rules don’t apply anymore, and that is really unmooring for people and can lead to unpredictable, irrational behavior,” says Corlett, who recently conducted a not yet published preprint study that tracked changing levels of paranoia before and during the global spread of COVID-19.

It’s not clear whether the newfound choice-confirmation bias could inform public health messaging during a pandemic. For example, maybe voluntary mask-wearing should be encouraged and coupled with rewards for choosing to put on a face covering and occasional punishments for not doing so.

Palminteri says it is hard to extrapolate from his experiments to the messy, complicated and somewhat removed contingencies of mask wearing. But the stark bottom line is that biased thinking runs deep in the human psyche. “Even when the stakes are so high, you may think humans would behave rationally,” he says. “But that’s far from clear.”


