How Much Information Is Enough?

By Chuck Dinerstein, MD, MBA — Jun 27, 2019
Why do patients seek a second opinion? Even when making an "evidence-based" decision, our unconscious bias towards one option or another alters how we judge the evidence -- and how long we search.

Physicians often have to deliver unwanted news; very few patients came into my office wanting, or hoping for, surgery. And there were always a few who wanted a second opinion, not because of my demeanor, but because they didn’t like what they heard and hoped to hear something different: less surgical, more pill or watchful waiting. I always encouraged them to get that second opinion, because part of the underlying problem was that they didn’t trust me, and without the magic of trust, surgery becomes a far more difficult treatment for both patient and physician.

With that as background, it should make sense that a study on how we accumulate evidence in making a decision caught my eye. I am sharing it with you because I need look no further for proof of my belief that second opinions are about searching for the answer a patient desires, not about poor communication or judgment.

The researchers set up a scenario involving images of TVs and telephones on a conveyor belt, asking the participants [1] whether they were looking at the output of a company making predominantly one or the other. The longer you watch, the more likely you are to make the correct choice. Participants were rewarded or penalized based on how accurately they determined which factory they were viewing. To make one scenario more desirable than the other, participants were “invested” in either the TV or the phone factory: they received an additional reward if they had “invested” in the type of factory they were viewing and a penalty if they had invested in the alternative. But the two incentives were separate; the accuracy payoff was unrelated to their desire.

As with any decision, there are tradeoffs. More information makes a choice more accurate but requires more time to gather the data. And it is not difficult to see how a desire for one option over another might perturb that tradeoff; as the authors note, no one asks a doctor for a second opinion when the news is good.

Participants wanted the more desirable outcome

“In sum, the results show that participants were more likely to believe they were in a desirable factory. They gathered less samples before making these judgments and required a smaller proportion of the samples to be consistent with said belief.”

  • Both groups judged which factory they were in by the number of TVs and phones passing by on the screen. Whether they were invested or not, their accuracy was the same.
  • But participants wanted that bonus, the good outcome; they judged the factory to be the one they had invested in, collecting the investment bonus, significantly more often than they actually encountered it.
  • When the participants were invested, it took fewer images and less time to reach their judgment. When they were not invested, it took more images and more time to reach the same conclusion.

Just as I suggested in my second paragraph, we stop looking when we find what we want. The researchers were interested in whether this bias was due to an initial tilt toward the invested "good outcome" or to the rate at which evidence accumulated, so they created a mathematical model and compared its predictions to the participants' actual decisions.

The model that best predicted the participants' choices began with a bias toward the desirable outcome, and confirmatory evidence was weighted more heavily than evidence that refuted the desired state. In real life, these biases may be, and often are, unconscious.
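To make those two biases concrete, here is a minimal sketch in Python of a biased evidence-accumulation (sequential sampling) model of the kind the authors describe. It is not their code; the parameter names and values (start_bias, w_confirm, w_disconfirm, threshold) are illustrative assumptions, chosen only to show how a head start plus asymmetric weighting leads to choosing the desired option more often, and after fewer samples.

```python
import random

def biased_accumulation(p_desired=0.5, start_bias=0.3, w_confirm=1.2,
                        w_disconfirm=1.0, threshold=3.0, rng=None,
                        max_samples=200):
    """Toy biased evidence-accumulation model (illustrative parameters only).

    p_desired    - true probability that a sample favors the desired factory
    start_bias   - head start given to the desired option (initial bias)
    w_confirm    - weight on evidence supporting the desired outcome
    w_disconfirm - weight on evidence against it (asymmetric weighting)
    threshold    - net evidence needed to commit to a judgment
    """
    rng = rng or random.Random()
    evidence = start_bias                  # accumulation starts tilted toward the desired factory
    for n in range(1, max_samples + 1):
        if rng.random() < p_desired:
            evidence += w_confirm          # confirmatory evidence counts for more
        else:
            evidence -= w_disconfirm       # disconfirmatory evidence counts for less
        if evidence >= threshold:
            return "desired", n            # judge it to be the invested factory
        if evidence <= -threshold:
            return "undesired", n
    return "undecided", n

def summarize(label, sample_counts):
    if sample_counts:
        print(f"{label}: {len(sample_counts)} of 1000 runs, "
              f"mean samples before deciding {sum(sample_counts) / len(sample_counts):.1f}")

# Even with the items split exactly 50/50, the head start plus the heavier
# weighting of confirmatory samples makes the "desired" judgment more
# frequent and, typically, reached after fewer samples.
results = [biased_accumulation(rng=random.Random(seed)) for seed in range(1000)]
summarize("desired", [n for choice, n in results if choice == "desired"])
summarize("undesired", [n for choice, n in results if choice == "undesired"])
```

Under these assumed settings, the toy accumulator reproduces the qualitative pattern in the bullet points above: it more often concludes it is watching the invested factory, and it stops sampling sooner when it does.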

It is rare for a patient to have "no skin in the game" when making a medical decision; there is always a bias at work, and I would suggest that a second opinion is one way to get more, and "better," evidence that you were right all along. But the real power of this study comes from generalizing beyond medicine to decisions with a more apparent emotional component. People begin from a biased position, and it is not so much that they attend more to evidence confirming their view as that they weigh confirmatory evidence more strongly and settle for fewer proofs more quickly.

Keep the study in mind when you look at the fights over vaccinations or glyphosate, irrespective of which side of the argument you take. It turns out that evidence-based decisions may be biased by which evidence we believe and by how much information we need to resolve our uncertainty.


[1] The 84 participants were recruited through Amazon’s Mechanical Turk; their median age was about 34, and each completed 80 trials.

Source: Evidence accumulation is biased by motivation: A computational account. PLOS Computational Biology. DOI: 10.1371/journal.pcbi.1007089

