Often Wrong, Never In Doubt

By Chuck Dinerstein, MD, MBA — Dec 03, 2019
For many years, my family motto has been “often wrong, never in doubt.” Overconfidence is a cognitive problem, present to greater and lesser degrees in us all. And it grows in the presence of two conditions.

Scientists studying cognition report that we are frequently overconfident when considering small possibilities – we think they are more significant than they are, at least mathematically. Overconfidence is not necessarily good or bad; a belief in a slight chance of recovery is called hope. An “abundance of caution” may prompt us to be more prudent in the face of a small but existential risk.

Overconfidence can be particularly problematic when two conditions prevail. First, when the information we are assessing is noisy: there is some signal of truth, but it is accompanied by a degree of doubt. Second, when this same noisy informational environment provides weak feedback, that is, feedback that comes after a significant delay or that is not overly convincing. Many of the “scientific” positions that are “controversial” and attract strident, polarized views often meet those two criteria.

For an older example, consider the hundred-year history of smoking’s effect on our health. Lung cancer, long before it became the most common cause of cancer deaths, was so rare that physicians gathered around to see this odd pathology. And in the early days of the twentieth century, many other causes of death hid the rising tide of lung cancer. Additionally, smokers do not develop lung cancer for many years after they start smoking; the harm of smoking is an excellent example of very delayed feedback. Over the next thirty or forty years, the persistent signal of lung cancer became more evident, there was less noise, and we had longitudinal data that made the feedback stronger. Tobacco companies facing financial peril did not and could not repress the growing evidence, but they cast doubt on the conclusions by framing the evidence as not overly convincing. By casting doubt through every available media source, they sought to reinforce the truth of their claims by shouting louder and more frequently than their opponents.

One would hope that disseminating information more broadly and cheaply would serve as a corrective; the Internet could be counted upon to reduce the distortions of noise and weak feedback. But, if anything, it has proven to be a more effective, pliable way to increase the noise and spread the doubt. Searching for information on the net has been likened to drinking water from a fire hose: our first precondition for overconfidence, little signal, much noise. To use an old meme, when you use the net, no one knows you’re a dog – everyone can present themselves as an expert. And knowing that makes every report a little more doubtful; it further weakens the feedback.

Science is, at its heart, about discovery, but the media that communicates science to us is often about advocacy. Everything is a sales pitch. The study funded by Big Tobacco, Big Climate, or Big Natural is readily identified. Still, government-funded research is pitched to what is politically fundable, and journals and foundations are pitched attention-getting results. One consequence of such a system is what Stephen Colbert characterized as truthiness, our belief in something because it feels right; another way we share our overconfidence. The error for us lies not in the overconfidence; after all, that is human behavior. It lies in confusing the science of discovery with the sales pitch, and it leaves us talking past one another rather than engaging in the discussion that is science.

Dr. Charles Dinerstein, M.D., MBA, FACS is Director of Medicine at the American Council on Science and Health. He has over 25 years of experience as a vascular surgeon.
