Viewpoint

Fake Science and Shared Decision-Making

Mosley, Mark, MD

doi: 10.1097/01.EEM.0000559985.82596.3b

The thing that has disturbed me most during my career is how bright, capable, reasonable people can embrace, almost religiously, unfounded beliefs that can lead to dangerous results. How does a mother believe that her breasts cannot produce enough milk for her baby? Why does a patient with a post-graduate degree believe that vaccines cause autism? Or that a person without celiac disease should eat gluten-free to make himself healthier?

Why does a nurse, midlevel provider, or physician believe that homeopathy, infant massage, therapeutic touch, essential oils, or other alternative medicines offer more benefit than placebo? How can they lack skepticism toward a “natural” medicine market that is unregulated, unsafe, and extensively studied and disproven?

Why does a midlevel provider or physician who once missed a diagnosis (or simply heard a speaker describe a missed one) react by ordering a particular test on everyone with that complaint, convinced that he will otherwise be sued or written up by administration?

We in medicine are quick to say we had a bad night because the moon was full. And how quickly did we all jump on the bandwagon that breast implants caused rheumatologic diseases resulting in the largest medical product liability settlement in history ($4 billion) in the absence of scientific proof! (Implants were later proven not to be a cause.) (Los Angeles Times Sept. 2, 1994; https://lat.ms/2AODBzs; Semin Immunopathol 2011;33[3]:287.)

Why are we so ready to say there is an opioid crisis to justify withholding opioids from otherwise normal patients with acute, objective pain, without even knowing the science of the opioid crisis in the United States?

These are fascinating questions we should all ask. Not one of us is immune to these errors in logical thinking. We endorse these errors because our memory is imperfect, shaped by our desires. Truth is complex. We crave understanding because it provides security. We make truth simple to understand even when it is not. Belief, for many, is a simple truth that fits our desires. We would rather defend our security blindly than look openly at uncertainty. And we underestimate what we do not know. We underestimate chance. Science is an attempt to explain a small, specific portion of uncertainty, but most of us do not reach conclusions as statisticians; we are more convinced by an emotionally powerful story.

Even when confronted with seemingly indisputable evidence or well-done scientific studies, we interpret them through our desires and beliefs in spite of the evidence (confirmation bias). Politicians already know this. This is why they rarely give statistics but instead tell a story about a person who was killed or harmed by the policy the politician wants to change. It doesn't matter if this represents the actual science or truth.

The media do the same. Why inform with numbers when you can show a picture with a caption? It is because of this that many wrongly believe poisoning is more common than tuberculosis, homicide is more common than suicide, leukemia is more common than emphysema, and that opioids kill more people than alcohol. They don't: about 18,000 deaths a year for opioids compared with 88,000 for alcohol. (Thomas Kida, Don't Believe Everything You Think: The 6 Basic Mistakes We Make in Thinking. Amherst, NY: Prometheus Books, 2006.)

This type of thinking is called the availability heuristic: What is reported more frequently appears to be more important. A single good story that captures the emotions makes it more available to memory. To discount any science that may disagree with our beliefs, we say silly things like “you can prove anything you want with a study.” This is often a criticism of thinking statistically (logically) because we want to hold to our way of believing emotionally. The truth is that while you can make a study of anything you want, and even write anything you want in the conclusion, you cannot prove anything you want.

There is a hierarchy of science: Its underbelly swims in a pool of murky, poorly done science; sometimes it steps up and crawls among small, inadequately powered observational studies; and on rare gleaming occasions, it climbs to the summit of a large, well-controlled, seemingly unbiased study. Most good statisticians would not have a problem distinguishing the bad science from the good. The sad fact is that the majority of our studies are poor and inadequate science.

But there is good science. We send people to the moon based on good science. We plan city-wide electrical and water systems based on good science. And, yes, we devise vaccines based on good, reproducible science, whatever social media may say to the contrary.

What are we supposed to do with this information in the ED? First, we recognize the fear of the patient and family and express empathy. When we ask what they fear most, name it, and even allow it its place, the fear loses some of its power to derail a reasonable approach.

We also should not blindly submit to patients' fears if the science does not support them. That happens every time we run tests though we doubt the patient has cancer, when we x-ray an arm though we know it is not broken, and when we order a serum test to prove a patient is pregnant because she “doesn't believe” urine pregnancy tests are accurate. We are (or should be) the experts in medical statistics for common ED conditions, and we should communicate the chances of benefit and harm from good science in understandable terms to provide guidance.

Bring together the emotions and beliefs of the patient (values) with the understanding and practice of the physician (science). Marry the fear of uncertainty with the reassurances of statistics, and create a shared decision as well as a backup plan. (“If this doesn't work, please ...”)

We as physicians must be willing to do the same within our own minds: to take the fears of lawsuits, punishment by the hospital, dissatisfaction of the patient, bad “quality” metrics, or just the fear of being wrong, and hold them in tension with the professional ethics of following the best evidence for the sake of the patient, embracing the reassurance that the patient shares your responsibility and agrees with your plan. Your heart and your head should come together in a shared decision.


Access the links in EMN by reading this on our website, www.EM-News.com.

Comments? Write to us at emn@lww.com.

Dr. Mosley is the medical director for student and resident education at Wesley Medical Center in Wichita, KS.

Copyright © 2019 Wolters Kluwer Health, Inc. All rights reserved.