Is being a doctor even important anymore? Does that matter? The picture of the beaten face of David Dao, MD, was plastered all over social media earlier this year. In an incident that famously became a mismanaged public relations nightmare for United Airlines, the doctor refused to give up his seat on an overbooked flight. His reason: He was a doctor, and he needed to see patients the next day.
The social media commentary was quite insightful. After wading through the torches and pitchforks condemning the airline's abuse of the passenger, there was an underlying sentiment that I found very interesting. Some questioned why his status as a doctor was relevant at all, and called for a change to the headlines. In fact, some dismissed his intention to sue the airline because “he's a doctor; he's rich already.”
You should always take the comments on Facebook and Twitter with a truckload of salt, but reading through the polarizing and fringe remarks can sometimes reveal a hidden sentiment in reality. My ED badge, for instance, doesn't say “doctor” or “physician” under my name; it says “provider.” The doctors' parking lot at my hospital is being demolished to build a new wing. National Nurses Week is celebrated with events and catering, while we got a discount on our lunch for National Doctors' Day. Banks are removing the doctor's loan clause from their mortgage offers, and being a doctor doesn't get you out of jury duty anymore.
These are only some of the many threads of evidence pointing to a shift in national consciousness. In fact, the incumbent Surgeon General is not a physician but a nurse who completed a doctorate. This paradigm shift from doctors as glorified deities of society to just one of many public servants can be jarring for those who spent half their lives sacrificing themselves for the promise of god-like prestige. I believe, however, that this is a blessing in disguise.
For far too long, doctors were considered high-class aristocrats, handsome male demigods who were to be respected and revered. Needless to say, this is quite an alluring incentive for many to pursue this career at a young age. Many immigrant families tell their children to work hard, use their intellect, and forge a path to achieve the ultimate goal: becoming a doctor. There was supposed to be a prize at the end of the road, a glorified ascension to the upper echelons of society.
But here we are. In many Americans' eyes, we make way too much money, we're part of a big conspiracy to poison everyone with vaccines, and we're abusive to nurses (although I've largely found the opposite to be true). It's a rude awakening, a sort of fall from grace for the profession. I've talked to many older physicians. It's not the decreasing pay, the increasing corporate mandates, or even the electronic medical record boom that is making them leave medicine. It's the lack of respect, the fading prestige.
I say we evolve and learn from our mistakes. We, as a profession, created this public image of ourselves as corrupt and arrogant, flaunting our big houses and nice cars for the public to see. I predict we're headed for the other side of the pendulum in the near future. Perhaps that's the distillation the medical profession sorely needed. Fewer parents are telling their children to become doctors. The money and prestige just aren't there anymore. Perhaps we can refocus and paint a new public image for ourselves, one where we will be regarded less for our social status and more for our social worth.

Copyright © 2017 Wolters Kluwer Health, Inc. All rights reserved.