The belief is more or less prevalent that the powers of observation so markedly developed in our predecessors have, to a large extent, become blunted in us, owing to the employment of instrumental aids to exactness, and the art of medicine consequently has always adopted them with considerable reluctance.
—Harvey Cushing in the Boston Medical and Surgical Journal, 1903
In short, bedside skills have deteriorated as the available technology has evolved.
—Abraham Verghese in the New England Journal of Medicine, 2008
Lamenting lost clinical skills is possibly one of our profession's oldest pastimes, dating back centuries, if not millennia. In a recent article, Verghese1 depicted medical residents examining patients from “bunkers” filled with glowing monitors and lab readouts, long ago having lost their bedside skills and the “joy, excitement, intellectual pleasure, pride, disappointment, and lessons in humility that trainees might experience by learning from the real patient's body examined at the bedside.” He recommends a return to the bedside and a renewed emphasis on physical diagnosis skills in medical education and training.
It is instructive to consider just how old this genre is. Well over a century ago, the American physician S. Weir Mitchell2 expressed his concern that physicians risked becoming “dementalized” by overreliance on their “instrumental aids,” and he lamented the days when “the erudite touch was more uniquely advantageous than it is to-day.” Mitchell expressed this fear in 1891, five years before Roentgen introduced his X-ray.
The basic elements of our physical exam—including percussion and auscultation—date back to the time of Hippocrates. Galen, practicing in the second century AD, was a firm believer in accurate observation and practical skills. However, over the ensuing centuries, this emphasis disappeared, and observation was generally reduced to reading what Galen had written. The European Renaissance brought a renewed emphasis on experience and observation, yet then-prevailing medical theory and social mores left little room for physical examination. A confluence of events centered on Paris at the turn of the 19th century (the rise of pathologic anatomy and hospital medicine, the Industrial and French Revolutions, and the Napoleonic Wars) culminated in René Laennec famously rolling up a notebook of paper in 1816, calling it a stethoscope, and giving birth to the modern era of physical diagnosis.
And we have been plagued by technology dependence and decaying clinical skills ever since. For almost 200 years, there has appeared a litany of laments about reliance on gadgets and gimmicks, idle toys and blunted powers of observation, poorly trained residents and the deterioration of the doctor–patient relationship.
Medicine and Technology: A Love/Hate Relationship
With the generally uncritical and rapid uptake of new technology by the medical profession—nowhere so much as in the United States—it seems odd to suggest that we are uncomfortable with our instrumental aids. But much like the society in which it is embedded, our profession is deeply ambivalent about its technology. We are both enamored of our new devices and resistant to change. Laennec himself didn't see much use in the new technique of pulse counting, and Pierre Louis, the father of the “numerical method,” didn't see much use for Laennec's stethoscope. Today, at the same time that we marvel at the miracles of modern technology, we talk of technological imperatives and technology run amok. It was, after all, Eric Cassell,3 the physician, writing in the late 20th century, not Mary Shelley, the Romantic, writing in the early 19th century, who observed that, like the sorcerer's broom, “technologies come into being to serve the purposes of their users, but ultimately their users redefine their own goals in terms of the technology.”
A recurring image—like Verghese's “bunker”—has been that of the distancing or alienation of doctor from patient that has resulted from the imposition of modern technology between the two. But what distance could be greater than those first few inches between Laennec's ear and the buxom woman, whose chest allegedly stimulated his invention, forever changing medical practice? When Harvey Cushing brought back his sphygmomanometer from Italy in 1901, it met with a mixed reaction. Just as contemporary authors voice concern about imaging technologies like MRI and PET scans, physicians at the time felt that Cushing's device would “intervene between patient and doctor” and “dehumanize the practice of medicine.”4
Concerns about technology are not new. Nor are they unwarranted. It is generally agreed that new technology—imaging technology in particular—is a primary cause of ever-increasing health care expenditures. And it is not just about costs; computed tomography (CT), for example, imparts radiation doses hundreds of times those of standard radiographs. Some have estimated that radiation from CT may come to account for 1.5% to 2% of all cancers in the United States.5
How to extricate ourselves from the “problem of technology” is the question. Is a renewed emphasis on the physical exam the answer? More time at the bedside with master clinicians, percussing out liver spans and palpating spleens? It is a laudable goal. But a decade into the 21st century, the era of handheld ultrasound fast upon us, is it a realistic one?
It is perhaps worth noting that the exam skills of even today's most seasoned examiner pale in comparison with those of earlier eras. Laennec,6 for example, described bronchophony, pectoriloquy, and egophony, terms still in current use. However, he also described the moist crepitous rhonchus, the dry crepitous rhonchus, the dry sonorous rhonchus, and the dry sibilous rhonchus—terms, and sounds, that few examiners would recognize today. Likewise, while many of us still percuss for resonance, few distinguish vesiculotympanitic resonance from cracked-metal resonance, as did Austin Flint7 later in the 19th century.
Teaching Future Physicians About Technology
In an essay entitled “Technology and the eclipse of individualism in medicine,” Stanley Joel Reiser8 described how specialization and technology were removing clinical judgment from individual physicians' hands. As an antidote, he advocated the systematic training of students and residents in the use of technology. A quarter-century later, though, students and trainees still receive little formal training in technology's use and application. Although most do learn about diagnostic test interpretation, screening, and reading randomized trials, they receive little training, for example, in comprehensive health technology assessment—addressing issues such as a given technology's economic, organizational, or societal impact; the role that noncognitive factors play in technology acceptance and use; the role of market forces, promotional pressures, and financial incentives; or the role of government and regulation. In fact, among the 58 topic areas surveyed in the Liaison Committee on Medical Education's 2008 medical school questionnaire,9 the word “technology” does not appear.
At the same time, in the absence of a formal curriculum, irrational and inappropriate practices are passed down informally from faculty and peers to trainees and students. In this way, future physicians learn to use technology “defensively,” to alleviate anxiety about uncertainty and ambiguity, to satisfy the “technological imperative,” and to satisfy the demands of their patients and perhaps their attending physicians.
Although there is admittedly little room for more content in the already overcrowded undergraduate and graduate curricula, few areas could be considered more important for the training of future physicians. Indeed, there can be little hope of improving the quality of patient care, reducing medical error and harm to patients, or reforming our health care system without a better understanding and application of technology by its practitioners.
And physical diagnosis? The clinical exam is itself a technology, and so it belongs in this curriculum. But it is just a part. Like other technologies, it should be taught systematically, evaluated rigorously, and assessed for cost-effectiveness and social utility. Should we spend more time at the bedside? Certainly. I, for one, couldn't imagine rounding from the bunker. But, as Wachter10 advises, we should spend this time not divining for ascitic fluid (ultrasound is better) but, instead, talking to our patients. Taking a history. Discussing prognosis. These skills, he notes, won't soon go out of style, and they “highlight the patient-as-person and physician-as-humanist more than sticking a tuning fork on a forehead ever could.”
Students and trainees must learn to use all technology appropriately and effectively. They do need to be able to distinguish a distended neck vein from a carotid artery. But they will also need to know when—and when not—to order an echocardiogram or CT angiogram. By better training future physicians in the use and assessment of technology, we will be better equipped to remedy the ills facing our patients as well as our health care system.
In short, it's time to put down our pleximeters and move on.
The author wishes to thank James Goodman for his helpful comments.