In 2006, Michael Mastromarino, owner of Biomedical Tissue Services Ltd., was charged with stealing and selling body parts from 1,077 cadavers, including the bones of the late Masterpiece Theatre host Alistair Cooke. Extensive publicity ensued, and more than a dozen bodies exhumed for investigation were found to have plastic tubes replacing their missing bones. Newspapers have since reported cases of hepatitis C and HIV transmitted to transplant recipients through the stolen tissue, and hundreds of related lawsuits across the country are pending.1 Although this is not the first scandal involving body donation, it may be the biggest, and its potential repercussions are enormous. Willed-body programs have only recently become stable enough to supply America's medical schools with cadavers, and their success requires continuing trust from the public. The negative publicity surrounding the medical establishment's disposition of dead bodies has schools justifiably concerned.
Most American medical schools consider human dissection essential to the physician's education. Yet, acquiring cadavers for this purpose has never been simple. In the 18th and early 19th centuries, only the bodies of executed criminals were legally available for dissection, with the dissection itself being seen as an added punishment for the deceased.2 Because executed criminals were too few to supply America's dissecting rooms, students usually dissected bodies that had been illegally acquired—often stolen from fresh graves. This practice of grave robbing and its cultural ramifications have been thoroughly studied.2–4
Less studied, however, is the history of how bodies were procured for dissection in 20th-century America. By the early 1900s, nearly all dissected cadavers were unclaimed bodies made available to medical schools through state statutes. Since World War II, however, increasing numbers of cadavers have come from personal donations. Today, donations are virtually the sole source of America's cadaver supply.
The shift from using stolen to unclaimed to donated bodies occurred within less than a century, a speed that suggests strong social forces at work. During the 20th century, the United States underwent a population boom, an expansion of government, new legislation, demographic change, scientific developments, and a proliferation of mass media, all of which affected body acquisition. Because the way a society deals with death reflects the way its citizens live, these cultural shifts inevitably changed Americans' rituals of death. Furthermore, dissection is a practice laden with legal, ethical, and emotional charge, so change in its practice often was—and still is—complex. By examining the forces that affected the history of body bequeathal, we can appreciate how America has become not simply tolerant but encouraging of body donation.
Acquisition Before 1900
Before the early 19th century, most disease was considered a consequence of humoral imbalance, rather than being localized in particular organs. Anatomical knowledge thus seemed to have little application in the nonsurgical treatment of disease. During the early 19th century, however, many diseases came to be understood as localized processes rather than systemic disequilibria, and as the organ pathology accompanying specific diseases became better elucidated, the perceived importance of training in anatomy also grew. By the 1850s, all reputable U.S. medical schools required students to complete a course in anatomy,3 and as enrollment increased dramatically during the 19th century, the demand for cadavers rose concurrently.4
Medical students' access to cadavers at this time, however, remained compromised; the supply of legally available bodies—those of executed criminals only—proved inadequate for training the growing numbers of medical students. Consequently, both doctors and medical students resorted to disinterring bodies or hiring professional grave robbers. The thieves tended to raid local potters' fields (cemeteries where indigents were buried at public expense) because those cemeteries lacked precautions against theft and the risk of uproar over the pilfering of a pauper's corpse was small. The bodies dissected were usually those of the relatively powerless: African Americans, prisoners, and the poor.5 Moreover, the consequences of being discovered were usually minor, for there was no property right in the body itself. As Mary Roach notes, “Being caught in possession of a corpse's cufflinks was a crime, but being caught with a corpse itself possessed no penalty.”6 When bodies began disappearing from church cemeteries, however, the public grew enraged.
In 1788, tensions over grave robbing exploded in the New York Resurrection Riot, when medical students from King's College were allegedly seen waving a cadaver limb from the window of their dissection room at the New York Hospital. In response, New York passed its first law against grave robbing in 1789, officially describing the law as “[an] act to prevent the odious practice of digging up and removing, for the purpose of dissection, dead bodies interred in cemeteries or burial places.”4 This law did little to stop grave robbing, however, because it offered no suggestion about how medical schools might legally obtain the requisite corpses. Schools simply continued acquiring bodies illegally.
The impossibility of obtaining enough cadavers legally made legislative revision inevitable. Massachusetts became the first state to enact a solution—using unclaimed bodies for dissection—and, in 1830 and 1833, passed laws permitting dissection of the unclaimed dead. Over the ensuing decades, many other states passed their own anatomy acts following the example of Massachusetts, providing that the unclaimed bodies of people who died in public institutions—hospitals, asylums, and prisons—would go to the state's medical schools.7
Because burying unclaimed bodies had previously used the states' resources, the states as well as the medical schools seemed to benefit when those bodies were used for dissection. Few influential voices complained about what happened to unclaimed bodies, and a vocal minority made the point that dissection offered criminals and paupers the opportunity to compensate society for the burden they had caused during their lifetimes. As one Washington Post editorialist asked in 1887, “Why would those who have made war on society or have been a burden to it be permitted to say what shall be done with their remains? Why should they not be compelled to be of some use after death, having failed to be of value to the world during life?”8
Although anatomy acts seemed to encourage respect for dissection in the study of anatomy, the language of some acts suggested that dissection was still appropriate only for those of low status. The Massachusetts law stipulated that the unclaimed bodies of soldiers would not be dissected, implying that such men had already served the state and, thus, should not be subjected to dissection. Several anatomy acts forbade dissection of travelers who died away from home. Such a proscription might seem to allow for the delay expected in claiming a traveler's body, but the language of Indiana's anatomy act revealed another consideration, providing that a body could not be dissected “if the deceased was a traveler who died suddenly, unless he belonged to that class commonly known and designated as tramps.”9 These anatomy acts reaffirmed the association between dissection and destitution in America. Resurrectionists pillaged potters' fields before the anti-grave robbing legislation; after the anatomy acts, unclaimed bodies—which usually also belonged to the poor—were the ones to be dissected.
Although the anatomy acts regulated the distribution of unclaimed bodies for dissection, they did not clarify whether the body was considered personal property and therefore could be transferred by will for a specific purpose, such as instruction in anatomy. In fact, according to many state laws, the body was not considered property and thus could not be transferred by will. California explicitly established this principle in the 1900 case Enos v Snyder.10 John Enos, who died in 1898, was survived by his wife, Susie Enos, and his mistress, Rachel Snyder; both women wanted control over his remains. Enos' will bequeathed his remains to Snyder, yet the court ruled that Mrs. Enos, as the decedent's closest relative, would retain custody of her husband's corpse because the body was not property; thus, Mr. Enos had no right to will his corpse to Snyder.
Most states' legislation held that, although it was desirable to respect the wishes of the dead, the final decision about the disposition of the body rested with the family. As Maine's anatomy act of 1869 read, “If any resident of the state requests or consents that after his death his body may be delivered … for the advancement of anatomical science, it may be used for that purpose, unless some kindred or family connection makes objection.”9
William Osler, one of the fathers of modern medicine, obtained family consents for autopsies on many of his patients at Philadelphia Hospital to further both his own and his residents' education. However, Osler's often-aggressive pursuit of these consents provoked complaints from people who were uncomfortable with the idea of so much dissection, and thus the hospital's trustees were forced to restrict his once-unregulated management of anatomical material. Despite Osler's lawfulness and diligent acquisition of family consents, his dissections still met resistance from the public.11
The effects of anatomy laws enacted in the late 1800s lasted into the mid-1900s. Because they legalized transfer of unclaimed bodies from state institutions to medical schools, the price of illegally obtained bodies fell, and grave robbing was no longer profitable. Thus, despite their flaws, the anatomy acts succeeded in increasing America's cadaver supply and effectively ending grave robbery.
The First Donors
Although most Americans disapproved of dissection at the turn of the 20th century, even at this time a few citizens chose to donate their bodies to science. In 1899, newspapers reported that Thomas Orne, a prosperous Maryland horse dealer, had decided to donate his body.12 Orne was likely not concerned that donation would tarnish his reputation or that he would appear unable to afford a respectable funeral. Thus, for Orne and other wealthy donors, the association between indigence and dissection did not deter bequeathal.
Stories about suicidal donors also graced the newspapers' front pages, as in the Washington Post article entitled, “She Begged to be Killed and Dissected”:
It is learned that Miss Kate A. Donovan … called upon the physician in charge of the University of Maryland and offered her body to him after death for the benefit of science. He refused to listen to her, and she then besought him to kill her then and there, and to make an autopsy of her body for the advantage of all future womankind. Getting no sympathy from the doctor, she left him and went direct to the river and drowned herself.13
Although some suicides, like Donovan, seem to have donated their bodies to give meaning to their lives, others, like John Kowinski, ostensibly donated to aid those with ailments similar to their own:
John Kowinski… leaped from the twentieth to the ninth floor of the Masonic Temple to-day… [In an excerpt from Kowinski's will, he wished that the doctors] “dissect anatomically in all parts to determine the causes of the malformation that may be found in this body.”14
Both Donovan and Kowinski, despite their likely mental illnesses, connected body donation with altruism by acknowledging that doctors could indeed learn from dissection.
Doctors themselves were also relatively frequent body donors at the turn of the 20th century, for during medical school they had observed the consequences of the body shortage. In 1912, 200 New York physicians sought to overcome prejudice against dissection by publicly pledging themselves as body donors.15
Though few people defied convention by bequeathing their bodies, changes in America's welfare legislation and better health care for the poor contributed to a decline in the supply of unclaimed bodies that began in the 1930s and worsened during the next 30 years. Before the 1930s, the supply of unclaimed bodies had remained high because poor people had difficulty paying funeral expenses. In the wake of America's Great Depression, however, the public began demanding that everyone—including the indigent—receive a decent burial. Legislators responded by providing that Social Security would defray funeral costs. Other institutions, including the Veterans Administration, state and county welfare organizations, fraternal groups, and labor associations, followed this example. As Ralph Rohweder of the National Society for Medical Research noted, “In some cases, organizations [were] backed up six deep, all willing to pay for funerals.” By the 1960s, Social Security alone was paying $255 for funeral expenses for each of over 640,000 people annually.16
Body acquisition was thereby affected in three ways. First, the number of unclaimed bodies decreased because many who previously could not have afforded funerals now could. In this sense, the reforms were successful. Second, some state institutions that were mandated by law to distribute unclaimed bodies to state anatomy boards now retained those bodies, buried them cheaply, and claimed the burial money, thus profiting from the exchange. Third, the availability of funeral benefits created the phenomenon of the “curbstone undertaker,” in which an undertaker would convince a stranger to attest that he was a friend of the deceased, thereby giving the stranger a claim to the body. The undertaker then gave the stranger a small compensation in return for the burial fees collected from the government for burying the body. By the 1960s, most medical schools with body surpluses were located in states with legislation discouraging abuse of burial benefits. Florida, for example, did not offer cash to the families and friends of the dead but, rather, supplied free burial services, thus checking curbstone undertaking.17
In the early 20th century, prejudice against dissection remained high; although a few citizens chose to donate their bodies, the supply of unclaimed bodies dwindled between the 1930s and 1960s. The anatomy acts of the 19th and early 20th centuries could no longer adequately supply medical schools with cadavers.
The Funeral Industry
American funeral practices also changed substantially between the mid-19th and mid-20th centuries. As author Roul Tunley noted, the mortician's responsibilities had evolved over time:
When America was still predominantly a rural nation, funeral directors were unknown. Coffins were built by the local carpenter or bought at the country store. Funerals were held in the homes and in the churches. With the growth of cities, however, and the crowding of people into small apartments and houses where parlors were no longer large enough to accommodate coffin, family and mourning friends, the spacious funeral home made its appearance.18
But America's urbanization does not entirely explain why the funeral industry flourished when it did. Embalming, which first became popular in the mid-1800s, allowed more time to elapse between death and burial; this newly extended mourning period developed over time into the open-casket tradition. As open-casket funerals became more popular, so too did ornate coffins, lavish floral arrangements, and even makeup for the deceased, and by the 1960s, American funerals had become elaborate. The idea that embalming catalyzed the funeral industry's boom is suggested by the fact that embalming was popular predominantly in the United States and Canada, which in 1960 were the two countries with the highest funeral costs in the world.19
Americans learned of corrupt funeral industry practices, and of how excessive the nation's rituals of death had become, chiefly through three works of journalism. The first was an article, “The High Cost of Dying,” written in 1951 by Bill Davidson19 for Collier's. “All across the nation,” he wrote, “there is a rising revulsion against members of the funeral industry guilty of ‘Profiteering in Sorrow.'” He noted that in the previous 122 years, as the cost of living in America had gone up 347%, the cost of dying had gone up 10,000%. Davidson also asserted that the funeral industry had some of the country's strongest yet most inconspicuous lobbyists, which perhaps explains why “The High Cost of Dying” didn't cause more of a nationwide stir.
The funeral industry remained relatively unscathed for the next decade until another incendiary piece, an article entitled “Can You Afford to Die?” appeared in the Saturday Evening Post in 1961. The author Roul Tunley18 observed that America's annual funeral costs were far greater than the country's annual hospital costs. Neil Brown, president of the San Francisco Funeral Directors' Association, was quoted as saying, “If we Americans have a high standard of living, we also have a high standard of dying…. It's the American way.”
But the most powerful blow dealt to the funeral industry in the 1960s was Jessica Mitford's20 book, The American Way of Death. Aside from compiling her own astonishing statistics and describing sales scams used by funeral directors, Mitford also documented the prevalence of price fixing within the funeral industry. She told the story of Nicholas Daphne, who had been one of San Francisco's most prominent funeral directors until the California Funeral Directors' Association expelled him for advertising $150 funerals when others were trying to hold their charges to a $500 minimum. Lack of competition between funeral parlors allowed prices to remain so staggeringly high throughout the industry.
Mitford proposed several ways to avoid paying exorbitant funeral costs, one of which was simple: donate your body to science. Although, at the time of the book's publication in 1963, the body carried no legal property value, one could still donate his or her body if the next of kin did not interfere. Mitford's20 appendix, entitled “Donation of Bodies for Medical Science,” may have included the first comprehensive list of schools that would accept donated bodies. Her claim to have compiled the list herself suggests that this information was not readily available to the public in 1963, and her list was a milestone in the history of body donation.
The exposés of the 1950s and 1960s not only drew attention to the high costs of conventional funerals but also increased the public's awareness of alternatives. After Davidson,19 Tunley,18 and Mitford20 publicly criticized expensive funerals, body donation grew increasingly mainstream. If fear of disrespect had once prevented some from donating, fear of being victimized by disreputable funeral practices made many reconsider.
The Uniform Anatomical Gift Act
In the early 1960s, laws regarding body donation remained a confusing collection of anatomy acts, common law, and other state statutes that had been enacted over the years.16 It is difficult to say how long this muddle would have persisted had it not been for the development of transplant surgery and the consequent rise in demand for anatomical material. In 1968, the National Conference of Commissioners on Uniform State Laws approved the Uniform Anatomical Gift Act (UAGA), which sought to clarify ambiguities in laws governing anatomic material and, by standardizing laws among states, to make more bodies and organs available for transplants, research, and scientific education.21 UAGA changed donation by establishing the human body as property (thus refuting Enos v Snyder), so that a donor's wishes now superseded those of next of kin in court. By February 1972, 48 states had adopted UAGA, and today every state has enacted some version of it.22
Despite UAGA's successful promotion of body donation, other changes combined to make fewer full cadavers available to medical schools: increasingly frequent transplantation, the resulting demand for organs, and the shift of media attention away from willed-body programs. Once organs are removed from a body, that body is no longer suitable for educational dissection. Medical schools agree that when viable tissues are available from a donor, people needing organs take priority over medical schools needing cadavers (A. Dalley, personal communication, November 17, 2003). Body donors and organ donors, however, differ in age—the upper age limit for organ donation is 65, whereas today the average age of cadavers is 84 (K. Petersen, personal communication, November 3, 2003). Therefore, body- and organ-donation programs rarely compete for the same bodies, though organ donation as a cause overshadows body donation in the public consciousness. People who are ineligible to donate organs may not realize that body donation remains an option. Although transplantation has been hugely successful, its success has adversely affected whole-body procurement in America.
Regional differences in donation patterns suggest that factors other than transplantation have also affected America's cadaver supply in the latter half of the 20th century. States with a high ratio of medical schools to the general population have more difficulty finding adequate anatomical material than do states with larger populations and fewer schools. More prestigious schools also tend to receive donations more readily than do less prestigious schools. Other factors that seem to influence donation include religion, culture, income, age, and education.
Despite the decrease in unclaimed bodies, the media focus on transplant surgery, and the persisting need for cadavers in many states, body donation eventually satisfied America's demand. There are several possible explanations for the stabilization of willed-body programs toward the end of the 20th century.
Changes in religious beliefs and practices may partly explain the rise in body donation. In the past, many major religions disapproved of dissection, but most have grown receptive to dissection and organ donation. Less emphasis is placed on the sacredness of the inviolate body than before, so the desire for burial after dissection dissuades fewer people from donating. In England, the rise in body bequeathal was accompanied by a concomitant rise in cremation, implying that the intact corpse became less spiritually significant.2
Dissection evokes less fear today than 50 years ago. Potential donors no longer worry that society will brand them paupers, criminals, or suicide victims if they choose to donate, and they are reassured that medical students will treat their bodies with more respect today than was true 100 years ago. Until recently, professors often encouraged students to use humor as a coping mechanism for dealing with dissection; professors now demand respect for the dead in and outside the anatomy laboratory.6 Most medical schools hold memorial services for donors in which students share their anatomy experiences with other students, professors, and sometimes even donors' families.
Today's willed-body programs are mostly doing well. Many workers in the field say that they have not seen a shortage in recent years, even though, as of 1996, America required between 12,000 and 15,000 cadavers annually for teaching and research.23 Despite this current stability, many schools have begun instituting new teaching initiatives, such as body plastination (to allow repeated use of donated material), holographic display systems (to help students visualize anatomy in three dimensions), and virtual reality computer modules (to simulate the dissection process).24 Implementation of these modalities may suggest that schools are removing dissection from the gross anatomy curriculum, but in fact very few schools have made such a drastic change, and most of those that did ultimately reinstated dissection. These newer teaching tools are considered by most to be supplements rather than substitutes for dissection.25
Little information is available about donors' characteristics. Although some institutions have collected their own demographic data, there are no national studies about body donors in the United States. The Northeastern Ohio Universities College of Medicine comprehensively analyzed 1,267 donor applications submitted between 1978 and 1993 and found proportionally more women than men among potential donors, possibly because more women than men live into old age. The vast majority of applicants, and of the population in which this study was conducted, was white. The only characteristic distinguishing this group of body donors from the national population was education: 40.5% of applicants had completed high school, and 17.0% had graduated from college, compared with 36.6% and 8.5%, respectively, in this age group nationally.26
People familiar with body donation note that doctors frequently donate because they have learned firsthand the value of cadavers, and doctors are often expected not only to donate their own bodies but also to convince others to donate. Since the 1950s, California has encouraged its physicians to discuss donation with patients. Although some doctors may feel uncomfortable discussing donation, many sick or older patients may find comfort in knowing that they can be of service after death. In regions where doctors hesitate to have such discussions, cadaver shortages persist.27 Some states with body shortages have enacted required request laws, which instruct all physicians to ask families for permission to use the bodies of their recently deceased relatives. Required request laws doubled New York's organ supply in just one year; although no conclusive data show the legislation's effect on the state's cadaver supply, a similar rise is likely.28
Unfortunately, many people learn about willed-body programs not from their physicians but through highly publicized scandals. In 1977, the George Washington School of Medicine issued an apology for asking a man to pay the school to accept his wife's body.29 In 1986, Philadelphia physician Dr. Martin Spector was caught shipping body parts out of state; postal workers noticed a leaky parcel and opened it, only to discover a collection of human heads.28 In 2003, an employee of the University of Texas Medical Branch was fined $18,000 for selling fingernails and toenails from donated cadavers.30 In 2004, Henry Reid, Director of UCLA's Willed Body Program and recipient of the National Funeral Directors Mortuary Science Award, was charged with grand theft for personally selling $704,600 worth of human cadavers that had been donated to the school, and this widely publicized scandal prompted many to reconsider their previous decisions to donate.31 And, in 2007, the public still awaits the outcome of cases pending against body broker Michael Mastromarino.1
America's cultural landscape changed dramatically over the course of the 20th century; as a result, body donation has become an accepted practice. Yet, public support of donation is a relatively recent, and possibly precarious, phenomenon. As new gruesome tales of body stealing appear in the headlines, willed-body programs may face a mistrustful public.
Because dissection continues to be part of medical education, the need for bodies remains high. Yet, despite lessons of the past, abuses persist. These recent transgressions remind us that the story of American body acquisition is by no means complete.