Musings of a Cancer Doctor
Wide-ranging views and perspective from George W. Sledge, Jr., MD
Monday, September 12, 2016
I offer, for the edification of my readers, a few thoughts I've had about cancer research, garnered from 3-plus decades in the lab and the clinic. I cannot claim originality, as some of these are flat out steals, and I suspect the rest probably have been thought of by others. But I offer them in the hope that they may be of some interest.
1. Ideas are cheap. Work is hard.
I worked with Larry Einhorn for a good part of my adult life. Early on, whenever I would go somewhere to give a talk, one of my hosts would claim that he (my host) had originated the idea of using platinum-based therapy for testicular cancer, and it was only by chance that Larry had gotten there first with his PVB regimen. Later on, I met an engineer who claimed to have come up with the idea for the personal computer before Jobs and Woz, only to be stymied by his corporate bosses. I would add that, whenever I have had a brilliant idea, I have only to open a scientific journal to find it in print. Ideas are cheap, and frequently "in the air"—at least in retrospect and in popular memory—but work is hard. PVB and Apple were the result of a dose of luck, and some smarts, but mostly lots of hard work done at the right time.
2. It's OK to sleep with a hypothesis, but you should never marry one.
This one was shared with me by an astute, if crude, laboratory researcher; it has been attributed to Sir Peter Medawar, the Nobel laureate immunologist. We sometimes think it heroic to battle for a scientific hypothesis that everyone else disagrees with. I call this the "They said Thomas Edison was crazy" phenomenon. But my experience with the few great researchers I have known is that they are thoroughly unsentimental about their hypotheses. They try out one after another, test them as rapidly as possible, and move on if they are wrong. And even when they are right, the great ones are serious pruners, clipping off the unneeded and the extraneous until the truth is revealed in all its glory. A wise variant of this maxim comes from Darwin's bulldog, Thomas Huxley: "Another beautiful hypothesis murdered by a nasty, brutish fact."
3. A good drug makes everyone look smart.
A decade ago, melanoma doctors were considered lower life forms by breast cancer doctors. Now breast cancer doctors have serious melanoma envy. Melanoma doctors have not gotten smarter, nor have breast cancer researchers voluntarily undergone frontal lobotomies. Checkpoint inhibitors and vemurafenib came along, serving as chemoattractants for the fresh talent flooding the field. New England Journal of Medicine papers started popping up every 2 weeks. Lots of great science, too, but a good drug makes everyone look smart.
4. No one is smarter than a phase III trial.
A few years ago, I was chair of the ECOG Breast Cancer Committee. We had just completed E2100, our phase III metastatic breast cancer trial of anti-angiogenic therapy, with a doubling of progression-free survival. It being the height of the "wisdom of crowds" fad, I polled the committee, asking them to predict the results of our just-started adjuvant trial. Ninety-four percent of the attendees felt it would be a positive phase III trial. I was part of that crowd. With such an overwhelming vote, why even bother to conduct the study? Well, because…we were not as smart as a phase III trial. We seriously didn't have a clue.
The other common variant on this phenomenon occurs when the phase III trial is positive. Almost immediately, individuals start individualizing: altering doses and schedules, substituting drugs, changing duration of therapy. These alterations lead to either increased toxicity or decreased efficacy or both. Alternatively, we discount the results of the trial because it differs from our personal experience or biases. We think we are smarter than a phase III trial. Repeat after me: no one is smarter than a phase III trial.
5. Scientific papers are the first rough draft of scientific history.
With apologies to The Washington Post's Ben Bradlee, of Watergate fame, to whom the original version of this maxim is attributed, minus the "scientific." A great research finding is rarely the end of the story.
More often it is the middle, and sometimes only the beginning of a new trail of discovery, a jumping-off point. I have had numerous colleagues, most smarter than I am, who fell into "analysis paralysis," trying to create the perfect scientific paper, one for the ages. They usually get scooped by the guy writing "the first rough draft" of scientific history.
6. Quantity has a quality all of its own.
The scientific literature is mostly crud. This is true both in the lab and the clinic. Lack of reproducibility is endemic. One of the major reasons for this is the curse of small numbers: cell line experiments that use only one cell line (rather like treating a single patient and expecting to discover a universal truth), animal experiments with a few lonely mice, clinical trials with suspicious "descriptive" statistics. Small numbers studies are particularly prone to overcalling results, and we, lemming-like, follow them over the cliff in our ever-recurring credulity.
Last year, I heard a presentation where a new combination therapy was said to have an 89 percent response rate. Translated into English this meant that, in a preliminary, unaudited analysis, eight of nine patients had shown initial signs of response. The next time I heard the study presented (more patients, longer follow-up, external review of results) the response rate had dropped to 54 percent. Still respectable, but not the second coming of the Salk vaccine. Quantity has a quality all of its own.
7. Biology is destiny.
Having lived through over 3 decades of research, I have seen many promising therapies come and go. The ones that have stayed have almost always been biology-based. This was not obvious for the first half of my career, a period dominated by the pursuit of ever longer acronyms, ever-higher doses, and application of therapies based on "unmet medical need," a euphemism for "we can sell a lot of drugs for this bad cancer." Though "Toothpaste A + Toothpaste B" empiricism regularly rears its ugly head, the last 15 years have seen the long-overdue triumph of biology in cancer medicine. By the way, this isn't because we were stupid then and we are smart now. We have better tools, and hard-won knowledge. We're still quite capable of stupid.
8. Biology is messy.
If biology is destiny, that doesn't mean it is either easy or clean. Biology is gloriously messy. We regularly get confused. Like the drunk who uses the lamppost for support rather than illumination, we latch on to some biologic story, holding on for dear life, often long after the story is shown to be incomplete or even flat out wrong. Even with a "clean" drug like trastuzumab for HER2-positive breast cancer, we spent the better part of 2 decades arguing over exactly how it works. This doesn't bother me a bit. I love messy. We're cancer biologists, not physicists, after all. Not everything has a single, simple explanation. Remember Walt Whitman: "Do I contradict myself? Very well, then I contradict myself, I am large, I contain multitudes."
9. Knowledge isn't power.
I know, Francis Bacon is rolling over in his grave right now. But knowing isn't enough unless you know enough. We've known that KRAS is really important in pancreatic cancer for decades, and as far as I can tell it hasn't improved anyone's overall survival, because we haven't been able to shut KRAS down outside the laboratory. Knowledge, let me repeat, is not power: successful application of knowledge is power. Take BRAF in melanoma. Remember sorafenib? The drug that was going to change the face of metastatic melanoma? But vemurafenib isn't half bad (well, only 49% bad, since it doesn't work all that long, but who's counting?). Or PARP inhibition for breast and ovarian cancer: remember BiPar's iniparib, which wasn't really a PARP inhibitor? Bad tools won't test the hypothesis, but that doesn't mean the hypothesis is wrong. A different Bacon quote is more apt here: "Nature to be commanded must be obeyed."
10. Nice drugs get beat up in bad neighborhoods.
I cannot tell you how many times, over the years, I went to a cooperative group meeting, or an advisory board, and heard a proposal to try a new drug in fifth-line therapy for this disease, or the same drug in that disease, where absolutely nothing ever works. In the former case, the usual explanation is that we have all of these (lousy, inadequate, toxic, borderline useless FDA-approved historical accidents of) drugs that we have to try first. In the latter case, "unmet medical need" is the excuse for hoping that a drug will magically transform a disease for no other reason than we want it to do so. Hope is not a strategy. Hard targets are hard for a reason. Nice drugs get beat up in bad neighborhoods.
I could go on, but 10 is a nice number, as Moses understood. And I've run out of fingers.
Friday, July 8, 2016
One of the more amazing results of the genomic revolution has been the window it opened on our past. It is only a few years since the discovery of Neanderthal genes in the modern Eurasian genome, as well as the mysterious Denisovan genes present in modern Melanesian islanders. For most of my life, Neanderthals were hulking cave men, long extinct, their very name a term of abuse. Now, every morning when I shave, a Neanderthal face stares back at me from the mirror. My lack of photogenicity finally has a rational explanation.
But it turns out there is a great deal more. The last year has seen an explosion in genomic studies illuminating our past, all quite fascinating, and certainly unexpected.
Let's start with an obvious Charles Darwin sort of question. Neanderthals and our main line ancestors split apart half a million years or so ago. They met again on several occasions, and those happy reunions resulted in our current 2 percent or so of Neanderthalism. The 2 percent, by the way, appears to be down from 6 percent back in the day, if we look at ancient human genomes.
Are our legacy Neanderthal genes randomly distributed throughout our genome, or have they been selected for? No question, and no surprise: selected for.
A History of Genes
Let's start with the Y chromosome, that puny little thing, that living rebuke to Freud's concept of penis envy. It has an important tale to tell, for like the mitochondrial genome in women, it provides an easy gender-specific snapshot for men.
The modern human genome has no Neanderthal Y chromosome genes. None whatsoever, as far as we can observe. They were eliminated. This could represent random loss over time, but a recent deep dive into the Neanderthal Y chromosome suggests a probable culprit for this ethnic cleansing. The Neanderthal Y chromosome contains three minor histocompatibility antigens capable of triggering an immune response known to contribute to miscarriages.
This work adds to earlier data suggesting so-called "genomic deserts" for Neanderthal genes on the X chromosome and for genes governing testicular function, painting a picture of what is known as "hybrid infertility." Humans and Neanderthals lived right on the edge when it came to prospects for interbreeding.
What did we get from our Neanderthal ancestors? A nasty bunch of disease predispositions, for one thing. If one looks at single nucleotide polymorphisms (SNPs) derived from them, and then crosses those SNPs with the electronic health records of 28,000 adults of European descent, one sees sizeable increased risks for depression, actinic keratosis, hypercoagulable states, and tobacco addiction. I have this weird vision of a moody, cigar-smoking, warfarin-ingesting, bearskin-clad hulk with a bad complexion hunting wooly mammoths on the Eurasian steppes.
By the way, the paper that showed these things is quite wonderful, linking the electronic health records of 28,000 adults across numerous medical institutions to large SNP databases for those individuals to the Neanderthal genome. It is CancerLINQ for Neanderthals. The ready transportability of health data and its cross-linking with genome datasets for research purposes suggests they were not using the dominant electronic health record system found in the San Francisco Bay area.
On the plus side, Neanderthal genes appear to provide us protection against parasites. Our modern genome gets several members of the Toll-Like Receptor family (TLR1, TLR6 and TLR10) from our Neanderthal ancestors, and these immune receptors are valuable activators of our adaptive immune responses. There's a growing literature on Toll-Like Receptors' roles in antitumor immunity, so perhaps one day soon we'll enroll Neanderthal cell surface receptors in the fight against cancer. Neanderthal genes also affect our skin: if someone says you are "thick-skinned," your Neanderthal traits are being praised.
Speaking of the Y chromosome, one recent study looked at the modern Y chromosome in 1,200 men from 26 modern populations. All living men descend from a man living 190,000 years ago, the male equivalent of "mitochondrial Eve." This is unsurprising. What was of interest was evidence for male population explosions on five continents at different times over the past 100,000 years, with rapid expansion of specific male lineages in just a few generations.
Half of all men of Western European origin, for instance, descend from just one guy living 4,000 years ago. The authors of the Nature Genetics paper tried to imagine why this might have occurred: introduction of a new technology related to the agricultural revolution, perhaps. I'm doubtful that my ancestor proto-George (from the Greek Γεωργιος (Georgios) which was derived from γεωργος (georgos) meaning "farmer, earthworker") was some peaceful, humble farm boy who invented the plow. Far more likely that he was some prolific thug with a gang, or whatever passed for a gang in 2,000 BC in France or Germany. Genghis Khan before Genghis Khan, lost to history but remembered in our genes.
Tracking Viral DNA
Neanderthals aren't the only interlopers in the modern human genome. Let's look at our viral genome. One of the great discoveries associated with the Human Genome Project was the large amount of viral DNA incorporated into human DNA, fully 8 percent of the whole.
For quite some time, this book was shelved under "Junk DNA" in the genome library. But no more. Work published in Science shows that the viral DNA (from so-called "endogenous viruses") is not randomly inserted in the human genome. Rather, it clusters, and clusters around those parts of the genome associated with immune function. Here's where the story gets really interesting. Our genomes appear to have co-opted the viral invaders for the purpose of—you guessed it—fighting off other viruses.
When the authors of the paper used CRISPR/Cas9 technology to snip out endogenous viruses close to the AIM2 immune gene, it no longer responded effectively to the interferon "alarm signal" released in response to a viral infection. Immune function suffered. Endogenous viruses are the family dog who protects villagers from marauding wolves, barking loudly whenever their genetic cousins come near.
I find this thought pleasant, and somehow comforting: we've been assimilating unwanted immigrants into our chromosomal society far longer than humans have existed. Not only are endogenous viruses not "junk DNA", they are valuable, well-behaved, even life-saving first responders.
What really amazes about all this is how much we learn about our history sitting in a laboratory. Our DNA is a time machine, traveling to places no historian can reach. Dig up some old bones from a cave in Spain or Siberia and wonderful mysteries are revealed. Those old bones still matter, eons after our ancestors shuffled off this mortal plane.
One last look at the Neanderthals before we go. Trying to get inside the heads of extinct ancestors is a dangerous enterprise, but enjoyable nevertheless. Recently, far inside Bruniquel Cave in southwestern France, investigators came across a fascinating structure. The cave's occupants had broken off some 400 stalagmites and then rearranged them in a circle. These circles show signs of fire use, with soot-blackened, heat-fractured calcite and bone remnants. Using uranium-thorium dating techniques, the circles date to 176,500 years ago, long before Homo sapiens entered Europe. Neanderthal spelunkers created the structures.
What were they doing there? Did the stalagmite circles have some ritual significance, some religious meaning? Or did the Neanderthals, like modern humans gathered around a flat screen TV, just like to have a family room where they could kick back and enjoy themselves after a long day hunting bison and woolly mammoths? We'll never know, but it makes me think we might have understood each other.
Friday, May 27, 2016
One of the delights of science is how captivating the results can be. About 1.3 billion years ago, and incredibly far away, two black holes spiraled together, collided at half the speed of light, coalesced into a single black hole, and in that joining three suns worth of mass were turned into energy in the form of gravitational waves. In September of last year the sound of that collision (literally a sound: you can hear it at https://www.youtube.com/watch?v=TWqhUANNFXw) was recorded by the Advanced Laser Interferometer Gravitational-Wave Observatory's (LIGO) parallel scientific instruments in Washington and Louisiana. The waves reached Louisiana 7 milliseconds before Washington, suggesting a location in the Southern hemisphere.
The result, predicted 100 years ago by Albert Einstein, simultaneously proves the existence of black holes and launches gravitational astronomy as a new field of scientific endeavor.
Gravity is both incredibly weak and impressively strong. It is the weakest of the four fundamental physical forces in nature, some 38 orders of magnitude weaker than the strong force, 36 orders of magnitude weaker than the electromagnetic force, and 29 orders of magnitude weaker than the weak force. But at the macroscopic scale it dominates, affecting the trajectory of heavenly bodies and of humans walking around here on earth. Try jumping into the sky and see how far you get.
What we know about gravity results largely from the scientific work of the two smartest humans who ever lived, Isaac Newton and Albert Einstein. Newton discovered the inverse-square law of universal gravitation, summarized in this equation:
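In its familiar modern notation, with F the attractive force between two bodies, m1 and m2 their masses, r the distance between them, and G the gravitational constant:

```latex
F = G\,\frac{m_1 m_2}{r^2}
```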
Put another way, Newton's law of universal gravitation states (and here I quote Wikipedia) that any two bodies in the universe attract each other with a force that is directly proportional to the product of their masses and inversely proportional to the square of the distance between them.
Before Newton, gravity meant something different: something serious. But the term derives from the Latin "gravitas," which means weight, or heaviness, and this was the meaning Newton called upon. Newton's law does quite well for predicting most of what we experience, and its application to our solar system led to the discovery of Neptune in the 19th century.
This worked well enough for two centuries, until astronomers found it could not explain the orbit of Mercury. This discrepancy between theory and fact led Einstein to think about gravity, and to his field equations, principal among them:
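In their standard tensor form, the field equations relate the curvature of spacetime to its matter and energy content:

```latex
G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu}
```

Here G with subscripts mu-nu is the Einstein curvature tensor, Lambda the cosmological constant, g the metric tensor, and T the stress-energy tensor.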
I do not remember any of this from my sole semester of college physics in the early 1970s, but Wikipedia assures me these are true equations. I do remember the thought experiments that represented the beginning of the two great theories attached to these great discoveries: Newton's apple, Einstein's elevator.
One important consequence predicted by Einstein's equations is the existence of gravity waves, caused by distortions in the space-time continuum around large masses. Einstein himself was uncertain that such waves would ever be detected, or even whether they existed. But now, thanks to LIGO, we are certain they are real.
Gravity & Health
I don't usually think about gravity when I see my patients in clinic, though like many things I don't think about, gravity is always there in the background. It weighs us down.
Recently, astronaut Scott Kelly came back from a year spent at the International Space Station some two inches taller than when he had left. His spinal disks, not as weighed down, expanded, though they reverted as gravity reasserted itself. Kelly is one of a pair of twins, allowing NASA scientists a mini-randomized controlled trial of low-G effects on human physiology.
Kelly might end up shorter than when he left Earth, and certainly weaker. Astronauts lose muscle mass and bone density. Spend 6 months off-planet and you will lose 10 percent of your bone mass. Even the heart shrinks a bit while in orbit, leaving the returning astronaut faint with exertion. We evolved living at the bottom of a gravity well. Sit on top of the well and molecular adjustments occur. NASA worries about these things, as it would like to send astronauts to Mars some day.
Low gravity's molecular effects aren't perfectly well understood. That old biological workhorse (workworm?) Caenorhabditis elegans has been compared on ground and in space. One study saw down-regulation of numerous genes, whose cumulative effects prolong C. elegans lifespan. Another saw decreased accumulation of toxic protein aggregates in aging worms' muscles. Space was made for worms searching for immortality.
There is a fascinating plastic surgery literature on gravity. Here's a fun little experiment reported by two groups (Aesthet Surg J 2014;34:809-822, and Int J Cosmet Sci 2015;37:291-7). Take patients who show signs of midface aging and have them either stand up or lie down. Take pictures while smiling and in repose. Compare "brow position, tear trough length and depth, steatoblepharon, cheek volume, malar bags/festoons, and nasolabial folds." (Steatoblepharon, by the way, is "the prolapse of fat from the orbit of the eye below the eyelid." A festoon is a chain or garland of flowers, hung in a curve as a decoration, unless you are a plastic surgeon, when it refers to significantly damaged skin in the lower eyelids.)
Then, because perception is as important as measurement if you are a plastic surgeon, have someone estimate their age by looking at the pictures. What happens?
The supine patients measure younger, and look younger. Indeed, if you are over the age of 55, and you want to look younger, never get out of bed. The simple act of lying down makes you look six years younger, on average. Gravity redistributes interstitial fluids, subtly contorting our faces. Remember that the next time you visit a funeral home and hear the magical words "doesn't he look natural?" No, he doesn't. It's just gravity. We sag when we stand. I'd rather sag, thank you very much, as long as I can keep standing.
This redistribution of interstitial fluids occurs in space as well. Fluids no longer pool in the legs, but in the upper body. The body's blood volume sensors are in the upper body, so red blood cell production declines. Anemia is yet another reason to faint when you get back to Earth.
Cancer in Space
Gravity is a boon to plastic surgeons. What about cancer?
When I treat a postmenopausal breast cancer patient with aromatase inhibitor therapy, I routinely discuss the bone loss that results from estrogen deprivation. How does one prevent bone loss? I tell my patients bone is laid down along lines of stress, and that weight-bearing exercise is among the best things they can do to prevent osteoporotic fractures. Though I never mention gravity, I silently invoke its inexorable force.
NASA has spent some small fortune performing experiments with cancer cells in space (examined in Nature Reviews Cancer 2013;13:315-27). Under microgravity conditions, cancer cells form spheroids, sometimes quite large: in one case, golf ball-size organoids of prostate cancer cells that far outpaced their earthbound cousins. Gliomas, in contrast, roll over and die in space, for reasons no one understands.
Virtually all cancer cell types undergo major structural changes in microgravity. Tubulin is assembled into microtubules by gravity-dependent reaction-diffusion processes: get rid of gravity, and the microtubules change their orientation. Would Taxol work differently in space?
So far, I've seen no rush to set up an orbital outpatient oncology clinic. Astronauts do not appear to have a higher rate of cancer incidence or mortality than the rest of us, though the numbers are small and the comparisons highly confounded from a statistical standpoint. But there is, apparently, a thriving trade in pharma-driven space experiments, with a start-up company called Space Tango contracting with around 50 customers this year to perform contract microgravity experiments in tissue-box-size automated labs.
Meanwhile, back on Earth, my ears are peeled for the next echo of two black holes colliding in some distant galaxy. Can you hear it? Let's listen together.
Monday, January 11, 2016
Recently I learned of the death of Frances Kelsey. I was somewhat surprised, not that she had died, but in the “I didn’t know she had lived that long” sort of way one has sometimes in reading the obits. Kelsey had become a national figure when I was in my early teens, at a time when John Kennedy was president, and I had not given her a second’s thought in decades.
Kelsey’s story is inextricably linked to that of thalidomide. It is an old story, and its outlines are familiar to most, but let me share it with you in case you missed it or misplaced the memory. In the 1950s thalidomide was developed as a sleeping pill and was used as such in Europe. It was particularly valued for its benefits in pregnant women. Rights to thalidomide had been sold in the United States to the William S. Merrell Company.
FRANCES KELSEY, MD, PHD
Kelsey was, at the time, a brand-new staffer at the Food and Drug Administration. Prior to this she had performed research at the University of Chicago, where she had earned both a PhD and an MD degree. When she applied to the PhD program at the University of Chicago, the acceptance letter was addressed to “Mr. Oldham” (she had been born Frances Oldham). “When a woman took a job in those days, she was made to feel as if she was depriving a man of the ability to support his wife and child,” she later told an interviewer. Fortunately, her McGill University thesis professor said, “Don’t be stupid. Accept the job, sign your name and put ‘Miss’ in brackets afterward.” She did.
The First Task
Arriving in Chicago in 1936, she was set to work identifying the toxic agent in a batch of sulfanilamide. Initially entering the market in tablet form, the antimicrobial had been converted to liquid form for intravenous use. This new compound had promptly killed 107 patients, many of them children. The modern FDA is a stickler regarding drug formulation, and this dates to the antifreeze-like industrial chemical that had been included in the intravenous sulfa. In 1938 Congress passed the Food, Drug and Cosmetic Act in response to the tragedy.
As a new FDA staffer, Kelsey was tasked with evaluating thalidomide. She would later say that she had been assigned the evaluation not because it was considered a difficult case, but because it was a simple one—a drug already on the market, with thousands of patient-years of experience…a slam-dunk approval, a no-brainer.
Except that Kelsey was meticulous, and had a brain. Poring through the documentation the company submitted, she was struck by how little real information Merrell had provided. The clinical data supplied involved little more than doctor testimonials, which Kelsey later characterized as “too glowing for the support in the way of clinical back-up.”
While the evaluation was ongoing, she noticed a publication in the British Medical Journal suggesting that thalidomide was associated with peripheral neuropathy. Merrell’s packet to the FDA had failed to mention peripheral neuropathy, immediately alerting Kelsey that something might be wrong. In those days, astonishingly, drugs were automatically approved by a certain date unless the FDA specifically disapproved them. Such approval by default seems amazing today, but those were different times. Then, as now, no one at the FDA wanted to disapprove a drug without sufficient cause.
The out was that the FDA could put an approval on hold if it requested additional information. Kelsey invoked this rule. The Merrell representative, after much grumbling, admitted that yes, there had been some reports of peripheral neuropathy, but the company felt the drug was not to blame, that it was probably related to poor nutrition, and it wasn’t all that big a deal in any event.
Kelsey still demurred and demanded real data. She and her colleagues at the FDA were particularly interested in the effects of the drug on the fetus, given that a pregnant woman might take the drug for months. Merrell reported that it did not know of any problems with the drug in pregnancy, but had not conducted a study. The company was anxious to get the drug on the market, and hounded her for a quick approval. It also pressured her superiors and threatened to go to the Commissioner of the FDA. Kelsey stood firm.
But peripheral neuropathy was the least of thalidomide’s problems. While Kelsey awaited more data, reports began to emerge from Europe tying the agent to birth defects in the babies of women who had taken it while pregnant. The famous Hopkins cardiologist Dr. Helen Taussig (the heroes in this story are women) heard of the issue and traveled to Europe to investigate. Babies there were being born, in the thousands, with flipper-like arms and legs, a condition known as phocomelia. The drug also significantly increased the rate of miscarriage. When Taussig came home she met with Kelsey, and later testified about thalidomide in front of a House committee considering strengthening drug approval laws.
In 1962 then-president John F. Kennedy awarded Kelsey the President's Award for Distinguished Federal Civilian Service. At the award ceremony he testified, “Her exceptional judgment in evaluating a new drug for safety for human use has prevented a major tragedy of birth deformities in the United States.” Kelsey, ever modest, consistently maintained that she received the award on behalf of a team. She did not think she had done anything exceptional.
More to the Story
Kelsey’s story did not end there, though that is as much as most people have ever heard of her. The outcry over thalidomide, once the story broke, led to passage of the Kefauver-Harris Amendment in 1962 to strengthen drug regulations. The bill had languished in Congress for six years prior to the thalidomide revelations.
To get some sense of how different a time 1962 was from today bear in mind that at the Senate hearing for the amendment, Senator Jacob Javits could ask the question, “Do people know they are getting investigational drugs?” The answer was “no.” Many of the U.S. participants in thalidomide trials were unaware they were receiving a drug that had not been approved by the FDA. Thalidomide changed that. The requirement for informed consent dates to Frances Kelsey. She became the first chief of the Investigational Drug Branch and created the modern process for new drug testing.
Fifteen years ago I sat on the FDA’s Oncology Drugs Advisory Committee, or ODAC. There I developed a deep and abiding respect for FDA staffers as dedicated public servants with an essentially thankless job. If the FDA approves a drug and later has to pull it from the market for reasons of some unexpected and rare but serious toxicity (think: COX-2 inhibitors), they are pilloried for not having done due diligence during the regulatory process and are accused of being stooges for the pharmaceutical industry. If, on the other hand, they delay drug approval, or reject a drug, the Wall Street Journal publishes an editorial titled—and this actually happened—“FDA to Patients: Drop Dead!” Basically, they cannot win.
I have had my own gripes with them over the years: lack of consistency and (on occasion) a seemingly poor understanding of biology. But that does nothing to diminish my respect for the FDA medical officers I have met, serious professionals who could be earning multiples of their government salary by crossing the street into industry. They are proud of Kelsey’s legacy, as they should be.
I suspect—though I do not know this to be the case—that they may sometimes invoke Kelsey's memory to justify obstructionist behavior, just as some newspaper reporters consider themselves Woodward and Bernstein's progeny whenever they publish some minor exposé. But I would far rather live in a world where public servants protect the public good, just as I want a world full of investigative journalists to counter the influence of the rich and powerful.
Frances Kelsey, a tireless public servant, worked until she was 90 and died at the age of 101. She had two children, an accurate count only if one omits the thousands of babies born with two arms and two legs because of her courage.
Tuesday, December 15, 2015
Nobel Prize season finished last week with the Awards Ceremony on Thursday, and once again the Swedes have failed to recognize my many contributions to peace, medicine, literature or physics (OK, the last one is a bit of a stretch). I watched my cell phone for days, but no phone calls from country code +46. What is with those guys?
But as happens every year, a Nobel Prize has an interesting story, and a back-story behind that story. And this story has to do with malaria and war.
The prize for Physiology or Medicine (its quaint name) went to three investigators, united by the common theme of tropical disease. Satoshi Ōmura and William C. Campbell received half of the prize "for their discoveries concerning a novel therapy against infections caused by roundworm parasites," and the delightfully named Dr. Youyou Tu received the other half for her discovery of the antimalarial drug artemisinin.
It was the malaria story that caught my attention. Dr. Tu received the prize for the discovery of artemisinin. During the Vietnam War, our Vietnamese opponents suffered horrendously from malaria. Chairman Mao, a fan of traditional Chinese medicine, encouraged studies of Chinese herbs as treatment for the disease. Dr. Tu examined the Chinese herbal tradition and found that compounds derived from the sweet wormwood plant could treat the disease in mice. She then isolated artemisinin from the herb, and an exceptionally important antimalarial was born. Today over half of malaria worldwide is chloroquine-resistant. Artemisinin is the lifeline for patients suffering from resistant malaria.
As an undergraduate I had a classmate who had served as an infantryman in Vietnam. He had come home with what I would now recognize as quartan malaria, with recurrent fevers and chills and just the most miserable feeling on earth. He had that far-away look of someone who had spent too many days seeing things 19-year-olds shouldn't see. But it was the malaria that dragged him down.
Patients often ask what I think of complementary and alternative medicines. My answer is always the same: whatever works. I prescribe drugs derived from mold, tree bark, and sea sponges. It isn't the origin of the drug that matters. If it passes rigorous scientific tests (so-called "Western medicine") demonstrating clinical utility, I am happy to prescribe it. The scientific method doesn't care if you are Chinese or American, nor whether your drug is a natural product or a synthetic chemical. It doesn't even care if it is a drug.
But the artemisinin story interested me for other reasons. The first was the long history linking tropical medicine, particularly malaria, and the Nobel Prize. One of the very first Nobel Prizes for Medicine (in 1902) went to Dr. Ronald Ross for his discovery of malaria's transmission via the mosquito. Five years later, Dr. Alphonse Laveran received the prize for his discovery of the malarial protozoan. Both the Englishman and the Frenchman served their countries' overseas colonial empires. In 1897, the year Ross published his work, approximately one-third of British troops in the Indian Raj were incapacitated by malaria.
Malaria was endemic in the American South for much of the first half of the 20th century. While common wisdom holds that the problem was solved through liberal dousing with DDT (another outcome of war), in fact the disease began its retreat during World War I. With large numbers of American soldiers receiving basic training in Southern cities, the U.S. Government was concerned that they would contract malaria before shipping overseas to France.
The government's response was to set up so-called extra-cantonment zones, areas within which it assumed responsibility for public health. Mosquito eradication, the techniques for which had been pioneered in the Spanish-American War and its aftermath, was used in a widespread fashion for the first time in the American South, considerably reducing malaria incidence and setting the stage for the subsequent New Deal-era near-eradication (pre-DDT) of malaria in the United States.
For anyone interested in this story, read the definitive account by Daniel Sledge, my political scientist son, of whom I am very proud: “War, Tropical Disease, and the Emergence of National Public Health Capacity in the United States”, Studies in American Political Development 26:125-62, 2012.
DDT also earned its discoverer, the Swiss Dr. Paul Müller, a Nobel Prize in 1948. The Swiss shared Müller's discovery with the United States in the midst of World War II, and the U.S. military rapidly introduced it into war zones, dramatically reducing deaths due to malaria, typhus, and the panoply of other insect-borne diseases, not just for soldiers but for civilians.
But the larger back-story is the connection between war and scientific progress in general, and medical progress in particular. Alfred Nobel himself personifies this connection. Nobel, as is well known, made his riches through the creation of dynamite, which he hoped would be used for peaceful purposes only. His subsequent recognition of, and his horror over, the co-option of his discovery for lethal military purposes led to his creation of the Peace Prize.
Alexander Fleming's discovery of penicillin languished until World War II, when the Oxford scientists Howard Florey and Ernst Chain rediscovered it and brought it forward to treat wounded soldiers, their work supported by infusions of cash from the British and American governments.
One can make too much of these connections, of course. No malaria patient cares that artemisinin came out of a nasty jungle war. No one in a doctor's office getting any of penicillin's follow-on antibiotics cares that the antibiotic revolution was a byproduct of history's bloodiest war. They just want to be treated and cured, and not re-infected.
But it seems inescapable that the link between medical progress and war is a real one. Trauma medicine had its origins in military surgery, and even the da Vinci robot so prized by urologic surgeons caring for prostate cancer patients had its beginnings in a DARPA contest to create a battlefield-ready automatic surgeon.
And cancer patients, as we all learn early in our training, owe a debt to war: the first effective chemotherapy agents were discovered as a result of a ship full of poison gas blowing up in Bari harbor in 1943. In short order the lymphopenic sailors served as the model for treating leukemia in children.
Some day, one hopes, such expensive advances, paid in the blood of innocents as well as the gold of governments, will no longer prove necessary. But for the moment, the essential tension embodied by the Nobel prizes, with their celebration of life paid for by the archetypal merchant of death, continues to vex us.