Oncology Times

Musings of a Cancer Doctor

Wide-ranging views and perspective from George W. Sledge, Jr., MD

Thursday, July 28, 2022

By George W. Sledge, Jr., MD

The 2022 American Society of Clinical Oncology Annual Meeting is behind us, leaving us with presents large and small in the breast cancer field. It was, of course, the first time we had met in person since 2019, and I sensed real enthusiasm (and some mild trepidation) about this first post-pandemic get-together with 30,000 of my closest friends (with another 11,000 or so sharing the proceedings online).

The clear “Best in Show” for breast cancer, and perhaps for the field, occurred in the plenary session where Shanu Modi, MD, presented the results of the DESTINY-Breast04 trial of trastuzumab deruxtecan in HER2-low metastatic breast cancer.

By way of background, we have known for 3 decades that HER2 protein expression represents a continuum rather than a sharp yes/no in breast cancer. However, early studies with trastuzumab suggested that only patients with high (3+ by immunohistochemistry) HER2 expression in their cancers, or those with FISH-positive tumors, would respond to HER2-targeted therapy.

While this dichotomy between what we call “HER2-positive” and “HER2-negative” was rational in the naked monoclonal antibody scenario, more recent technologic advances, culminating in this year's ASCO presentation, have forced us to reconsider what we mean by “HER2-positive.”
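
Since “HER2-low” will come up repeatedly below, here is a minimal sketch, in Python, of the three-way classification the field has moved toward: IHC 3+ (or IHC 2+ with ISH/FISH amplification) is HER2-positive, IHC 1+ or IHC 2+ without amplification is HER2-low, and IHC 0 is HER2-negative. The function and its names are my own illustration, not a clinical tool.

```python
# Illustrative only: the conventional HER2 trichotomy discussed in this post.
from typing import Optional

def her2_category(ihc: int, ish_amplified: Optional[bool] = None) -> str:
    """Map an IHC score (0-3), plus an ISH/FISH result when needed, to a label."""
    if ihc == 3:
        return "HER2-positive"
    if ihc == 2:
        # IHC 2+ is equivocal on its own; ISH/FISH breaks the tie.
        if ish_amplified is None:
            return "equivocal: ISH/FISH required"
        return "HER2-positive" if ish_amplified else "HER2-low"
    if ihc == 1:
        return "HER2-low"
    return "HER2-negative"  # IHC 0

print(her2_category(1))         # HER2-low
print(her2_category(2, False))  # HER2-low: the DESTINY-Breast04 population
print(her2_category(3))         # HER2-positive
```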

Prior Phase II data had suggested that many breast cancers with lower levels of HER2 expression (1+ or 2+ by IHC, regardless of estrogen receptor status) might well respond to HER2-targeted therapy with the relatively novel antibody-drug conjugate (ADC) trastuzumab deruxtecan. This agent, of course, is well-known to physicians who treat breast cancer, given its approval for HER2-positive patients in more advanced metastatic settings. The drug previously showed superiority to the other FDA-approved HER2-directed ADC, trastuzumab emtansine, in a Phase III trial (DESTINY-Breast03), and as such has become a go-to drug in the post-trastuzumab/pertuzumab setting.

DESTINY-Breast04 randomly assigned patients with low (1+ or 2+) HER2 expression who had received prior endocrine therapy (if steroid receptor-positive) and 1-2 lines of chemotherapy to receive either a chemotherapy of physician's choice (chosen from a menu of options) or trastuzumab deruxtecan. The results were impressive, with improvements in both progression-free and overall survival; the latter in particular impressed with a median 6.6-month improvement in a field where post-endocrine therapy advances have been pedestrian at best.

Trastuzumab deruxtecan, one suspects, will rapidly become a new standard of care (a phrase I always hesitate to use, but warranted here) in the HER2-low population, though important questions remain regarding its use. In particular, the number of patients with so-called triple-negative breast cancer was relatively small, and while there was no obvious difference between the ER-positive and triple-negative outcomes, more data would certainly be helpful. And, given the emergence of another ADC, sacituzumab govitecan, in the triple-negative space, we will need greater nuance regarding best treatment options in triple-negative-no-longer subgroups. Undoubtedly, we will be subjected to marketing campaigns saying “my ADC is better than your ADC” in the absence of head-to-head comparisons. Given the somewhat scary interstitial lung disease seen with trastuzumab deruxtecan, physicians will need improved management skills with these promising yet toxic agents. What we even call “HER2-low” will require some thought and care; the discussion accompanying the presentation shared data suggesting that pathologists are not particularly agreeable from a concordance standpoint. We can hope that a future ASCO plenary session will deliver answers regarding the adjuvant use of the agent. Time will tell.

Two other presentations examined other antibody-drug conjugates. Hope Rugo, MD, FASCO, discussed the TROPiCS-02 trial that essentially attempted to recapitulate DESTINY-Breast04 with the TROP-2-targeting ADC sacituzumab govitecan in ER-positive metastatic disease. Median progression-free survival (PFS) was improved by only 6 weeks and overall survival results were immature, so for the moment sacituzumab remains in the triple-negative domain.

Ian Krop, MD, PhD, presented early data with patritumab deruxtecan, a HER3-targeted ADC, in relatively heavily pretreated HER3-positive breast cancers, with promising response rates seen in ER-positive, triple-negative, and HER2-positive patients, though as always with ADCs, interstitial pneumonitis is an issue. Worth keeping an eye on, though too early to claim victory.

Though DESTINY-Breast04 stole the show, several other presentations provided important, or at least interesting, insights into the state of the field.

We have known for many years that the PI3K/AKT/mTOR pathway represents an important alternate signaling pathway in estrogen receptor-positive breast cancer, and a principal means by which other receptors cross-talk with the estrogen receptor, promoting resistance to endocrine therapies. This in turn led to the development of several agents targeting this pathway. Two of these (everolimus for mTOR and alpelisib for PI3K inhibition) are already FDA-approved agents for metastatic breast cancer, and AKT targeting (with capivasertib) is far down the development path.

This year saw an important update on capivasertib. Previously, this agent had been shown to improve progression-free survival in the randomized Phase II FAKTION trial; this year's update looked at the value of biomarker analysis in this trial and provided a peek at the survival effects of this agent.

In brief, FAKTION showed improvements in both progression-free and overall survival, both in the 6-month range. These are results from a randomized Phase II trial and, as such, are underpowered, though both suggestive and promising. We'll need to await the results of the Phase III trial with this agent. Of interest, and of real potential importance, capivasertib appeared to exert its greatest effects in the trial subpopulation with pathway alterations (as measured by next-generation sequencing) in PIK3CA, AKT, and PTEN. We need to be careful regarding subset analyses of randomized Phase II trials, but these results are consistent with what we think we know about how this drug should work, and we will again await the Phase III trial biomarker results with great interest.

Staying in the confines of estrogen receptor-positive breast cancer, we also saw an interesting presentation by Kevin Kalinsky, MD, and colleagues touching on the vexing question of management of patients whose cancers have progressed on frontline CDK 4/6 inhibitor therapy. This investigator-initiated, randomized Phase II trial allocated patients to either fulvestrant alone or to ribociclib and fulvestrant. This MAINTAIN trial demonstrated statistically significant improvements in progression-free survival and clinical benefit rate favoring the combination of fulvestrant and ribociclib.

This was a well-conducted and interesting trial that left me confused as to what to do in clinic. Randomized Phase II trials rarely offer definitive results; they are, in essence, parallel Phase II trials that suffer from (over)comparison. In addition, it is uncertain what question this trial actually asked. Most patients had received initial CDK 4/6 inhibition with palbociclib. Does a positive result imply that we should continue CDK 4/6 inhibition when we change from an aromatase inhibitor to fulvestrant, or does it tell us that ribociclib is a better (i.e., less susceptible to development of resistance) CDK 4/6 inhibitor than palbociclib? And though the trial is positive regarding its PFS primary endpoint, overall survival results are immature, and we know from long experience that a modest PFS increase is not a particularly robust surrogate for overall survival.

The MAINTAIN trial did include an interesting biomarker analysis that may prove important. Crossover to ribociclib appeared to be particularly useful in patients with wild-type ESR1; in contrast, mutations in ESR1 (the estrogen receptor gene) largely abrogated the benefit of crossover. Biomarker analyses in randomized Phase II trials, where one deals with ever-smaller subsets and therefore ever-more-dangerous statistical analyses, are inherently suspect. But if this one pans out in larger datasets, it could make the MAINTAIN approach an interesting one for an important biological subset.

Finally, with regard to CDK 4/6 inhibitors, we saw the final overall survival analysis of PALOMA-2, presented by Richard Finn, MD. PALOMA-2 was the first randomized, frontline, metastatic trial of an aromatase inhibitor with or without a CDK 4/6 inhibitor, utilizing, respectively, letrozole and palbociclib. The median overall survivals for letrozole alone and for the combination were 51.2 and 54 months, a result that did not reach statistical significance. The presenter, in a feat of death-defying statistical acrobatics, attempted to convince the audience that this was, in fact, a positive trial. I was unconvinced. The basic issue was that the trial was missing a third of patient survival data, which is usually one of the easier datapoints to assess.

A question raised by the Kalinsky and Finn presentations, as well as by prior palbociclib randomized trials in the adjuvant and metastatic settings, is whether palbociclib is simply the weakest of the three CDK 4/6 inhibitors. I consider this still an open question; cross-trial comparisons are always fraught with difficulty, and palbociclib trials tended to enroll the toughest patients from the standpoint of prior endocrine resistance. That being said, I really wish that the drug had a single unalloyed success with regard to either disease-free survival (in the adjuvant setting) or overall survival (in the metastatic setting).

The early disease session held less interest for me this year, as there were no great adjuvant breakthroughs to be seen, though there were several interesting biomarker analyses. To me, the most interesting early disease trial was the LUMINA trial, presented by Tim Whelan, MD, which tested our ability to eliminate post-lumpectomy radiation therapy in patients with smaller (T1a and b) Luminal A breast cancers.

To make a long story short, this prospective registry trial suggests that for patients aged 55 and older with T1N0 Grade 1 or 2 invasive ductal Luminal A cancers (defined pathologically as ER ≥1%, PR >20%, HER2-negative, and Ki67 ≤13.25%) and negative margins of resection receiving endocrine therapy, omitting radiation therapy is a reasonable option, with a 5-year rate of local recurrence of 2.3 percent. While these results are compelling, median follow-up is still relatively short. More importantly, the devil is in the details here: careful pathology is a necessity if one is to follow this approach.
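
Because the eligibility rule is easy to lose in prose, here it is written out as a small Python predicate. The field names are hypothetical, and the thresholds are simply those quoted above; this is an illustration of the rule's logic, not a clinical decision tool.

```python
# Illustrative sketch of the LUMINA-style omit-radiation criteria quoted above.
from dataclasses import dataclass

@dataclass
class Case:
    age: int
    stage: str            # e.g., "T1N0"
    grade: int            # histologic grade, 1-3
    histology: str        # e.g., "invasive ductal"
    er_pct: float         # estrogen receptor positivity, %
    pr_pct: float         # progesterone receptor positivity, %
    her2_negative: bool
    ki67_pct: float
    negative_margins: bool
    on_endocrine_therapy: bool

def lumina_like(c: Case) -> bool:
    """True if the case matches the criteria described in the text."""
    luminal_a = (c.er_pct >= 1 and c.pr_pct > 20
                 and c.her2_negative and c.ki67_pct <= 13.25)
    return (c.age >= 55 and c.stage == "T1N0" and c.grade in (1, 2)
            and c.histology == "invasive ductal" and luminal_a
            and c.negative_margins and c.on_endocrine_therapy)
```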

How do we put all this together? Looking from the 20,000-foot view, trastuzumab deruxtecan's impressive overall survival results in the HER2-low metastatic setting in DESTINY-Breast04 were the clear winner in 2022, and the sole truly practice-changing result. It was a win, not just for a specific drug, but for the concept of applying antibody-drug conjugate therapy to a broader patient population than what we normally think of as the biomarker-selected population. I suspect it will cause several companies to revisit their drug development strategies (sometimes for better, sometimes for worse). Numerous companies are developing such agents, and we'll see many of them at future ASCO Annual Meetings.

On a personal note, it was good to be back in Chicago and to see old friends in person. Indeed, there were people with whom I had interacted regularly via Zoom over the past 2 years, but whom I saw in the flesh for the first time in Chicago. That it was a super-spreader event for COVID-19 I do not doubt; several colleagues and friends came down with the virus on their return. But that did not diminish my pleasure in sitting in the audience, and I look forward to sharing future years with all of you, even if I continue wearing a KN95 mask.

Tuesday, June 7, 2022

By George W. Sledge, Jr., MD

My wife and I were walking after dark on a sidewalk near our home. Suddenly—and suddenly is the word—we both stopped. A few feet ahead we saw the sinuous form of what, in the dark, seemed to be a snake. But it wasn't, as a few seconds' perusal confirmed. Instead, it was the husk of some plant, or rather two husks which had fallen close to each other, and which our brains had interpreted as a snake. There is something curious about two people in Palo Alto, neither of whom had seen a snake in years, simultaneously imagining one out of some dead plant parts.

Yet this is what we are programmed to do. There is something in evolutionary biology called the Snake Detection Theory, which suggests that the threat of snakes played an important role in the creation of the primate brain. Our distant primate ancestors, swinging through trees, safely above the level of most other predators, were still easy pickings for whatever boa or viper happened by. Early detection of snakes was a potent driver of survival of the fittest.

Or so the theory goes. And it seems a pretty good theory, as pretty much every primate species, humans included, has the ability to identify snakes almost instantaneously. Far faster, indeed, than our ability to recognize other dangerous predators.

Seeing those non-snake husks flooded my brain with memories of snakes past, memories I had not accessed in decades. Late in my teens, my family moved into a new home in a suburban development surrounded by empty, open fields. When we moved in, the lot next door was basically reclaimed Midwestern cow pasture. One early evening, as my father and I stood outside, a small, innocuous-looking garter snake wandered out of the next-door lot onto our driveway.

My father, the calmest human I've ever known, immediately freaked out. He ran into our garage, pulled out a garden hoe, and proceeded to club the poor creature to death. I tried to talk him out of it. Garter snakes are hardly dangerous. They eat bugs, not suburbanites. But to my dad, who had grown up on a farm in lowland North Carolina, this was something primal, something he just had to do. It was the only time I can ever remember him losing control.

Looking back, it's clear to me that the monkey brain part of my dad was experiencing fear of that poor little non-venomous creature. Fear of snakes is exceptionally common, occurring in as many as half of humans, and it doesn't appear to matter whether you grew up in the Bronx or in the Appalachian Mountains. Then there is ophidiophobia, a true paralyzing phobia around snakes. I don't think dad had that, but it was something close. Fear, as all know, is the father of hatred.

And then another memory, close on the heels of the first. I trust this one a bit less, so feel free to consider this a (potential) tall tale. Once, when I was visiting relatives in North Carolina as a young child, my uncle took his sons, my brother, and me along to property he had near a lake. To get to the lake, we traversed dense sodden undergrowth in a dark forest. My uncle, a very tough guy, led the way. At one point he stopped, yelled “stand back,” and stamped his foot down. And then—and here is where I do not know if this is real or manufactured memory—he leaned down and cut off a snake's head with a machete. “Water moccasin,” he said, matter-of-factly, threw the head and tail in different directions, and on we went. The story seems so preposterous that I hesitate to share it, but the memory is strong. All I can say is that, if it had been me leading the way, I would have run screaming in the opposite direction had I seen a water moccasin. They are nasty, dangerous creatures.

My wife and I got talking about this and she immediately came up with three more snake stories from her younger years. I suspect that this is something commonplace, this memory for snakes. I've never been harmed by a snake, in contrast to dogs (dog bite), cats (cat scratch), bees (bee stings), or mosquitoes (too many bites to count, though no malaria or West Nile virus yet), but snakes somehow stick in one's memory like almost no other animal. And I still love dogs.

If you live in the United States, it is hard to die of a snake bite. On average, around 7,000-8,000 people per year receive venomous bites, of whom only about five die. Crosswalks are far more dangerous, with more than 6,500 pedestrian deaths per year. And many snakebite deaths appear readily avoidable. Wikipedia (what did I do before Wikipedia?) has a page devoted to snakebite deaths by decade.

The last decade includes descriptions such as “[Name] was bitten on the right hand during a service at his Full Gospel Tabernacle in Jesus Name church” and “While camping at Sam A. Baker State Park in Missouri, [Name] walked outside, saw a snake, and brought it to his son's attention. When he picked it up, the snake bit him.” Note to self: don't pick up venomous snakes. And then my favorite: “He was likely killed by one of the 24 venomous snakes he kept in his home.” Addendum note to self: don't fill house with potentially lethal animals.

But outside the United States the story changes. The World Health Organization estimates that some 5.4 million people are bitten per year, and that around 81,000 to 138,000 people per year die of snakebites. And the deaths are not the whole story: irreversible kidney failure, permanent disability, and limb amputation occur at 3 times the death rate. Unsurprisingly, the toll of snakebites is greatest among agricultural workers in low- and middle-income countries, and children are particularly vulnerable. The victims are precisely those unlikely to have access to hospitals and antivenoms. Snakes still exert Darwinian pressures in the modern world.

You might think that snake venom would be a good source of novel therapeutics. And yet, perhaps because medicinal chemists are smart and don't like handling poisonous snakes, you can only point to a very small number of medicines derived from snake venom, and only one of these (captopril, the first ACE inhibitor) can I ever remember having prescribed to a patient. I cannot find any evidence that a snake venom derivative has ever been used for cancer, though an interesting recent paper suggests a possible COVID-19 application. Alas, vaccines, monoclonals, and Paxlovid appear to have cut off the path to FDA approval for this promising agent. Or the researchers got snake-bit and left the field.

Finally, of course, there is the snake as symbol. In Judeo-Christian mythology, snakes are evil, starting with Eve's interaction in the Garden of Eden, but also in phrases such as the Psalms' evil men who “make their tongue sharp as a serpent's, and under their lips is the venom of asps.” In the New Testament, the Pharisees are called a “brood of vipers” because of their hypocrisy. Snakes get a lot of bad press in the Bible, and this has reverberated throughout Western culture. Shakespeare's Macbeth, for instance, is full of snake imagery.

And snakes get a bad rap in common culture. We refer to someone as a snake in the grass, or say that another is a rattlesnake, or that a vicious person is full of venom, or that a particular office environment is a real snake pit, or that some con man is a snake oil salesman. The only relatively kind cultural reference for snakes that comes to mind is the feather boa worn as a fashion accessory, and even that doesn't give me positive feelings about boa constrictors, which I can easily imagine choking the life out of me. In contrast, while I wouldn't want to go mano-a-urso with a grizzly, I have no trouble calling someone a big teddy bear as a sign of his essential sweetness.

Or, if you need further evidence of how the world thinks about snakes, look at an endless stream of “B” movies. Here I reference Tubi, my favorite online source of guilty pleasure trashy movies, with names and thumbnail sketches: Venomous, wherein “a virus transmitted from snakes mutated by a terror attack on a government lab brings on a quarantine as the snakes have come above ground to attack”; Snake Outta Compton, in which “a rap group…is a city's only hope in a battle with a giant mutating snake monster”; Snakes on a Train (enough said); King Cobra (“30 feet of pure terror”); Python, where “a military cargo plane carrying a smart, prehistorically massive python crashes in suburbia”; and finally, Boa vs. Python, because apparently python alone wasn't enough. In contrast, put “cows” into Tubi's search box and you get Village of Smiling Cows and The Moo Man.

One of the interesting things about these movies is not just that snakes are dangerous, but that they have the power to transform. In Snakeman, a research group searching for the fountain of youth in a jungle stumbles across its giant guardian snake, whose venom transforms its victims into (you guessed right) snakemen. The movies in which people are transformed into snakes are numerous: The 7th Voyage of Sinbad, Conan the Barbarian, The Golden Child, and Night of the Cobra Woman, among many others. These movies, in turn, reflect an ancient tradition of snakes as transformative creatures, the shedding of skin symbolizing the potential for regeneration and immortality.

Indeed, many ancient cultures revered snakes as religious icons. Basmu was an ancient Mesopotamian fertility goddess. The Norse had Jormungandr, whose venom does in Thor during Ragnarok, the Scandinavian doomsday. The Egyptians had no fewer than five snake deities: Wadjet, Renenutet, Nehebkau, Meretseger, and Apep.

While some of these were scary deities (don't ever, ever mess with Renenutet, who breathes fire and can still the hearts of men with a single glance), Wadjet is the guardian of children and childbirth, and Meretseger is the goddess of mercy, though she did apparently take a dim view of grave robbers. Good folks, anyway, Wadjet and Meretseger. The world was full of snake-like divine beings revered by this or that religion: you respected snakes in the old days, even as you bowed before them and feared them.

And then, of course, there are the Greeks. Unlike Basmu, Meretseger, and Jormungandr, who retain few worshippers, the Greek snake myths continue with us to this day. Medusa, whose hair was replaced with venomous snakes, could turn you to stone if you stared into her eyes, which put something of a damper on her social life. Amazon Prime is running a Medusa commercial, including Nicki Minaj lyrics, in which the ancient turn-them-to-stone curse is lifted through Medusa's purchase of stylish sunglasses from Amazon Prime. A television commercial celebrating a two-and-a-half-millennium-old myth is immortality of a sort.

But doctors everywhere are linked to ancient Greek snakes through the rod of Asclepius, the snake-entwined staff of the Greek god of medicine. That staff is now the global symbol for medicine, included in the logos of health organizations everywhere, such as the World Health Organization, the American Medical Association, and my own Stanford University School of Medicine.

The origins of the Rod of Asclepius are hidden in the distant past, but in the healing temples of ancient Greece (called asclepeions) non-venomous snakes slithered freely through the dormitories where the sick and injured slept. I'll leave you, dear reader, with that vision: imagine yourself rounding some morning in an asclepeion. Wear high boots and bring a rod. And remember that we in the medical profession honor the old ways, snakes and all, even as we create the new. Someday my snake will eat your crab, cancer.

Tuesday, March 22, 2022

By George W. Sledge, Jr., MD

We're exhausted, dismayed, and angered over the way our lives have gone off track these past 2 years. COVID-19 never seems to end, and the new normal is distinctly abnormal. Leaving aside the minor inconveniences, the effects on society have been enormous. To name a few: economic disruption (job losses due to shutdowns, supply chain issues, stock market flip-flops), social disruption (increased murder rates, increased crimes against women and people of Asian descent), political polarization piled on existing political polarization to the detriment of basic public health measures, schools closed and 6-year-olds trapped at home with working parents whose interaction with their co-workers occurs via Zoom conferences. But more than anything else, a nagging sense that nothing is safe and predictable anymore has led many to live in a permanent state of unease.

All of us in the health care professions know the human toll, the lives lost, the exhausted ICU nurses and ED physicians, the noisy ventilators, and the sad family conferences. And all this followed by the long COVID symptoms that we are just beginning to understand and will wrestle with for years. We all have stories we could share, but we are too tired to care anymore.

Already 2019 seems like a different era, some facsimile of an old Jetsons Saturday morning children's cartoon, when life was dominated by technologic advances and Mother Nature was just one minor special interest group, recognized but usually ignored by a populace enamored of the latest shiny new gadgets. An era when one could go online and get a ticket to Bangkok or Paris, stand in line at a crowded airport, sit next to someone with a mild cough, fly overseas for 12 hours and think nothing of it.

That was BC: Before COVID. In the After COVID (AC) universe, a trip to the local grocery store can seem fraught with hazard. Fly overseas to a medical conference? Forget it. Get on a cruise ship, one of those floating coronavirus hot boxes? Are you nuts? We no longer have the freedoms we took for granted, and many of the limitations are self-imposed, derived as much from our own psyches as from public health departments or the CDC.

What's worse, we no longer trust our fellow citizens. Is the woman standing behind me at the checkout line, her homemade cloth mask covering only her chin, vaccinated and boosted? Is her child's cough some mild rhinovirus or the latest highly transmissible variant? Will I get in an argument over mask wearing at my child's school? I never used to worry about someone's flu vaccine status—perhaps foolishly—but now I ask every patient I see in clinic about their COVID shots.

And underlying everything else, this question: when will it all end? A once-in-a-century pandemic leaves us with few relevant historical comparisons. The last great pandemic was the 1918-1920 Great Influenza, which killed somewhere in the range of 50-100 million people in a world less populated than today. That pandemic came in four waves: a mild one with the characteristics of a regular flu season, followed by the major killing fields of late 1918, followed by another significant outbreak that killed tens of thousands in the U.S. in 1919, followed by a 1920 series of localized but lesser outbreaks. So roughly 3 years of fear and deaths, and then back to normal.

Of course, that flu never actually went away. It mutated into something less dangerous, its progeny continuing until today. It still kills in the thousands per year, though fewer in the era of COVID: masks work. Perhaps influenza's slide from pandemic to endemic is our fate with COVID-19. Or perhaps, and this is devoutly to be yearned for, deliverance will come in the form of a universal COVID vaccine, as some have suggested.

Either of those two outcomes would be welcome. But history offers other, more dismal scenarios. Imagine, if you will, that you are miraculously transported to Stratford-upon-Avon in 1564. If you attend Holy Trinity church on April 26, you will see the baptism of William Shakespeare. The parish register for that date, probably a few days after his birth, lists him as “Gulielmus filius Johannes Shakspere.”

But don't linger there. On July 11, the same parish register informs us, “Here begins the plague.” After that there is a steady drumbeat of plague-related deaths lasting through the end of the year, including many young children. Shakespeare was lucky.

Follow him as he moves to London, sometime in the late 1580s, and the plague is waiting there as well. In 1592, the plague killed roughly one Londoner in twelve. The city authorities had strict rules shutting down the playhouses and other public spectacles once plague deaths reached 30 per week, so young Shakespeare uses his down time to write his poem Venus and Adonis, which earns him the patronage of the Earl of Southampton. The poem includes these lovely lines: “Long may they kiss each other, for this cure/…To drive infection from the dangerous year/That the star-gazers having writ on death/May say, the plague is banish'd by thy breath.” Or, as in the case of COVID, “spread by thy breath.”

In 1603, the plague came again in a big way, killing 30,000 people in a city of 200,000, and the playhouses were shut down to prevent spread of the infection. And in 1606, the plague returned, and again Shakespeare and his company faced government-enforced shutdown and loss of income.

Shakespeare's company, originally the Lord Chamberlain's Men during Queen Elizabeth's time, then the King's Men (for they were favorites of King James), left London and toured the provinces when the playhouses were shut down. But if one stayed in London, and the Black Death was known to have afflicted a family member, one's house was boarded up for a month and the house's occupants forbidden to leave. In James Shapiro's wonderful book The Year of Lear, subtitled Shakespeare in 1606, the author makes the case that the playwright's landlady may have died of the plague at the age of 40. If so, as in 1564, Pasteurella pestis may have come too close for comfort.

Shakespeare mentions the words “plague” and “pestilence” in his plays, though usually not as the centerpiece of the works. Perhaps the memories are too painful for the citizens of London. But the plague is frequently there in the background, as in real life. Take Romeo and Juliet, for instance. When Romeo's friend Mercutio is stabbed and lies dying, he says “a plague a' both your houses.” To modern ears, this is just a witty metaphor, but if you were attending the Globe Theatre then, how would it have sounded? You've seen friends die in days, purulent painful buboes turning lymph nodes into torture chambers. You are scared for your life just thinking about it. Perhaps the phrase would be the equivalent of saying something like, “I hope you both die of cancer.” If you lived through 1603, when 15 percent of the population perished in a matter of months, this was no innocent metaphor. It was a curse.

Later in the play, in what will have tragic consequences for the young lovers, a letter is not delivered to Romeo in time. The reason is the plague: “the searchers of the town,/Suspecting that we were both in a house/Where the infectious pestilence did reign,/Sealed up the doors, and would not let us out.” Quarantine is a very old idea, and for Shakespeare's audience a living presence.

The plague would not disappear from London until after 1666, when the Great Fire of London, and the subsequent rebuilding of much of the city in brick rather than wood and thatch, made it less friendly to the fleas and rats that transmitted the plague.

Here's the thing that amazes me: the Black Death first arrived in England in 1348. 1603 was 255 years after that, and the bubonic plague had another half century or more to torment the citizens of England. Two hundred fifty-five years ago was 1767; imagine COVID arriving then and coming back every few years, killing only a few in “good” years and 15 percent in the bad years. And having a half century more to go before disappearing for good. What an intensely depressing thought. But that was the world in which Will Shakespeare lived.

If there is any cheer to be derived from the infectious disease history of Shakespeare's times, those long-ago plague years, let it be this: in 1606, when the Globe Theatre shut down for most of a year, Will Shakespeare didn't goof off. Shakespeare scholars believe that he penned Macbeth, Antony and Cleopatra, and King Lear, all within a 12-month period. Has any writer ever had a better year?

Or, perhaps, consider late 1665. This is the last time that the plague will visit England, but its spread has temporarily shut down Cambridge University. The students are sent home until things cool down in 1666 and classes resume. One of the students, young Isaac Newton, goes home to Woolsthorpe Manor. In 1726, near the end of a not-too-shabby scientific career, he shares this anecdote with his biographer: “After dinner, the weather being warm, we went into the garden & drank tea under the shade of some apple trees… he told me, he was just in the same situation, as when formerly, the notion of gravitation came into his mind…occasioned by the fall of an apple, as he sat in a contemplative mood.”

Somewhere, in some lab, or perhaps in a crowded apartment, some grad student or postdoc whose life has been put on hold by the pandemic is sitting in a contemplative mood. Newton's apple, I want to believe, has fallen out of the sky and plunked this smart kid on the head, and we just haven't heard about it yet. Perhaps he or she is finding the cure for cancer or working out the development of the universal vaccine that will end COVID-19 once and for all, but in any event creating something that will transform the world. There is, after all, precedent; the greatest playwright in human history and the human race's greatest physicist both allow us to hope that something good will come out of all this pain and suffering.

Wednesday, March 9, 2022

By George W. Sledge, Jr., MD

Over the holidays I watched Peter Jackson's documentary The Beatles: Get Back, as well as a separate documentary in which the producer Rick Rubin spoke one-on-one with Paul McCartney about his music. The two shows were a revelation for an old Beatles fan.

In my generation, you thought either the Beatles or the Rolling Stones were the greatest rock band of all time. As a tried-and-true Beatles fan, I consider the years from 1964 to 1969 to be the peak period for this genre. Similarly, the breakup of the Beatles in 1969 was, for many in my generation, a tragedy of epic proportions, somewhere in the range of the sinking of the Titanic.

Jackson's long documentary—an exhausting 9 hours—chronicles the near end of the band's great run. The four musicians are tired of each other, and the band is clearly on its last legs. John Lennon is accompanied to the sessions by Yoko Ono, a simultaneously quiet but obtrusive presence. He has largely checked out of the creative process and is (as we now know) a heroin addict. George Harrison sulks, having spent much of the last few years standing in the background while Lennon and McCartney (the greatest songwriting team of all time) have worked their wonders. He is ready to shine, creatively, but feels ignored, to the extent that he walks away from the sessions, temporarily quitting the band. Ringo Starr has the look of someone grieving over a loss that has not yet occurred but soon will. McCartney, meanwhile, is trying to hold everything together, but in a bossy way that irritates the others.

Around them float various hangers-on: producers, bean counters, a ridiculous documentary maker who films their every word (often without their knowledge, using hidden microphones), managers, technicians, girlfriends (it is not just Yoko who makes her presence felt). The musicians are the center of the creative process, but they rely on a not-insignificant infrastructure.

Soon they will compose their last collective music, and The Beatles will depart the stage: now they belong to the ages, one wants to say of their passing. It is a profoundly depressing piece, a sad tale of breakup as something special falls apart under the strain of a decade of continuous travel, composing, and recording. Though still in their twenties, they have begun to diverge in their private lives, John with Yoko Ono, Paul with Linda Eastman (soon to be McCartney). They are ready to move on. Perhaps inevitable, but maybe not.

Watching them filled me with sadness, and not just because of the band's breakup. We know how all this turned out, and what happened to them in the following decades. Lennon will be murdered outside The Dakota in Manhattan, done in by a mediocrity with a handgun. Harrison, who smokes throughout the sessions, will first develop head and neck cancer, survive that, be stabbed by a deranged fan, survive that, and then succumb to non-small cell lung cancer, developing brain metastasis. Linda McCartney, the love of Paul's life, and a talented photographer in her own right, will develop an aggressive breast cancer, receive both standard and high-dose chemotherapy in the 1990s, and then develop liver metastases. She and Paul will go horseback riding 4 days before she dies at the age of 56. Paul will then go to a very dark place for several years.

The oncologist in me always wonders whether things would have turned out differently had Harrison and Linda McCartney developed their cancers at a later point in time. Would Harrison's cigarette-induced cancer have responded to a checkpoint inhibitor or a third-generation EGFR inhibitor? Would McCartney's aggressive cancer have responded to a novel kinase inhibitor or monoclonal antibody or antibody-drug conjugate, the transplant bludgeon replaced with a precision medicine stiletto? Might it have been cured in the adjuvant setting? Modern medicine is full of “what ifs," such is the pace of modern discovery. So, watching Get Back is filled with anticipatory sadness, both for the lost music and the premature deaths we know are coming.

And yet, and yet… If this is the Beatles at their worst, at their low point, they are still something quite miraculous, conjuring great music out of thin air. Their creative process generally starts with one of them pitching, not a song, but part of a song. They spend hours bouncing lyrics and tunes back and forth, often giving up on a song only to come back to it later, finally finishing triumphantly with something that barely resembles the original idea. Get Back, the documentary's title, is also a recurring work-in-progress song throughout the film, right up until it is presented to the world in a rooftop concert. The song itself is interrupted by the London police, called by uptight neighbors on Savile Row who fail to appreciate history in the making.

Watching them create is, as I said earlier, a revelation. The parallels to the creative process in the sciences are obvious. It goes something like this: the act of creation never occurs in a vacuum or a desert. One puts in long periods of hard work learning the basics of the field. In the Beatles' case, this was thousands of hours in bars in Hamburg, Germany, learning their craft. It is the basis for what comes after; you only know what the right questions are once you have spent enough time on the stage.

While history is replete with brilliant individual scientists, my experience in clinical research has been that the best ideas come from the prolonged interaction of talented individuals with overlapping skill sets, the equivalent of having keyboard, guitar, and drum players riffing on a particular musical theme, improvising, altering until they find just the right note, the right word or phrase. The musicians or scientists in turn are supported by a well-developed infrastructure which is, like the Beatles' Abbey Road recording studio, agnostic to the ideas being studied yet crucial to their success: biostatisticians, research nurses, data managers, and the like. And the patients, of course, the patients who lay their lives on the line waiting for us to get smarter.

And sometimes—this happens in the middle of the documentary—one goes outside the original group to add some crucial yet thoroughly unanticipated element to the mix. In the documentary, this is the blues musician Billy Preston, an old friend of the Beatles from their Hamburg days, who drops by to say hello and ends up being conscripted into the act of creation. The joyous energy he brings to the lethargic group transforms their creative process.

There is also a wonderful moment in the documentary when one of the band members wonders what it would sound like to bang on an anvil at a particular point in a song. The anvil mysteriously appears, conjured out of the ether by their road manager Mal Evans, and the sound Ringo Starr makes with it now graces the Beatles' Maxwell's Silver Hammer. Sometimes, the small tweaks matter.

And finally, of course, there is the unexpected. The creative process is essentially messy. It is not something you can apply process engineering to and hope for anything other than mediocrity. The old Six Sigma idea—the rigorous removal of errors from a process, leaving an enterprise clean and pristine—is antithetical to the creative process. In Paul McCartney's interview with Rick Rubin, he said something that stuck with me: “I specialize in audacious mistakes.”

I suspect that in the clinical enterprise, as opposed to the laboratory or (sometimes) in the best clinical research, we make far too few audacious mistakes. One can understand why: audacious mistakes attract lawsuits. They fail more often than they succeed. But they are crucial for creativity, and perhaps this is part of why the American health care system is, ultimately, a failure, a disastrous mess providing expensive yet ineffective care. We don't experiment enough, we don't dream enough, we don't audaciously mistake our way to the actually valuable. But we are great at wringing incremental gains out of a system created by a dysfunctional health insurance apparatus designed to maximize enterprise profits for every stakeholder except the patient.

Anyways, that's what I think. But feel free to ignore anything I've said if you are more of a Stones than a Beatles fan. Or if you don't know who they actually were. In which case, I pity you.

Thursday, December 9, 2021

By George W. Sledge, Jr., MD

Sometime around 120,000 years ago, an early member of our species fashioned a set of what archeologists think were clothes-making tools. The tools, fashioned from bone, were recently excavated from the charmingly named Smugglers' Cave in Morocco. They are, as far as we can tell, the first identifiable tools specifically made to produce clothes.

Things moved at a slower pace in those days, so we can safely say that had Popular Science or Scientific American been around this would have qualified as the most important invention of the millennium. If your ultimate goal as a species was to expand from your African heartland to the Eurasian land mass and beyond, then having a clothes-making toolkit certainly qualified as a major achievement.

Of course, Popular Science wasn't around, nor writing, nor any oral tradition we might rely on, so we can't identify the inventor. But we can identify with that nameless person, because making new things, useful things, is what we do as a species. One suggested alternative name for Homo sapiens is Homo faber, Latin for “tool man,” or if you prefer, “man the maker.” Homo faber was sitting by a fire in Smugglers' Cave. Though it may well have been “tool woman,” not “tool man.”

I have a fascination with technology. I'm not particularly handy, but tools fascinate me, as does their history. Take the humble screwdriver. The principle of the screw itself has been around a long time, first described (and perhaps invented) by Archimedes in the third century BC, at which time it was used as a screw pump to lift water from the Nile. From ancient Greco-Roman times through to around the 15th century, the principle of the screw was used primarily in olive and grape presses. Perhaps Archimedes, like many basic scientists, wasn't all that great at translating things from brilliant concept to public utility. No VC firms were around in 250 BC. The screw was a great idea stuck in mediocrity, a failure of imagination and inadequate development.

Screws as fasteners didn't come into their own until late in the Medieval era, when they started being used to put pieces of armor together; basically, they were military technology. We get some sense of this in Shakespeare's Macbeth, where Lady Macbeth tells her wavering husband to “screw your courage to the sticking place/And we'll not fail.” It's an odd-sounding phrase to modern ears. Literary scholars suggest that the imagery refers to a crossbow, where a string is pulled taut by turning a wooden screw to its fullest extent, hence the “sticking place,” before firing the crossbow bolt.

Screwdrivers were invented somewhere in Germany or France. Their first recorded appearance is in the Housebook of Wolfegg Castle, an illustrated DIY handbook written sometime between 1475 and 1490. As with the proto-technologists of Smugglers' Cave, we don't know the inventor.

Early screwdrivers were slotted, or flat-blade. We still have them in our toolkits. Both screws and screwdrivers were little used until the Industrial Revolution, when Job and William Wyatt of Staffordshire, England, created a screw-making machine; by 1780 their 30-worker shop was making 16,000 screws a day. Screws were suddenly cheap and ubiquitous.

The Phillips screw, and the Phillips screwdriver, also to be found in my toolkit, weren't invented until the 1930s. The Phillips screw, named after the American industrialist Henry F. Phillips, is now the most popular one on the planet, widely used in a myriad of industrial processes and in every home. It is the perfect industrial invention primarily because the screwdriver self-centers on the screw, something the flat-blade screwdriver could not do, and therefore does not slide off the screw on a busy assembly line. Phillips convinced General Motors to use the screw in Cadillacs, and the rest, as they say, is history. At least we know the name of the inventor. It wasn't Phillips, who bought the design from an engineer named John P. Thompson.

The names don't matter much, of course, but the lowly, homely—literally homely—screwdriver is a pretty good template for toolmaking by Homo faber. It began with a brilliant scientist conceptualizing (or describing or inventing: we're not sure) the screw, but for something like 1,800 years no one could think of any use for it beyond crushing grapes. Its use as a fastener required some unknown tinkerers, one of whom made the first screwdriver.

Even then, it was another three centuries before anyone figured out how to make cheap screws, and yet another century before someone made the intellectual leap to the modern cruciform screw-head and its cognate screwdriver, perhaps its ultimate practical destination. Our tools evolve, often in response to our needs, definitely as part of a larger body of knowledge and experience, through progressive experimentation and tinkering, until they become so commonplace that we never give them a moment's thought.

One of my favorite books is Francis Bacon's Novum Organum, first published in 1620. It outlines the agenda for the modern scientific revolution. I sometimes think that, were every other scientific textbook to disappear, and a mind-wipe cleared the scientific method from our collective consciousness, we would be fine if Novum Organum remained. I'm always surprised how few scientists have ever heard of it. The men who created the Royal Society considered Bacon's work to be the foundation stones of modern science.

Bacon's work is full of pithy aphorisms. “Knowledge is power” is the best-known, though I have always liked “Nature, to be commanded, must be obeyed.” But Novum Organum is important primarily because it champions observation and experience over pure reasoning, particularly observation used to test a hypothesis. Central to this testing is our use of tools. Again, quoting Novum Organum: “Neither the naked hand nor the understanding left to itself can effect much. It is by instruments and helps that the work is done, which are as much wanted for the understanding as for the hand.”

As much wanted for the understanding as for the hand. I love that phrase. When I think of what has happened to the biologic sciences during my career, and in consequence to the medical sciences, it is the new “instruments and helps" that have mattered the most. Technology is science in action. For better or worse, and sometimes both.

I'm a breast cancer doctor. If I were asked to point to the greatest single advance in my field in the past 30 years, I could point to trastuzumab for HER2-positive disease, the first in a long line of targeted therapies that have transformed the disease. But I would be wrong. The real advance was a tool: monoclonal antibodies made in hybridomas. I was a resident when Kohler and Milstein created the first hybridoma, and (the innocence of those days) refused to patent the idea. By the time I became a fellow, in the early 1980s, my mentors were already discussing all the current uses: as a tool for pathologists and laboratory scientists, and as a therapeutic both in its naked form and as an antibody-drug conjugate. It took a while, but the future was implicit in the act of creation. It was the tool that created all the other tools that made progress possible.

And hybridomas were just the beginning. Polymerase chain reaction (PCR) technology appropriately won its inventor a Nobel prize, as did CRISPR/Cas9 gene editing its inventors. Neither is the equal of the decoding of the double helix in terms of basic scientific discoveries. PCR was a fairly minor riff on what was already known in terms of the biology of life, a cobbling together of existing techniques. CRISPR, let's face it, is important to bacteria but not very important in human biology. Its value, like that of PCR, is as a tool. And what great, transformative tools both are, as much wanted for the understanding as for the hand.

What will be the next great tool in our scientific toolkit? CRISPR was an immediate hit, as were monoclonals. Other technologies may represent slow burns, perhaps not requiring the same time scale as the evolution of the screw/screwdriver dyad, but not overnight successes either. One current technique that fascinates me is the recently described re-engineering of the genetic code to create novel types of proteins. For those of my readers who have not seen this, read either the original article by Robertson and colleagues in Science (2021; 372:1057-62) or the elegant discussion by Liu in the New England Journal of Medicine (2021; 385:1045-49).

The basic idea is simple yet fascinating. As we were all taught in high school biology, the 20 amino acid building blocks are the progeny of 3-nucleotide codons. There are 61 “sense” codons, with multiple codons coding for the same amino acid. This redundancy dates to some early point in evolutionary history but isn't really necessary for life. Robertson and colleagues “liberated” three of these redundant codons and assigned a new, unnatural amino acid to each. This in turn creates novel proteins that have never existed in the history of life. How cool is that?
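
The codon arithmetic is easy to check for yourself. Below is a short Python sketch using the standard genetic code table; the count of “spare” codons is my own back-of-the-envelope illustration of the redundancy Robertson and colleagues exploited, not a description of their method.

```python
# Count codons, sense codons, and redundancy in the standard genetic code.
from itertools import product
from collections import Counter

BASES = "TCAG"  # conventional textbook ordering
# Amino acids for all 64 codons (NCBI translation table 1); '*' marks stop.
AMINO = "FFLLSSSSYY**CC*WLLLLPPPPHHQQRRRRIIIMTTTTNNKKSSRRVVVVAAAADDEEGGGG"
CODE = {"".join(b): aa for b, aa in zip(product(BASES, repeat=3), AMINO)}

sense = {codon: aa for codon, aa in CODE.items() if aa != "*"}
per_aa = Counter(sense.values())
print(len(CODE), "codons;", len(sense), "sense;", len(per_aa), "amino acids")
# -> 64 codons; 61 sense; 20 amino acids

# Every amino acid with more than one codon has "spares" that could, in
# principle, be reassigned; Robertson et al. liberated three of them.
print("spare codons:", sum(n - 1 for n in per_aa.values()))  # 61 - 20 = 41
print("serine codons:", sorted(c for c, aa in sense.items() if aa == "S"))
```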

I don't have a clue where this will lead, though one immediate result is that E. coli engineered to contain “liberated” codons are resistant to viral infection, as the codons that have been liberated are no longer available for “normal” virus-induced protein synthesis. I'm not suggesting that we start CRISPRing “liberated” codons into human eggs, and indeed I have no clue whether this technology will have any practical applications, but such a clever idea is sure to appeal to the scientific wanderlust of an enormous number of molecular biologists. More to come, I suspect.

I began this journey in Smugglers' Cave. Caves play an important role in the history of philosophy. Plato's allegory of the cave suggested that we are like prisoners trapped in a cave in which we only see flickering shadows of reality on the cave's wall: our senses are usually incapable of perceiving the true forms of nature, and those who escape Plato's Cave are feared and hated by those who remain behind. Bacon, in Novum Organum, was undoubtedly thinking of Plato when describing the “Idols of the Cave,” our individual prejudices that keep us from seeing the world clearly. In a time when science is under constant attack by cable TV pundits, it is easy to believe Plato's sad diagnosis for the human race.

But I like to think that human history is the history, not of Plato's dismal cave, nor its idols, but of Smugglers' Cave, where from the beginning progress has been driven by our ability to invent and create new tools. Those tools have, admittedly, been a mixed blessing for humanity, but our creative ability is what defines us as a species and represents our best hope for the future. Homo faber is always finding new things in Smugglers' Cave.