
Musings of a Cancer Doctor

Wide-ranging views and perspective from George W. Sledge, Jr., MD

Tuesday, March 22, 2022

By George W. Sledge, Jr., MD

We're exhausted, dismayed, and angered over the way our lives have gone off track these past 2 years. COVID-19 never seems to end, and the new normal is distinctly abnormal. Leaving aside the minor inconveniences, the effects on society have been immense. To name a few: economic disruption (job losses due to shutdowns, supply chain issues, stock market flip-flops), social disruption (increased murder rates, increased crimes against women and people of Asian descent), political polarization piled on existing political polarization to the detriment of basic public health measures, and schools closed, leaving 6-year-olds trapped at home with working parents whose interaction with co-workers now happens over Zoom. But more than anything else, a nagging sense that nothing is safe and predictable anymore has led many to live in a permanent state of unease.

All of us in the health care professions know the human toll, the lives lost, the exhausted ICU nurses and ED physicians, the noisy ventilators, and the sad family conferences. And all this followed by the long COVID symptoms that we are just beginning to understand and will wrestle with for years. We all have stories we could share, but we are too tired to care anymore.

Already 2019 seems like a different era, some facsimile of an old Jetsons Saturday morning children's cartoon, when life was dominated by technologic advances and Mother Nature was just one minor special interest group, recognized but usually ignored by a populace enamored of the latest shiny new gadgets. An era when one could go online and get a ticket to Bangkok or Paris, stand in line at a crowded airport, sit next to someone with a mild cough, fly overseas for 12 hours, and think nothing of it.

That was BC: Before COVID. In the After COVID (AC) universe, a trip to the local grocery store can seem fraught with hazard. Fly overseas to a medical conference? Forget it. Get on a cruise ship, one of those floating coronavirus hot boxes? Are you nuts? We no longer have the freedoms we took for granted, and many of the limitations are self-imposed, derived as much from our own psyches as from public health departments or the CDC.

What's worse, we no longer trust our fellow citizens. Is the woman standing behind me at the checkout line, her homemade cloth mask covering only her chin, vaccinated and boosted? Is her child's cough some mild rhinovirus or the latest highly transmissible variant? Will I get in an argument over mask wearing at my child's school? I never used to worry about someone's flu vaccine status—perhaps foolishly—but now I ask every patient I see in clinic about their COVID shots.

And underlying everything else, this question: When will it all end? A once-in-a-century pandemic leaves us with few relevant historical comparisons. The last great pandemic was the 1918-1920 Great Influenza, which killed somewhere in the range of 50-100 million people in a world less populated than today. That pandemic came in four waves: a mild one with characteristics of a regular flu season, followed by the major killing fields of late 1918, followed by another significant outbreak that killed tens of thousands in the U.S. in 1919, followed by a 1920 series of localized but lesser outbreaks. So roughly 3 years of fear and deaths, and then back to normal.

Of course, that flu never actually went away. It mutated into something less dangerous, its progeny continuing until today. It still kills in the thousands per year, though fewer in the era of COVID: masks work. Perhaps influenza's slide from pandemic to endemic is our fate with COVID-19. Or perhaps, and this is devoutly to be yearned for, deliverance will come in the form of a universal COVID vaccine, as some have suggested.

Either of those two outcomes would be welcome. But history offers other, more dismal scenarios. Imagine, if you will, that you are miraculously transported to Stratford-upon-Avon in 1564. If you attend Holy Trinity Church on April 26, you will see the baptism of William Shakespeare. The parish register for that date, probably a few days after his birth, lists him as “Gulielmus filius Johannes Shakspere."

But don't linger there. On July 11, the same parish register informs us “Here begins the plague." After that there is a steady drumbeat of plague-related deaths lasting through the end of the year, including many young children. Shakespeare was lucky.

Follow him as he moves to London, sometime in the late 1580s, and the plague is waiting there as well. In 1592, the plague killed roughly one Londoner in twelve. The city authorities had strict rules shutting down the playhouses and other public spectacles once plague deaths reached 30 per week, so young Shakespeare uses his down time to write his poem Venus and Adonis, which earns him the patronage of the Earl of Southampton. The poem includes these lovely lines: “Long may they kiss each other, for this cure/…To drive infection from the dangerous year/That the star-gazers having writ on death/May say, the plague is banish'd by thy breath." Or, as in the case of COVID, “spread by thy breath."

In 1603, the plague came again in a big way, killing 30,000 people in a city of 200,000, and the playhouses were shut down to prevent spread of the infection. And in 1606, the plague returned, and again Shakespeare and his company faced government-enforced shutdown and loss of income.

Shakespeare's company, originally the Lord Chamberlain's Men during Queen Elizabeth's time, then the King's Men (for they were favorites of King James), left London and toured the provinces when the playhouses were shut down. But if one stayed in London, and the Black Death was known to have afflicted a family member, one's house was boarded up for a month and the house's occupants forbidden to leave. In James Shapiro's wonderful book The Year of Lear, subtitled Shakespeare in 1606, the author makes the case that the playwright's landlady may have died of the plague at the age of 40. If so, as in 1564, Yersinia pestis may have come too close for comfort.

Shakespeare mentions the words “plague" and “pestilence" in his plays, though usually not as the centerpiece of the works. Perhaps the memories are too painful for the citizens of London. But the plague is frequently there in the background, as in real life. Take Romeo and Juliet, for instance. When Romeo's friend Mercutio is stabbed and lies dying, he says “a plague a' both your houses." To modern ears, this is just a witty metaphor, but if you were attending the Globe Theatre then, how would it have sounded? You've seen friends die in days, purulent painful buboes turning lymph nodes into torture chambers. You are scared for your life just thinking about it. Perhaps the phrase would be the equivalent of saying something like, “I hope you both die of cancer." If you lived through 1603, when 15 percent of the population perished in a matter of months, this was no innocent metaphor. It was a curse.

Later in the play, in what will have tragic consequences for the young lovers, a letter is not delivered to Romeo in time. The reason is the plague: “the searchers of the town, /Suspecting that we were both in a house/Where the infectious pestilence did reign, /Sealed up the doors, and would not let us out." Quarantine is a very old idea, and for Shakespeare's audience a living presence.

The plague would not disappear from London until after 1666, when the Great Fire of London, and the subsequent rebuilding of much of the city in brick and tile rather than timber and thatch, made it less friendly to the fleas and rats that transmitted the plague.

Here's the thing that amazes me: the Black Death first arrived in England in 1348. 1603 was 255 years after that, and the bubonic plague had another half century or more to torment the citizens of England. Two hundred fifty-five years ago was 1767; imagine COVID arriving then and coming back every few years, killing only a few in “good" years and 15 percent in the bad years. And having a half century more to go before disappearing for good. What an intensely depressing thought. But that was the world in which Will Shakespeare lived.

If there is any cheer to be derived from the infectious disease history of Shakespeare's times, those long-ago plague years, let it be this: in 1606, when the Globe Theatre shut down for most of a year, Will Shakespeare didn't goof off. Shakespeare scholars believe that he penned Macbeth, Antony and Cleopatra, and King Lear, all within a 12-month period. Has any writer ever had a better year?

Or, perhaps, consider late 1665. This is the last time that the plague will visit England, but its spread has temporarily shut down Cambridge University. The students are sent home until things cool down in 1666 and classes resume. One of the students, young Isaac Newton, goes home to Woolsthorpe Manor. In 1726, near the end of a not-too-shabby scientific career, he shares this anecdote with his biographer: “After dinner, the weather being warm, we went into the garden & drank tea under the shade of some apple trees… he told me, he was just in the same situation, as when formerly, the notion of gravitation came into his mind…occasioned by the fall of an apple, as he sat in a contemplative mood."

Somewhere, in some lab, or perhaps in a crowded apartment, some grad student or postdoc whose life has been put on hold by the pandemic is sitting in a contemplative mood. Newton's apple, I want to believe, has fallen out of the sky and plunked this smart kid on the head, and we just haven't heard about it yet. Perhaps that kid is finding the cure for cancer or developing the universal vaccine that will end COVID-19 once and for all, but in any event creating something that will transform the world. There is, after all, precedent: the greatest playwright in human history and the human race's greatest physicist both allow us to hope that something good will come out of all this pain and suffering.

Wednesday, March 9, 2022

By George W. Sledge, Jr., MD​

Over the holidays I watched Peter Jackson's documentary The Beatles: Get Back, as well as a separate documentary in which the producer Rick Rubin spoke one-on-one with Paul McCartney about his music. The two shows were a revelation for an old Beatles fan.

In my generation, you thought either the Beatles or the Rolling Stones were the greatest rock band of all time. As a tried-and-true Beatles fan, I consider the years from 1964 to 1969 to be the peak period for this genre. Similarly, the breakup of the Beatles in 1969 was, for many in my generation, a tragedy of epic proportions, somewhere in the range of the sinking of the Titanic.

Jackson's long documentary—an exhausting 9 hours—chronicles the near end of the band's great run. The four musicians are tired of each other, and the band is clearly on its last legs. John Lennon is accompanied to the sessions by Yoko Ono, a simultaneously quiet but obtrusive presence. He has largely checked out of the creative process and is (as we now know) a heroin addict. George Harrison sulks, having spent much of the last few years standing in the background while Lennon and McCartney (the greatest songwriting team of all time) have worked their wonders. He is ready to shine, creatively, but feels ignored, to the extent that he walks away from the sessions, temporarily quitting the band. Ringo Starr has the look of someone grieving over a loss that has not yet occurred but soon will. McCartney, meanwhile, is trying to hold everything together, but in a bossy way that irritates the others.

Around them float various hangers-on: producers, bean counters, a ridiculous documentary maker who films their every word (often without their knowledge, using hidden microphones), managers, technicians, girlfriends (it is not just Yoko who makes her presence felt). The musicians are the center of the creative process, but they rely on a not-insignificant infrastructure.

Soon they will compose their last collective music, and The Beatles will depart the stage: now they belong to the ages, one wants to say of their passing. It is a profoundly depressing piece, a sad tale of breakup as something special falls apart under the strain of a decade of continuous travel, composing, and recording. Though the four are still in their twenties, their private lives have begun to diverge: John with Yoko Ono, Paul with Linda Eastman (soon to be McCartney). They are ready to move on. Perhaps inevitable, but maybe not.

Watching them filled me with sadness, and not just because of the band's breakup. We know how all this turned out, and what happened to them in the following decades. Lennon will be murdered outside The Dakota in Manhattan, done in by a mediocrity with a handgun. Harrison, who smokes throughout the sessions, will first develop head and neck cancer, survive that, be stabbed by a deranged fan, survive that, and then succumb to non-small cell lung cancer, developing brain metastasis. Linda McCartney, the love of Paul's life, and a talented photographer in her own right, will develop an aggressive breast cancer, receive both standard and high-dose chemotherapy in the 1990s, and then develop liver metastases. She and Paul will go horseback riding 4 days before she dies at the age of 56. Paul will then go to a very dark place for several years.

The oncologist in me always wonders whether things would have turned out differently had Harrison and Linda McCartney developed their cancers at a later point in time. Would Harrison's cigarette-induced cancer have responded to a checkpoint inhibitor or a third-generation EGFR inhibitor? Would McCartney's aggressive cancer have responded to a novel kinase inhibitor or monoclonal antibody or antibody-drug conjugate, the transplant bludgeon replaced with a precision medicine stiletto? Might it have been cured in the adjuvant setting? Modern medicine is full of “what ifs"; such is the pace of modern discovery. So watching Get Back fills me with anticipatory sadness, both for the lost music and for the premature deaths we know are coming.

And yet, and yet… If this is the Beatles at their worst, at their low point, they are still something quite miraculous, conjuring great music out of thin air. Their creative process generally starts with one of them pitching, not a song, but part of a song. They spend hours bouncing lyrics and tunes back and forth, often giving up on it only to come back later, finally finishing triumphantly with something that barely resembles the original idea. Get Back, the documentary's title, is also a recurring work-in-progress song throughout the film, right up until it is presented to the world in a rooftop concert. The song itself is interrupted by the London police, called by uptight neighbors on Savile Row who fail to appreciate history in the making.

Watching them create is, as I said earlier, a revelation. The parallels to the creative process in the sciences are obvious. It goes something like this: the act of creation never occurs in a vacuum or a desert. One puts in long periods of hard work learning the basics of the field. In the Beatles' case, this was thousands of hours in bars in Hamburg, Germany, learning their craft. It is the basis for what comes after; you only know what the right questions are once you have spent enough time on the stage.

While history is replete with brilliant individual scientists, my experience in clinical research has been that the best ideas come from the prolonged interaction of talented individuals with overlapping skill sets, the equivalent of having keyboard, guitar, and drum players riffing on a particular musical theme, improvising, altering until they find just the right note, the right word or phrase. The musicians or scientists in turn are supported by a well-developed infrastructure which is, like the Beatles' Abbey Road recording studio, agnostic to the ideas being studied yet crucial to their success: biostatisticians, research nurses, data managers, and the like. And the patients, of course, the patients who lay their lives on the line waiting for us to get smarter.

And sometimes—this happens in the middle of the documentary—one goes outside the original group to add some crucial yet thoroughly unanticipated element to the mix. In the documentary, this is the keyboardist Billy Preston, an old friend of the Beatles from their Hamburg days, who drops by to say hello and ends up being conscripted into the act of creation. The joyous energy he brings to the lethargic group transforms their creative process.

There is also a wonderful moment in the documentary when one of the band members wonders what it would sound like to bang on an anvil at a particular point in a song. The anvil mysteriously appears, conjured out of the ether by their road manager Mal Evans, and the sound Ringo Starr makes with it now graces the Beatles' Maxwell's Silver Hammer. Sometimes, the small tweaks matter.

And finally, of course, there is the unexpected. The creative process is essentially messy. It is not something you can apply process engineering to and hope for anything other than mediocrity. The old Six Sigma idea—the rigorous removal of errors from a process, leaving an enterprise clean and pristine—is antithetical to the creative process. In Paul McCartney's interview with Rick Rubin, he said something that stuck with me: “I specialize in audacious mistakes."

I suspect that in the clinical enterprise, as opposed to the laboratory or (sometimes) in the best clinical research, we make far too few audacious mistakes. One can understand why: audacious mistakes attract lawsuits. They fail more often than they succeed. But they are crucial for creativity, and perhaps this is part of why the American health care system is, ultimately, a failure, a disastrous mess providing expensive yet ineffective care. We don't experiment enough, we don't dream enough, we don't audaciously mistake our way to the actually valuable. But we are great at wringing incremental gains out of a system created by a dysfunctional health insurance apparatus designed to maximize enterprise profits for every stakeholder except the patient.

Anyways, that's what I think. But feel free to ignore anything I've said if you are more of a Stones than a Beatles fan. Or if you don't know who they actually were. In which case, I pity you.

Thursday, December 9, 2021

By George W. Sledge, Jr., MD

Sometime around 120,000 years ago, an early member of our species fashioned a set of what archeologists think were clothes-making tools. The tools, fashioned from bone, were recently excavated from the charmingly named Smugglers' Cave in Morocco. They are, as far as we can tell, the first identifiable tools specifically made to produce clothes.

Things moved at a slower pace in those days, so we can safely say that had Popular Science or Scientific American been around this would have qualified as the most important invention of the millennium. If your ultimate goal as a species was to expand from your African heartland to the Eurasian land mass and beyond, then having a clothes-making toolkit certainly qualified as a major achievement.

Of course, Popular Science wasn't around, nor writing, nor any oral tradition we might rely on, so we can't identify the inventor. But we can identify with that nameless person, because making new things, useful things, is what we do as a species. One suggested alternative name for Homo sapiens is Homo faber, Latin for “tool man," or if you prefer, “man the maker." Homo faber was sitting by a fire in Smugglers' Cave. Though it may well have been “tool woman," not “tool man."

I have a fascination with technology. I'm not particularly handy, but tools fascinate me, as does their history. Take the humble screwdriver. The principle of the screw itself has been around a long time, first described (and perhaps invented) by Archimedes in the third century BC, at which time it was used as a screw pump to lift water from the Nile. From ancient Greco-Roman times through to around the 15th century, the principle of the screw was used primarily in olive and grape presses. Perhaps Archimedes, like many basic scientists, wasn't all that great at translating things from brilliant concept to public utility. No VC firms were around in 250 BC. The screw was a great idea stuck in mediocrity, a failure of imagination and inadequate development.

Screws as fasteners didn't come into their own until late in the Medieval era, when they started being used to put pieces of armor together; basically, they were military technology. We get some sense of this in Shakespeare's Macbeth, where Lady Macbeth tells her wavering husband to “screw your courage to the sticking place/And we'll not fail." It's an odd-sounding phrase to modern ears. Literary scholars suggest that the imagery refers to a crossbow, where a string is pulled taut by turning a wooden screw to its fullest extent, hence the “sticking place," before firing the crossbow bolt.

Screwdrivers were invented somewhere in Germany or France. Their first recorded appearance is in the Housebook of Wolfegg Castle, an illustrated DIY handbook written sometime between 1475 and 1490. As with the proto-technologists of Smugglers' Cave, we don't know the inventor.

Early screwdrivers were slotted, or flat-blade. We still have them in our toolkits. Both screws and screwdrivers were little used until the Industrial Revolution, when Job and William Wyatt of Staffordshire, England, created a screw-making machine; by 1780 their 30-worker shop was making 16,000 screws a day. Screws were suddenly cheap and ubiquitous.

The Phillips screw, and the Phillips screwdriver, also to be found in my toolkit, weren't invented until the 1930s. The Phillips screw, named after the American industrialist Henry F. Phillips, is now the most popular one on the planet, widely used in a myriad of industrial processes and in every home. It is the perfect industrial invention primarily because the screwdriver self-centers on the screw, something the flat-blade screwdriver could not do, and therefore does not slide off the screw on a busy assembly line. Phillips convinced General Motors to use the screw in Cadillacs, and the rest, as they say, is history. At least we know the name of the inventor. It wasn't Phillips, who bought the design from an engineer named John P. Thompson.

The names don't matter much, of course, but the lowly, homely—literally homely—screwdriver is a pretty good template for toolmaking by Homo faber. It began with a brilliant scientist conceptualizing (or describing or inventing: we're not sure) the screw, but for something like 1,800 years no one could think of any use for it beyond crushing grapes. Its use as a fastener required some unknown tinkerers, one of whom made the first screwdriver.

Even then, it was another three centuries before anyone figured out how to make cheap screws, and yet another century before someone made the intellectual leap to the modern cruciform screw-head and its cognate screwdriver, perhaps its ultimate practical destination. Our tools evolve, often in response to our needs, definitely as part of a larger body of knowledge and experience, through progressive experimentation and tinkering, until they become so commonplace that we never give them a moment's thought.

One of my favorite books is Francis Bacon's Novum Organum, first published in 1620. It outlines the agenda for the modern scientific revolution. I sometimes think that, were every other scientific textbook to disappear, and a mind-wipe cleared the scientific method from our collective consciousness, we would be fine if Novum Organum remained. I'm always surprised how few scientists have ever heard of it. The men who created the Royal Society considered Bacon's work a foundation stone of modern science.

Bacon's work is full of pithy aphorisms. “Knowledge is power" is the best-known, though I have always liked “Nature, to be commanded, must be obeyed." But Novum Organum is important primarily because it champions observation and experience over pure reasoning, particularly observation used to test a hypothesis. Central to this testing is our use of tools. Again, quoting Novum Organum: “Neither the naked hand nor the understanding left to itself can effect much. It is by instruments and helps that the work is done, which are as much wanted for the understanding as for the hand."

As much wanted for the understanding as for the hand. I love that phrase. When I think of what has happened to the biologic sciences during my career, and in consequence to the medical sciences, it is the new “instruments and helps" that have mattered the most. Technology is science in action. For better or worse, and sometimes both.

I'm a breast cancer doctor. If I were asked to point to the greatest single advance in my field in the past 30 years, I could point to trastuzumab for HER2-positive disease, the first in a long line of targeted therapies that have transformed the disease. But I would be wrong. The real advance was a tool: monoclonal antibodies made in hybridomas. I was a resident when Kohler and Milstein created the first hybridoma, and (the innocence of those days) refused to patent the idea. By the time I became a fellow, in the early 1980s, my mentors were already discussing all the current uses: as a tool for pathologists and laboratory scientists, and as a therapeutic both in its naked form and as an antibody-drug conjugate. It took a while, but the future was implicit in the act of creation. It was the tool that created all the other tools that made progress possible.

And hybridomas were just the beginning. Polymerase chain reaction (PCR) technology appropriately won its inventor a Nobel prize, as did CRISPR/Cas9 gene editing its inventors. Neither is the equal of the decoding of the double helix in terms of basic scientific discoveries. PCR was a fairly minor riff on what was already known in terms of the biology of life, a cobbling together of existing techniques. CRISPR, let's face it, is important to bacteria but not very important in human biology. Its value, like that of PCR, is as a tool. And what great, transformative tools both are, as much wanted for the understanding as for the hand.

What will be the next great tool in our scientific toolkit? CRISPR was an immediate hit, as were monoclonals. Other technologies may represent slow burns, perhaps not requiring the same time scale as the evolution of the screw/screwdriver dyad, but not overnight successes either. One current technique that fascinates me is the recently described re-engineering of the genetic code to create novel types of proteins. For those of my readers who have not seen this, read either the original article by Robertson and colleagues in Science (2021; 372:1057-62) or the elegant discussion by Liu in the New England Journal of Medicine (2021; 385:1045-49).

The basic idea is simple yet fascinating. As we were all taught in high school biology, the 20 amino acid building blocks are the progeny of 3-nucleotide codons. There are 61 “sense" codons, with multiple codons coding for the same amino acid. This redundancy dates to some early point in evolutionary history but isn't really necessary for life. Robertson and colleagues “liberated" three of these redundant codons and assigned a new, unnatural amino acid to each. This in turn creates novel proteins that have never existed in the history of life. How cool is that?
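
For readers who like to see the arithmetic, here is a minimal Python sketch of that redundancy, using the standard genetic code table; the particular codon reassigned (TCG) and the placeholder amino acid "X" are illustrative choices of mine, not details taken from the paper.

```python
from itertools import product
from collections import Counter

# Standard genetic code (NCBI translation table 1), codons listed in TCAG order.
BASES = "TCAG"
AMINO_ACIDS = ("FFLLSSSSYY**CC*W" "LLLLPPPPHHQQRRRR"
               "IIIMTTTTNNKKSSRR" "VVVVAAAADDEEGGGG")
CODON_TABLE = {"".join(codon): aa
               for codon, aa in zip(product(BASES, repeat=3), AMINO_ACIDS)}

# 61 "sense" codons encode 20 amino acids; the 3 marked "*" are stops.
sense = {codon: aa for codon, aa in CODON_TABLE.items() if aa != "*"}
redundancy = Counter(sense.values())
print(len(sense), "sense codons encode", len(redundancy), "amino acids")
print("Codons per amino acid:", dict(redundancy))  # serine (S), for example, has 6

# "Liberating" a redundant codon: reassign one synonymous codon (TCG here,
# purely for illustration) to a hypothetical noncanonical amino acid "X".
# Serine remains encodable by its five remaining codons.
recoded = dict(sense)
recoded["TCG"] = "X"
print("Serine codons after reassignment:",
      sorted(c for c, aa in recoded.items() if aa == "S"))
```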

I don't have a clue where this will lead, though one immediate result is that E. coli engineered to contain “liberated" codons are resistant to viral infection, as the codons that have been liberated are no longer available for “normal" virus-induced protein synthesis. I'm not suggesting that we start CRISPRing “liberated" codons into human eggs, and indeed I have no clue whether this technology will have any practical applications, but such a clever idea is sure to appeal to the scientific wanderlust of an enormous number of molecular biologists. More to come, I suspect.

I began this journey in Smugglers' Cave. Caves play an important role in the history of philosophy. Plato's allegory of the cave suggested that we are like prisoners trapped in a cave in which we only see flickering shadows of reality on the cave's wall: our senses are usually incapable of perceiving the true forms of nature, and those who escape Plato's Cave are feared and hated by those who remain behind. Bacon, in Novum Organum, was undoubtedly thinking of Plato when describing “Idols of the Cave," our individual prejudices that keep us from seeing the world clearly. In a time where science is under constant attack by cable TV pundits, it is easy to believe Plato's sad diagnosis for the human race.

But I like to think that human history is the history, not of Plato's dismal cave, nor its idols, but of Smugglers' Cave, where from the beginning progress has been driven by our ability to invent and create new tools. Those tools have, admittedly, been a mixed blessing for humanity, but our creative ability is what defines us as a species and represents our best hope for the future. Homo faber is always finding new things in Smugglers' Cave. 

Wednesday, September 22, 2021

By George W. Sledge, Jr., MD​

There is a concept in the material sciences called either “fatigue limit" or “endurance limit"—the terms are used interchangeably—that we who care for cancer patients should adopt, or at least explore. Imagine holding a copper wire in your hands, and then bending it back and forth, cyclically, until the wire breaks. This process is called “fatigue failure." The fatigue limit (or endurance limit), in turn, is the stress level below which fatigue failure does not occur. A related term, “fatigue life," is the number of stress cycles of a specified character that our copper wire sustains before failure occurs. This is determined by the properties of the material (copper fatigues more readily than titanium), the amplitude of the stress, internal defects, temperature, corrosion, and several other qualities.
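
If you want to see how engineers put these definitions to work, here is a minimal Python sketch of the classic S-N (stress amplitude versus cycles-to-failure) relationship, often summarized by Basquin's relation; the material constants below are hypothetical placeholders I chose for illustration, not measured values for copper, titanium, or any real alloy.

```python
# A toy S-N curve based on Basquin's relation:
#   stress_amplitude = sf * (2 * N)**b, solved here for N (cycles to failure).
# All constants are hypothetical placeholders, not data for any real material.
def fatigue_life(stress_amplitude_mpa: float,
                 endurance_limit_mpa: float = 90.0,  # hypothetical endurance limit
                 sf_mpa: float = 900.0,              # hypothetical fatigue strength coefficient
                 b: float = -0.1) -> float:          # hypothetical fatigue strength exponent
    """Estimated stress cycles to failure; treated as infinite below the limit."""
    if stress_amplitude_mpa <= endurance_limit_mpa:
        return float("inf")  # below the fatigue (endurance) limit: no fatigue failure
    return 0.5 * (stress_amplitude_mpa / sf_mpa) ** (1.0 / b)

for stress in (80, 120, 200, 300):  # cyclical stress amplitude, MPa
    print(f"{stress:>3} MPa -> ~{fatigue_life(stress):,.0f} cycles")
```

Bend the wire harder or more often and it fails sooner; keep the stress below the limit and, at least in this idealized model, it never does.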

This is a big deal for engineers and, therefore, for the rest of us. Fatigue failure caused by cyclical stress is, we are told, responsible for nine out of every 10 mechanical failures. There is an American Society for Testing and Materials (ASTM) that devotes its considerable efforts to determining (among other things) the limits of endurance. I visited the ASTM website, where the search term “endurance limits" rewards you with 1,017 results, starting with “STP1372 Fatigue Crack Growth Thresholds, Endurance Limits, and Design." The term “fatigue limits" gives you over 3,000 results. Material scientists work to improve product life and safety by minimizing fatigue. That planes do not fall out of the sky as often as they used to is a measure of their success.

Google “limits of human endurance" and a popular result is a 2019 article by Caitlin Thurber and colleagues, published in Science Advances. It is not at all what you might expect when you hear “limits of human endurance." The authors are looking for the physiologic limits to maximum energy expenditure. It turns out that we can expend a great deal of energy for a short time, but if the time frame is long, the amount of energy one can expend is limited.

They explored this in a creative way, beginning by measuring the basal metabolic rates (BMR) of endurance athletes. Athletes can expend up to 9.5 times their basal metabolic rate for a half-day triathlon. But if the same athletes take part in the 140-day transcontinental Race Across the USA, which is essentially a marathon (42.2 km) per day, 6 days per week, for 14-20 weeks, maximum sustained metabolic scope (the limit of human endurance) drops to ~2.5 times the BMR. Put mathematically, maximum sustained metabolic scope is not a static value, but instead follows a strong negative logarithmic relation with event duration. Beyond the 2.5 BMR limit, we increasingly draw down our body's energy stores.
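
To get a feel for what that negative logarithmic relation implies, here is a toy Python fit anchored only to the two figures quoted above (roughly 9.5 times BMR for a half-day event, ~2.5 times BMR over the 140-day race); the coefficients are illustrative, not the ones reported by the authors.

```python
import math

# Toy log-linear model of maximum sustained metabolic scope vs. event duration,
# fit through the two data points mentioned in the text. Illustrative only.
(d1, s1), (d2, s2) = (0.5, 9.5), (140.0, 2.5)  # (duration in days, multiple of BMR)
b = (s1 - s2) / (math.log10(d2) - math.log10(d1))
a = s1 + b * math.log10(d1)

def sustained_scope(days: float) -> float:
    """Illustrative maximum sustained metabolic scope (multiples of BMR)."""
    return a - b * math.log10(days)

for days in (1, 7, 30, 140):
    print(f"{days:>3} days -> ~{sustained_scope(days):.1f} x BMR")
```

The longer the event, the lower the sustainable multiple of BMR.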

One can wear out acutely as well as chronically. Exceeding the endurance limit is intrinsic to marathons, as witnessed by the event's (possibly apocryphal) origin story. In 490 BC, Athenian soldiers bested Persian invaders at the battle of Marathon, and sent their best runner, Pheidippides, back home to deliver the news. Pheidippides had just run 150 miles from Athens to Sparta in 2 days, carrying the Athenians' request for aid and then had run back with their response (“No, not yet"). Now he ran 25 miles from the battlefield at Marathon to the city, telling the anxious city elders “nikomen" (“we win") before collapsing and dying. If you wear Nike running shoes or watch the Olympic marathon, remember Pheidippides, but don't emulate him: stop short of your endurance limit.

Interestingly, the BMR endurance limit equation applies to pregnant and lactating women as well. Pregnant women are, in a very real metabolic sense, long distance runners, their maximum sustained metabolic scope peaking at around 2.2 times BMR. This in turn probably limits the size of a newborn child. The authors suggest that these limits of human endurance, both for runners and pregnant women, are due to an alimentary limit: our guts just can't absorb enough calories per day to get beyond the BMR limit. The oncologist in me wonders whether an actively metabolizing cancer beyond a certain volume—the sort that lights up a PET scan—encroaches on that limit. I'm sure someone has studied it, though I've been unable to find a paper that clearly relates SUV values to glucose consumption for a particular tumor volume.

So, a copper wire has an endurance limit, and the human body does as well, and both are related to the level of the stress applied and the duration of that cyclical stress. But when I think of the limits of endurance I tend to think not in terms of physical or metabolic limits, but in terms of psychological ones. How much can a human being endure?

Let's travel back in time a hundred years or so to World War I, the first true modern war. After its opening moves in 1914, the war settled down into brutal trench warfare across northern France, where miserable men spent weeks in muddy trenches, constantly bombarded with massive shells thrown up by the opposing forces' artillery. Those armies experienced something previously unseen. Nineteenth century battles such as Waterloo or Borodino or Antietam were won or lost in a single day, and even the longest, like Gettysburg, might only last 3 days. The requirement for courage under fire was a short-term thing. But on the Western front, where the lines barely changed over 4 years of combat, men suffered from what was then called shell shock. Today we might lump it in with post-traumatic stress disorder.

The classic work on this is Lord Moran's The Anatomy of Courage. Moran (created a lord during World War II for his services to Great Britain) was later to become Winston Churchill's personal physician and served as a medical school dean and as president of the Royal College of Physicians; but as a young regimental surgeon in World War I, he cared for numerous shell shock victims. Early in the war, he was asked to examine a listless sergeant who had just come back from the trenches. Moran writes:

“I found him staring into the fire. He had not shaved and his trousers were half open. I could get nothing out of him…he did not appear to be ill. We agreed to let him rest, to let him stay in his billet until his battalion came out of the trenches. But next day when everyone had gone up to the line, he blew his head off. I thought little of it at the time, it seemed a silly thing to do."

Subsequently Moran, like other physicians who discussed shell shock, came to see it as a matter of psychological attrition rather than the result of a single causal event. To quote him, “The story of how courage was spent in France is a picture of sensitive men using up their willpower under discouraging circumstances while one by one their moral props were knocked down. The call on the bank might only be the daily strain of the trenches or it might be the sudden draft that threatened to close the account."

Moran visited the “sudden draft that threatened to close the account" when he experienced bombardment himself. “My mind became a complete blank. I had a feeling I had suffered physical hurt though I was not touched, the will to do the right thing was for the moment stunned. I was dazed and at the mercy of those instincts which I had until then been able to fight." Under the repetitive stress of such shelling, he said, “men wear out like clothes."

Moran spoke in terms of courage, but courage may not be the right word for what he chronicles. What he described was endurance, as human copper wires were subjected to cyclical stresses, surpassed their endurance limit, and broke.

Which brings me, inevitably, to my clinic. I've discussed material fatigue, and metabolic limits of endurance, and shell shock, and they all make me think about my patients with metastatic triple-negative breast cancer. This disease has all the attributes of those previous things, and more.

When I sit down with a patient with metastatic triple-negative breast cancer, I tell her that the nature of her disease is that left untreated it will grow, and grow until it consumes some vital organ, ending her life. The only way to prolong her life, for cure is generally not in the cards, is with repetitive cycles of toxic chemotherapy agents, sometimes augmented with repetitive cycles of minimally less toxic checkpoint inhibitors or an antibody-drug conjugate.

So it happens that my patient, like some World War I soldier, faces a life in the trenches, being continuously shelled. I will be asked, early on, “how many treatments will I receive?" And my answer is always the same: that depends, depends on the effectiveness of the therapy and on your ability to tolerate it. But this is, I say, a marathon, not a sprint. You need to maintain your physical and emotional stores of energy. We talk about drug holidays when things get to be too much, about altering dose and schedule of our therapeutics in response to side effects, about how it is important to listen to your body, to adjust in response to circumstance. One has to operate in terms of what Leo Tolstoy suggested in War and Peace: “A man on a thousand mile walk has to forget his goal and say to himself every morning, 'Today I'm going to cover twenty-five miles and then rest up and sleep.'"

We also talk about goals of care and end-of-life strategies. I was not at all good at this early in my career, but have come to recognize it as one of the more important things I do as an oncologist, and I've tried to do it better, though I still fail all too often, and fail my patients as a result.

What I am miserable at, by analogy with the material sciences, is predicting the endurance limit and fatigue life of my patients with advanced disease. The endurance limit, you will recall, is the stress level below which fatigue failure does not occur. The fatigue life is the number of stress cycles of a specified character that one can sustain before things fall apart. The material scientists would say “before structural failure," but of course it is the treatment that fails, not the patient.

I wish I had some way of gauging this ahead of time, of being able to know what the endurance limit of therapy was for an individual. But just as—to invoke the wisdom of the material scientists—fatigue life is determined by the properties of the material, the amplitude of the stress, internal defects, temperature, and corrosion, so do many individual physical, mental, spiritual, and social properties preceding metastasis help determine the limits of human endurance. My patients also deal with the financial toxicity of treatment, family stresses, fear of impending death, uncertainty over the timing of that event, and the million other small things in daily life that drag us down. Your 2-year-old does not become any less needy because you harbor a liver metastasis, your unhappy marriage is not improved by the stress, and the bills still need to be paid on time. Sometimes it can be too much.

And all these are layered on top of the amplitude and repetitive nature (or in our usage dose, schedule, and number of cycles) of capecitabine or paclitaxel or eribulin or pembrolizumab or sacituzumab govitecan. If, as I have suggested, a large, rapidly dividing cancer shoves one up above that 2.5 times BMR limit of human endurance, especially in the context of the poor food intake and cachexia of a patient with metastatic disease, does the human body start breaking down?

Lord Moran thought that courage was a bank account that you could only draw on for so long. I am astonished, even terrified, by my patients' courage, even as I see their accounts dwindle. Clinical trial reports measure neither a patient's resilience nor her exhaustion of those resources—physical, mental, spiritual—that make her struggle possible against such daunting odds. But we all measure those resources, roughly so, often inaccurately, when deciding whether that next treatment holds value.

I have yet to encounter the algorithm that integrates all these factors, but the limits of endurance are as real for my patient as for that thin copper wire bent repetitively, and regularly come into play before I run out of FDA-approved drugs. There is a science to this that we do not yet understand, and a challenge we have yet to meet and beat.

Wednesday, August 11, 2021

By George W. Sledge, Jr., MD​

Someone will recommend a book, or I'll read a good review, and I'll start looking for it. I am a voracious reader and would have filed for bankruptcy years ago if I had to pay for every book I read. I used to go to the library, but in the era of COVID-19 I turned to online apps provided by my local libraries—Libby, Hoopla, and Axis 360. I frequently end up frustrated. My local library's version of Libby, for instance, has 32,000 eBooks and audiobooks, but apparently never the one I want right now.

How many books are there? Put another way, for the online resources to mimic Borges' universal library, how much would their offerings need to expand? Google tells me that the total number of published books since Gutenberg is around 130 million, and over 2 million per year are added to the listings worldwide. My library's holdings are several orders of magnitude off these numbers.

Google, of course, famously wanted to put all books online, or at least all the older ones, before running into the wrath of the publishers. The company is a major proponent of the oft-expressed idea that “data wants to be free" and is expert at monetizing that data. As recently as this week, as I write this, Google was fined $593 million in France for failing to negotiate in good faith with publishers over profit-sharing for the news publishers provide. Previously, the company had been in a long battle over copyright as a result of scanning older books without the permission of publishers or authors.

These battles encapsulate a basic modern tension. It is (at least I believe it is) in the public good to acquire, collate, distribute, and analyze large datasets, including books. Yet these datasets were originally someone's intellectual property, paid for with sweat or money, and the original owners (or original acquirers, not necessarily the same thing) rarely want to give that property away for free so that others earn billions of dollars off their hard work. Nor is every use to which this data is put equally virtuous.

A previous generation would have considered it odd to think of Anna Karenina or the Sunday Times as digital assets to be monetized, yet that is exactly what they are, and we all got comfortable with that fact as the world migrated online. That we still have a New York Times is largely due to this phenomenon, just as the inability of local newspapers to monetize their digital assets has contributed to their demise. And, one might add, to the demise of American democracy.

I used to think of this as falling into that broad category of “other people's problems." Medicine seemed to be relatively recalcitrant to, or at least significantly isolated from, the monetization of digital data. Fortress Healthcare had several ramparts around it: HIPAA regulations, lack of interoperability between health systems' electronic health records, the virtual impenetrability of those records, their sheer volume, and the absolute meaninglessness of much of the data being collected. Even a computer doing machine learning will get bored analyzing CBCs and chem profiles from my clinic, and the best natural language processing programs will be stumped by the grammatical excesses of my colleagues, though not of course by my dictations. And let's not even talk about the inherent messiness of complex biologic systems, both at an individual and collective level.

Early efforts in this field were largely unsuccessful, with IBM pouring billions into a failed attempt to rationalize cancer care through its Watson program, mostly succeeding in irritating the trustees of Memorial and MD Anderson. Watson's computer engineers, like most engineers believing the universe to be an ultimately rational place, were disabused of that peculiar notion by the American health care system in general and the field of oncology in particular. This cast somewhat of a pall on the exercise.

But this is changing rapidly, with an efflorescence of attempts to extract monetizable data from the health care system, and those rapidly increasing in scope as the big money piles in. Let me share an incomplete listing to give you some flavor of what is happening.

Since I've already mentioned Google, a good place to start is Google Cloud's interaction with HCA Healthcare, a Nashville-based giant with, to quote the press release announcing the collaboration, 32 million annual patient encounters, 93,000 nurses, and 47,000 active and affiliated physicians. The partnership will make use of Google's health care analytics, including BigQuery, described as “a planetary scale database," to make sense of this huge dataset. One wonders about the need for planetary scale, since HCA is pretty much an American operation, but I'm sure Google has grand plans for the entire solar system, should HCA ever open a clinic on Mars. Google Cloud has also partnered with Mayo Clinic. The press releases wallow in corporate clichés such as “transforming healthcare," a phrase which always makes me nervous when applied to my clinic, along with (per Google Cloud's CEO) “being an accelerant for innovation." Accelerants, you will recall, are used by arsonists when they want to burn down your house.

Somewhat closer to home, and easier for a poor oncologist to understand, are Roche's purchases of Flatiron Health and Foundation Medicine. Flatiron's goal, according to its website, is to solve the problem caused by real-world clinical data that is “unstructured and stored across thousands of disconnected community clinics, medical centers and hospitals." Its website tells us that Flatiron works with over 280 cancer centers and seven “major" academic medical centers (because why would they work with a minor academic medical center?), as well as more than 20 drug developers, and has access to almost 3 million patient records. And not just access to the records: Flatiron sells OncoEMR, its cancer-specific electronic health records platform, to practices, then mines data from the platform.

If Flatiron analyzes other people's data, Foundation Medicine creates genomic data. I certainly use their services, both for clinical and research purposes, and I am not alone. Foundation has analyzed over a half-million tumor samples from cancer patients, providing useful—or occasionally somewhat useful—genomic data to help guide care. That's what most oncologists consider Foundation's raison d'être. But its website touts its data analytics and describes how it uses them to partner with more than 50 biopharma companies. My library's Libby app has 32,000 books available, which is quite a bit of collected knowledge. Foundation's library contains the book of life for a half-million souls, which is in its own way also quite a bit of knowledge, or at least the beginnings of knowledge. You can do a lot of number-crunching with a half-million cancer genomes.

Roche paid $2.4 billion for Foundation Medicine and $1.9 billion for Flatiron Health. Never underestimate the Swiss when it comes to buying in the health care space, as anyone who remembers Roche's purchase of Genentech will attest. But there is something fascinating about a Big Pharma company spending over $4 billion to acquire what are, in essence, two premiere Big Data companies. My thoughts are that Roche certainly believes it can monetize the data developed by its purchases, as well as using the data to generate novel therapeutic agents for cancer patients. And tying together lab scientists at Genentech with large genomic and clinical databases certainly has its attractions. I don't have a clue what the return on investment for Flatiron and Foundation will amount to, but a single drug development hit would wipe out those sunk costs in a hurry.

Of course, a large health care company can directly purchase a practice network and mine it for data. The classic example of this is McKesson, which bought the US Oncology Network back in 2010. McKesson describes itself as “the oldest and largest healthcare company in the nation" and is a behemoth with $231.1 billion in revenues last year. One of McKesson's divisions is Ontada, launched last year with a press release that—surprise—uses phrases like “transforming the fight against cancer." But it's a data analytics company, using information garnered from patients treated in the US Oncology Network to “help life sciences companies leverage evidence-based data and insights to accelerate innovation." Yet another accelerant.

And then there is Microsoft Cloud for Healthcare, which has partnerships with the Walgreens Boots Alliance, Providence Health System, Humana, and Novartis. Among other things they are going “to transform the exam room by deploying ambient clinical intelligence solutions that capture, with patient consent, interactions between clinicians and patients so that clinical documentation writes itself." OK, that admittedly sounds like an improvement on EPIC documentation, but what else will those ambient clinical intelligence solutions do? Dictations first, but eventually something more intrusive, I'd bet, and definitely something that involves coding for billing purposes. I could go on and on, but I'm getting tired of transforming accelerants.

We mustn't imagine that there is anything inherently nefarious in any of these arrangements. If data analytics allows one to generate new agents that tackle dangerous cancers, prolonging and improving the lives of my patients, that's for the collective benefit of society. It's a better use of Big Data analytics than, say, their role in the 2016 election. Once large, connected medical databases existed, someone inevitably was going to dissect them with the intent of extracting further knowledge and revenue from the health care system. That's the modern digital age, the age of artificial intelligence and machine learning, and there's no escaping it.

But all this does raise issues. The data being mined is coming from patients, and the patients receive no direct benefit from the use of their data. Whether they receive indirect benefits depends on the purpose to which their data is used. But it is a reasonable guess that when a new drug comes on the market based on mining genomic data, there will be no patient discount for having contributed your genome and clinical outcomes. Nor are you ever really asked how your data should be used, or whether you wish to share it with some corporate entity. At most you might be asked to sign some form in clinic giving permission for unnamed others to do whatever they want with the most personal aspects of your being, albeit with appropriate HIPAA protections.

If those protections still mean something, which is open to question, the deidentified data used by the data analysts is supposed to be safe from a privacy standpoint: just subtract a few crucial facts, and no one should be able to identify you. Maybe, maybe not. Luc Rocher and colleagues, in a 2019 Nature Communications paper, demonstrated that “99.98% of Americans would be correctly re-identified in any dataset using 15 demographic attributes." This was true even for what they term “heavily incomplete" datasets. Privacy is a relative term, ultimately depending on the goodwill and care of data analysts I have never met, many of whom seem to reside in St. Petersburg.

What impresses is how rapidly all this is happening, and happening essentially without oversight or public input. Will tomorrow's oncology clinic be radically different from today's? Will an ambient intelligence (whatever that is) listen in, creating my notes through some synthesis of my interaction with the patient and the existing electronic health record? Will that patient receive a new blockbuster drug resulting from a prior patient's genomic sequencing? Will my clinic be run more efficiently and safely as a result of data analytics? I can imagine all these things, or none of them, but I suspect I'll find out soon. In the meantime, I picked up some good books at the local bookstore today. I think I'll read them now.