Musings of a Cancer Doctor

Wide-ranging views and perspective from George W. Sledge, Jr., MD

Friday, September 21, 2018

My wife and I were walking down the street recently when we came upon an old black Ford, lovingly kept up by its owner. I recognized it immediately as an early 1950s model of the sort my parents owned when I was a child. The Antique Automobile Club of America defines "antique" as a car over 25 years of age, so I guess this was an antique, but for a moment I was transported back to when it was new. And I was new, and not the antique I am now.

My first conscious memory was of waking up in the back seat of our black family Ford. I remember the car as having a high ceiling, though that memory may only reflect my own short stature. Cars in those days had no seatbelts, and there were no car seats for children, so I must have been sleeping on the back seat. I remember waking up to my mother and father talking to each other up front, and I remember the car stopping and our getting out, and I remember literally nothing from before that experience. Even at the time, I remembered remembering nothing before that moment.

When was your first conscious memory? I must have been about 3 years old. Studies of first memory give ranges between 1 and 4 years old, with a plurality occurring at age 3, so I was distinctly average. The folks who study consciousness for a living say that "first conscious memory" is not the same thing as "first evidence of consciousness." That boundary has been pushed back to 2 months of age through the use of functional MRI studies. Still, I like to think of waking up in the back seat of a black Ford sedan as the moment that my consciousness, my own sense of myself as an independent individual, blinked on.

There are numerous—too numerous—definitions of consciousness, though I like the one used by Wikipedia: "the state or quality of awareness, or, of being aware of an external object or something within oneself." Consciousness is an exceptionally contentious subject, argued over endlessly by philosophers, biologists, neurologists, physicists (quantum theory—don't even ask), and just about everyone else at some time or other.

The philosophers have been thinking about this the longest and have the most interesting arguments. Descartes based his philosophy on what he viewed as the solid rock of consciousness: cogito ergo sum, I think therefore I am. At the other end of the spectrum, a modern philosopher, the ever-entertaining Daniel Dennett, belittles what he calls the "Cartesian theater" of consciousness, viewing it as little more than an epiphenomenon, a largely irrelevant by-product of complex neural wiring in higher organisms.

As has happened so often in science, philosophy has given way to measurement. Whereas 20 years ago the science of consciousness was essentially nonexistent, at least in the sense understood by working scientists, today it is a robust field of discovery, propelled by novel technologies such as functional MRI and electro- and magnetoencephalography, combined with imaginative experimental designs. Several theories of consciousness have emerged from this work, the premier among which—as far as I can tell—is something called the global neuronal workspace theory.

The global neuronal workspace theory emerges from studies of visual inputs. Flash an image for a brief period—50 milliseconds or less—and the visual cortex will light up on an fMRI, but nothing else happens. Longer exposures to the image light up not just the visual cortex but multiple areas of the brain. The image is now being shared across the global neuronal workspace, and multiple areas of the brain have access to the input. This, in simple terms (overly simple—I'm not a neuroscientist), is the basis of consciousness. Anyone wishing a deeper understanding of consciousness should read Stanislas Dehaene's wonderful book on the subject. A competing theory, the integrated information theory, is also championed by many neuroscientists.

The parts of the brain involved in consciousness are quite specific, and particularly involve the cerebral cortex in the posterior portions of the brain: the parietal, occipital, and temporal lobes. You can lose your entire cerebellum and not have it affect consciousness.

Early philosophers and biologists separated humans from "lower" animals based upon their ability to form conscious thought. Current neuroscience isn't so sure, extending consciousness to many vertebrate species. Anesthetize a chimpanzee, paint a red spot on part of its anatomy it usually doesn't see, and when the animal wakes up in a room with a mirror, it will touch the spot, even moving the mirror to get a better view. This mirror self-recognition test is not a perfect marker of self-awareness, but the animals that pass it tend to be clustered at the upper end of the animal kingdom, neural capacity-wise: bottlenose dolphins and killer whales, bonobos, orangutans, Asian elephants, and Eurasian magpies; perhaps dogs. Human children typically pass the test at around 18 months of age.

And then there is the octopus. The octopus doesn't pass the mirror self-recognition test, though it does freak out when it sees itself in a mirror. But the octopus seems really smart to me. Peter Godfrey-Smith in his wonderful book, Other Minds, calls the octopus "the closest thing we'll ever see to an alien intelligence." The octopus has the brain capacity of a small dog, with around 500 million neurons (we have a bunch more). Weirdly, three-fifths of the neurons in an octopus are found, not in the brain, but in the arms.

Octopuses do not live very long: an elderly octopus is 3 or 4 years old at most. But in their prime they seem self-aware, and quite aware of their environment. They can identify, remember, and target individual humans in an aquarium environment. They are escape artists. They participate in play behavior. They are tool-users. They communicate with each other, in part, through chromatophores located in their skin. Think of having a conversation conducted in changing patterns and colors of skin pigment. And they are invertebrates, a half-billion years separate from our common evolutionary ancestor. They've evolved consciousness independently of us and our higher vertebrate relatives. They are deeply weird, and ineffably cool.

But back to us. We lose consciousness every day. We call it sleep, that pre-programmed disappearance—aside from those vivid dreams that sometimes intrude on the next morning's memory—of that conscious state that normally defines us. We think nothing of it, yet it is really quite wonderful and mysterious. One minute I am snoring, the next my alarm clock or a pager goes off and within seconds (OK, maybe a minute) I am conscious. We also induce loss of consciousness with regularity. Indeed, it's pretty much the definition of what it means to be a general anesthetic. Nothing wrong with that. The brain is good at re-initializing its consciousness programming.

It is the disappearance or alteration of consciousness during daytime and with disease that bothers us. Drugs and disease both disrupt consciousness. Indeed, while the average doctor would have trouble defining consciousness, we are pretty good at gauging its loss: the Glasgow Coma Scale, with its measures of eye, verbal, and motor responses, is widely used by neurologists.

There are so many ways that the body can go wrong. Cancer doctors regularly deal with altered consciousness. Sometimes it's a brain metastasis causing an epileptic fit, the electric spasm spreading through the brain, shutting off consciousness, the brain then rebooting like some errant computer. Sometimes it's a drug effect: too much of this sedative or that chemotherapeutic. All too often it's the end result of liver failure, the body's metabolites reaching toxic levels, the patient slipping off into that final sleep from which none awake.

I remember (I am conscious of) standing by my father's bed as he died. In his case it was hypoxia that eliminated consciousness, aided by morphine sulfate.

He had metastatic prostate cancer, his bones riddled with castrate-resistant, taxane-resistant, agony-inducing disease. He had been fading for some time, the cancer gradually robbing him of independence, of energy, and of any realistic hope. I had visited 2 weeks before for his wedding anniversary, then gone back home to see to my patients. It had been a sad visit. My brother asked me how long I thought dad had to live. Two weeks, I said. Probably not more. Sometimes we know too much.

The proximate cause of his dying was a pulmonary embolus. He had developed a deep venous thrombosis in his left leg nearly a year before, abetted by a long drive from Madison, Wis., to Bonita Springs, Fla. My dad was ever an impatient driver, keen to get from point A to point B in the shortest time possible. He had not stopped a second longer than necessary, had not gotten out of the car to walk around, and the result was that he spent the last 10 months of his life on warfarin. So, the call from my brother, announcing that dad had a pulmonary embolus and was in the hospital dying, came as no particular surprise.

I was in Indiana at the time, my parents in Wisconsin. I got the message in the middle of my Tuesday outpatient clinic. I called a colleague for coverage (thanks, Kathy, I still appreciate it), left work, got in my car, and drove straight to Madison.

My father was a religious man, far more so than his son, and not afraid of death. But he fought for life. He was not impatient to see what was on the other side. The day he died, he asked his oncologist whether there were any clinical trials he might enter. This was the late '90s, and anti-angiogenic therapy was all the rage, and the University of Wisconsin was scheduled to be one of the first sites for a phase I trial of endostatin, the drug a certain Nobel Laureate famously said was going to cure cancer within 2 years. My dad had told me, more than once, that if only he could hang on long enough he might yet beat the disease that inexorably consumed him.

I know, optimism bias. His performance status was such that no clinical trial in the land would have had him, and both his oncologist and I told him so, months before he lay on his deathbed. But he was a fighter, and though things were hopeless he never gave up.

When I got to the hospital, dad was severely short of breath, hyperventilating and clearly dying, and his oncologist did what kind oncologists do in such a setting: he turned up the oxygen and turned up the morphine. My dad had drifted in and out of an opiate haze for weeks, one minute drowsy, the next alert. But now he was mostly awake, conscious almost to the end.

In the last few minutes, something odd happened. At least it seemed odd to me then, and still does now. He opened his eyes, and they darted around the room, never resting for more than a second or two on any spot. It was disconcerting. Every now and then they would touch on my face, pause briefly, and move on. I had the feeling he was looking for something, something unobtainable, something…not there.

Was that the last flicker of consciousness, or just a dying brain's random electrical impulses? I'll never know. But then it was over, his consciousness utterly gone.

We speak of the soul fleeing the body, and perhaps that happened too, though that is a matter of belief rather than of science. But consciousness, at least to the extent we understand it, involves massive parallel processing of information via widely dispersed neural networks, a thoroughly natural electrochemical process, but also a highly fragile, easily disrupted one. Turn off the lights and the signaling goes away, and with it consciousness.

Are gliomas conscious? A very strange thought: could brain tumors take part in the conscious process, act as part of the global neuronal workspace? It sounds like something out of a horror story. Brain metastases certainly affect conscious function all the time. My Uncle John's smoking-induced lung cancer, upon spreading to his brain, turned that calm, church-going Baptist into an irritable spewer of obscenities. But that was loss of function, disinhibition due to frontal lobe damage. I don't think small cell lung cancer is capable of demonic possession.

The breast cancer doctor in me thinks of a well-differentiated tubular adenocarcinoma of the breast, where the cancer does its best to act like a milk duct. Could a well-differentiated brain tumor have analogous functionality, pretending conscious connectivity? Unsurprisingly, the medical literature contains no hint of such capability, and frankly I'm happy for that: it is a creepy thought, a brain hijacked by something malignant, with malignant consequences.

Charles Whitman, the 1966 Texas Tower shooter who killed 16 people and wounded more than 30 in America's first modern mass shooting—back when such events were rare—knew that something was wrong. He was, he wrote in his suicide note, the "victim of many unusual and irrational thoughts. These thoughts constantly recur, and it requires a tremendous mental effort to concentrate on useful and progressive tasks."

He requested that an autopsy be performed after his death. A pecan-sized mass, later ruled to be a glioblastoma multiforme, was discovered. Neurologists have argued, then and ever since, over whether it was a contributing factor in the massacre. Hard to say. But I doubt the glioma was consciously altering Whitman's consciousness.

Great evil doesn't require a physiologic explanation, though that might give us some comfort, some belief that we can explain and therefore control. We're quite good at hijacking consciousness: addictive drugs, paranoid politics, and fanatical religion being prominent examples. So leave glioma mind control to science fiction, please.

I'll stick with that old black Ford, and the comforting voices of my parents sitting in the front seat. It's a pleasant memory.

Monday, May 21, 2018

Oncology Times is 40 years old this year, a healthy middle age for a print journal devoted to a medical specialty. I started reading it as an Assistant Professor, and enjoyed it then, as I still do, for its ability to provide an overview of my chosen profession. I read the New England Journal of Medicine, and Nature, and the JCO, each in its own way valuable, each serving a somewhat separate purpose. I have always found Oncology Times useful for taking the pulse of my profession, and for filling me in on advances in diseases I don't see in my clinic. And yes, for the gossip—pardon me, the news—about my colleagues around the country.

I asked our editorial staff to send me the oldest edition they had, so that I might get some sense of how things have changed. The journal has moved about over time, and in those moves, apparently older issues were misplaced. The oldest issues they could find were from 1989, already—at Volume 11—a well-established eyewitness to the world of cancer. I'm sure that somewhere, in the dusty stacks of some medical library, Volume 1 exists.

Or maybe not. Perhaps the first 10 volumes have ceased to exist. In our ever-expanding digital universe, we often catch only brief glimpses of the early universe. Many journals ceased to exist before the internet became the ubiquitous source of knowledge, and to most of the current world they might as well never have existed. Those journals that failed to digitize their early years have also, for practical purposes, eliminated those years from the global consciousness. And given how journals come and go, how long before even many of today's works disappear, their servers no longer maintained by uninterested or extinct publishers?

But enough philosophical maunderings. What did 1989 look like to the Oncology Times?

What impresses, and what I had forgotten, is how much the period was tinged with AIDS. The entire decade, in fact, was the decade of AIDS in oncology, with spillover into the '90s. Prior to the control of HIV with HAART regimens, AIDS-related malignancy was a growth industry for cancer doctors.

On the very first page of Issue 1, one sees an article devoted to AIDS-related Kaposi's sarcoma, treated with alpha interferon. Subsequent articles discuss the use of MRI for brain infections in AIDS, the WHO global AIDS program, CD4 as a decoy for HIV, GM-CSF for neutropenia in AIDS patients, AIDS-related lymphomas, AIDS vaccine research, and that's just through June of 1989. Every issue had a page devoted to "AIDS Briefs." Oncology Times still occasionally discusses AIDS-related cancers, but they have become rarities, not the career-builders and (increasingly) life-enders they were in that awful decade.

Though I've always been a breast cancer doctor, there was a brief period where I treated patients with Kaposi's sarcoma, though I was never a true AIDS-focused doctor. What I remember most from that era was the story of Ryan White. My current fellows probably don't know that name. Ryan White was a teenager from Kokomo, Indiana, a little north and east of Indianapolis. He had received a tainted blood product for his hemophilia, and was diagnosed with AIDS late in 1984. He was, shall we say, treated poorly by the good citizens of Kokomo, who banned him from public school and generally abused him with a long litany of indignities. He became a national cause célèbre, the face of AIDS. He lived, and died, bravely.

I would occasionally see him on the IU Medical School campus, where he received much of his treatment and where he died in 1990, just a few years short of HAART and the transformation it created. I have had friends and colleagues with hemophilia, and that period was absolutely terrifying for them. After Ryan White died, the U.S. Congress passed the Ryan White CARE Act to provide health care funding for AIDS patients. Anyone who doubts the reality of medical progress need only remember AIDS in 1989.

And, of course, AIDS-related malignancies were not the only cancer problem on the rise in 1989. Several articles discussed the rising tide of lung cancer deaths, the inadequacy of therapy for the disease, and the need for greater preventive measures. A guest editorial by William Tipping of the ACS, "Taking Tobacco's Measure," is a reminder of what we have accomplished in the smoking cessation arena. In 1989, Congress was voting on legislation limiting smoking on flights longer than 2 hours, a debate that seems almost unimaginable today. The airlines were balking.

On a lighter note, one of the pleasures of reviewing old issues of the Oncology Times is the opportunity to visit old friends. I hate my old pictures. I have never been even minimally photogenic, or maybe I just really look that ugly. I suspect, perusing Volume 11, that some of my colleagues share this affliction. There were a lot of really funny-looking moustaches that my friends (I will not name/shame them) no longer wear. And my generation was so young then!

And the older generation? There were giants in those days, and many made it into Oncology Times that year: I see pictures of Bob Young (then still young and just-appointed president of Fox Chase), Bernie Fisher, Larry Einhorn, B.J. Kennedy, Richard Peto, Janet Rowley, Peter Nowell, Syd Salmon, Josh Fidler, and many others. Many of these have passed from the scene, and many are only remembered by aging oncologists soon to retire. We are a field strangely uninterested in our own history, but reading Volume 11 is a reminder, not only of how far we have come, but of the visionaries who brought us to where we are today.

The drug picture was also quite interesting. You see this primarily through the ads. Oncology Times, then and now, survives on advertising revenues, and ads are a great place to look at new drugs. As a breast cancer doc, I see ads for tamoxifen: the drug had already been around for a decade, but adjuvant hormonal therapy was still relatively new. But no HER2-targeted drugs, though there is an article on HER2, primarily as a potential prognostic factor. Trastuzumab was not yet even a twinkle in Genentech's eye, and most of the drugs I use today in the metastatic setting, even the chemotherapeutics, are missing. From a chemotherapy standpoint, a full-page advertisement for Adriamycin bragged "ADRIAMYCIN IS NOW 97% PURE," which says all you need to know about the relative therapeutic poverty of the era.

Today, oncologists can draw on an immense number of agents. My boss, a cardiologist, tells me that most of the drugs he uses are off-patent, which is a foreign thought to today's cancer doctors and the source of both our patients' hopes and economic woes. But in 1989, the biotech revolution was in its infancy, tamoxifen was pretty much the only targeted therapy for cancer, there were no "ibs," no "mabs," nothing we currently take for granted. Even modern supportive care drugs had not yet entered practice: the anti-emetic ads were for Marinol and Compazine.

Oncology Times had a wonderful business reporter at the time, Robert Teitelman, whose beat was the biotech business. His articles discuss companies I had long forgotten, such as Chiron, Integrated Genetics, Hybritech, NeoRx, and Centocor. Then, as now, the big fish were busy eating up the minnows, and most of these names disappeared into the welcoming arms of some larger company. What differed was the cost of the buy-outs, in the tens of millions as opposed to today's billions. There are hints of what's to come, with a scramble to produce monoclonal antibodies, lymphoma mentioned as a prominent target, and IDEC as a company interested in the area, though rituximab is not yet on the scene. Oh, for a time machine and access to a stockbroker (you couldn't yet order your shares online).

There are other promising notes in that year's issues. The 1989 Nobel Prize for Physiology or Medicine went to UCSF's Michael Bishop and Harold Varmus, for their demonstration that the cancer-causing gene of the Rous sarcoma virus was in fact a captured and mutated normal cellular gene. This discovery led to the oncogene revolution, with far-reaching effects for the diagnosis and treatment of human cancer. It was clear, even then, that the way forward would require the application of that biologic revolution to the clinic, though it was also evident that the path upward was difficult, twisty, and covered with obscuring undergrowth.

Some things haven't changed. The November issue reports a survey of oncologists' attitudes towards their specialty. Oncologists in 1989 stressed about their patients' high mortality rates, and their own long hours and inadequate work-life balance. But they loved their patients, and obtained personal satisfaction from offering them help, even if that help paled in comparison to what we can offer today. Other articles discussed issues we still face: the cost of quality care (if only we knew then how cheap things actually were, and how expensive they were destined to be), nursing burnout and retention, and health disparities related to access, race, and poverty. In other words, plus ça change, plus c'est la même chose. But the progress is real. I wonder what Volume 80 will say. I hope there is no Volume 80, or no need for it. But until then, Oncology Times will still have a proper place. Read on, for there are wonders ahead.

Friday, January 19, 2018

I am a sucker for end-of-the-year lists. You know the type: Best Books of the Year, Best Movies, Best Albums, Best Songs, Best TV Shows, Best Whatever. I read them religiously, especially the Best Books lists. There are even lists of lists, collating the "bests" by the number of lists on which a particular body of work is to be found. George Saunders's fascinating Lincoln in the Bardo is this year's list-of-lists winner, if you haven't yet read the novel. For what it's worth, the Sledge family agreed: three copies of the novel were bought independently and placed under the Christmas tree. Maybe we all read the same lists.

Though I like the book lists, and find them convenient for holiday shopping, I find many of the books themselves unreadable, a problem they share with National Book Award winners. "Literary" books don't have to be dry tomes emanating from academic writers' workshops, though all too many are. And the lists, of course, are created by individuals with interests, biases, and their own agendas.

I remember, back in the day, the intelligentsia's vicious takedown of President Dwight David Eisenhower: the man read mystery novels in his free time. How smart could he be? It was the sort of criticism that appealed to me when I was a college student, but one that lost its charms over time. Crime and Punishment was a pretty good mystery novel, after all, and a great number of crime novelists can write rings around their "literary" counterparts, offer more acidic and pertinent societal diagnoses, and provide far more hours of enjoyment to more human beings, fulfilling the Benthamite prescription for happiness.

Then there are the science lists, the ones with the most important breakthroughs of the year. I've always been partial to the Science magazine list. Its breakthrough of the year involved the collision of two neutron stars on the edge of the galaxy NGC 4993, detected first by the LIGO gravitational wave detector and ultimately studied by (according to Science) 3,674 researchers from 953 institutions.

Nature magazine's list of the "10 people who mattered this year" is also an interesting list. I think they really wanted to say "the 10 most important people in science," but that would have attracted unwelcome attention from the next 287-odd people who thought they belonged on the list. The list includes, to our shame, an American political leader both uninterested in and destructive of science.

I've contributed to such lists in a minor way, having come up with my "Top 10" for oncology in general or breast cancer in particular. Looking back on such lists, whether my own or others', reveals embarrassing flaws in judgement. What I thought was important wasn't, or was but for the wrong reason. I remember 2011, the year everything changed in oncology with the advent of CTLA-4 inhibition for melanoma. My thought, at the time, was something along the lines of "How cool. Those melanoma docs really needed a new drug." Again, right, but for the wrong reason: checkpoint inhibition, not just CTLA-4, and cancer, not melanoma; rather like mistaking the French Revolution for a small riot that preceded it.

But in oncology, the real lists that matter are created by the FDA: the annual lists of New Drug Approvals. Experimental findings, the stuff of New England Journal of Medicine editorials, only really signify when a drug makes it into the clinic. I count 57 of what the FDA calls "Hematology/Oncology (Cancer) Approvals and Safety Notifications" for 2017. By comparison, if one looks at the 2007 list, the number of approvals was 17. Of course, not all of these are individual drugs (drugs could be approved for more than one disease), and not all of them are new drugs (some old dogs have been taught new tricks), and not all of them are even drugs (more on this in a minute). But the sheer increase in number over the past decade suggests that cancer research has increased in pace and scope and (one hopes) in impact.

Looking over the list provides an interesting overview of the state of the field. A dozen of these approvals were for checkpoint inhibitors. The immuno-oncology band marches on, with new treatments for common cancers (non-small cell lung cancer (NSCLC), bladder cancer, Hodgkin lymphoma), and for more obscure malignancies (Merkel Cell, anyone?). Similarly, the "ib" revolution, with kinase inhibitors for specific oncogenic drivers, is well into its second decade with no sign of abatement. The past year saw a combination "ib" approval (dabrafenib and trametinib for metastatic NSCLC with BRAF V600E mutation), similar to a prior 2014 melanoma approval. Shutting off BRAF alone doesn't do much, as we learned in melanoma: whack-a-mole oncology rears its head in short order, and a combined approach of BRAF and MEK inhibition provides better (though still imperfect) results. I suspect we will see more of these over time as we tease out cancer biology in lab and clinic.

The diagnostic space was also interesting. The dabrafenib/trametinib BRAF V600E approval, for instance, comes with the caveat "as detected by an FDA-approved test." The approval tells its own story. The BRAF V600E mutation is rare, somewhere on the order of 1-3 percent of NSCLC. Had you told a Novartis exec 2 decades ago that we would be intentionally targeting 2 percent of lung cancer with two targeted therapies informed by a companion diagnostic, the reaction might have been an interesting one. But such is the way of the world in 2017, and no one finds it at all surprising.

The larger companion diagnostics story involves so-called "tissue-agnostic approvals." The first of these occurred in May 2017, and it is a big deal. The FDA approved the use of the checkpoint inhibitor pembrolizumab "for adult and pediatric patients with unresectable or metastatic, microsatellite instability-high (MSI-H), or mismatch repair deficient (dMMR) solid tumors." Prior to this, the FDA approved agent X for disease Y; for every X there was a Y. But with MSI-H or dMMR tumors (Lynch syndrome cancers and their relatives), the organ no longer matters: it's the biology that counts. "Y" has changed.

How often will we see such tissue-agnostic approvals? Not very often, I suspect: much, or even most, of tumor biology is context-dependent. EGFR-inhibitor drug resistance is different in colorectal cancer than in lung cancer, and the presence of an oncogenic mutation in a cancer doesn't always define that cancer as being a good target for a particular kinase inhibitor. But we will see some.

In 2017, the FDA gave two drugs (entrectinib and larotrectinib) Orphan Drug status for treatment of NTRK fusion-positive solid tumors. If your cancer has an NTRK fusion protein, you have a great chance of responding to one of these agents regardless of organ. Alas, NTRK is uncommon. But the Orphan Drug status is interesting, for both drugs are the result of phase II basket trials. Sometimes you get lucky: most "positive" basket trials are positive in one or two disease buckets, not across the board as seen with NTRK.

Another diagnostics story is also of genuine interest, and perhaps greater ultimate importance. Late last year, the FDA approved the FoundationOne CDx test "to detect genetic mutations in 324 genes and two genomic signatures in any solid tumor type." This is a device, rather than a drug, approval. Device approvals require a different type of evidence than drug approvals, in that they do not carry the same requirement for clinical benefit. But this approval is very broad ("any solid tumor type") and comes with a proposed national coverage determination from the Centers for Medicare & Medicaid Services. So, basically, the government just approved next-generation sequencing for everyone with an advanced solid tumor.

This is another of those milestones that will pass essentially unnoticed. And yet it answers an important question. For several years after the Human Genome Project announced completion of the "rough draft" sequencing of our DNA in 2000, there were real questions regarding its utility. Seventeen years later, genome sequencing has moved from a multi-billion-dollar one-off to an off-the-shelf, relatively cheap (relative to the therapeutics, that is) FDA-approved diagnostic device. Exactly what physicians will use it for is an interesting question, but as the MSI-high and NTRK fusion stories suggest, this will rapidly become not just an option for patients but a requirement for patient care. For while MSI-high and NTRK fusions are individually rare, they are like all medical zebras: no one will want to miss them when they can be tested for with ease. The convergence of next-generation sequencing with the immuno-oncology and precision medicine ("ib") revolutions is now complete.

One hesitates to call FoundationOne (and the onslaught of next-gen diagnostics to follow) a "companion diagnostic." One orders it to find a "companion therapeutic," which is conceptually something quite different than ordering ER and HER2 for breast cancer.

In most years, the additional members of the list would be viewed as impressive additions to the therapeutic armamentarium. The FDA approved two additional CDK 4/6 inhibitors (ribociclib and abemaciclib) for breast cancer and approved pertuzumab for adjuvant therapy in breast cancer. And niraparib, with its stellar results as maintenance therapy for ovarian cancer, became the first PARP inhibitor approved in that setting, a spin-off of the 1990s BRCA diagnostic revolution.

We also saw the first FDA approvals of CAR T-cell gene therapy for acute lymphoblastic leukemia and advanced large B cell lymphoma, an exceptionally cool, "gee-whiz science" technology with curative potential for some cancers. CAR-T is not for the general medical oncologist, at least not yet. It has the boutique feel of the bone marrow transplant unit, with expenses to match: Novartis launched its CAR-T product with a price of $475,000, which of course is not the real price for anyone once one adds hospitalization, in-house processing, and drug administration costs. You know you are in trouble when the press release touting a new therapy uses the cost of allogeneic stem cell transplantation as a reference standard.

If 2017 was an exceptional year for "Oncology Lists," at least in the context of FDA approvals, the very length of the list is chastening. How does a general medical oncologist keep up with 57 FDA approvals? I used to think that being a primary care physician was the toughest job in medicine, but being a general medical oncologist is giving it a run for its money. We have not yet found a good way to transmit this firehose of information to cancer doctors, who are drowning in data.

And what impact will the 2017 list have on cancer patients? Here the picture is even murkier. A plethora of new $10,000 per month drugs cannot be good for health care system economics, or family finances, unless such drugs actually wipe out a patient's cancer. Few of these do. One of the most embarrassing, depressing medical articles of the last year, published in the BMJ, reviewed the impact of 5 years' worth of EMA (the FDA's European equivalent) approvals. The EMA approvals (from 2009-2014) overlap significantly with contemporaneous FDA approvals.

The results were not pretty. The EMA list, which was heavy on precision medicine kinase inhibitors (the "ib" drugs), improved the overall survival of patients with metastatic cancers by an average of only 2.7 months where this endpoint could be measured in a phase III trial (BMJ 2017;359:j4530). Most drugs on the list were approved with a progression-free survival endpoint instead of an overall survival endpoint, rendering the data even drearier. Quality of life was demonstrably improved in only 10 percent of the new drug indications. So, we have quite a way to go.

That is the nature of lists. Like the book lists I obsess over, many of the individual therapeutic listings will disappoint, money poorly spent on a product I won't like. But some things on the list will have lasting value, and the sheer length of the list indicates the rapid pace at which science is entering the clinic. Like many cancer doctors, I am a pathologic optimist. I believe the future will work for my patients. I'll keep looking at the lists.

Tuesday, November 21, 2017

Recently Elon Musk, serial entrepreneur and visionary industrialist, sounded the alarm over the perils of artificial intelligence (AI), calling it our "biggest existential threat." Musk is no technophobe, no misguided Luddite, but rather a perceptive observer, and a bold visionary, so many think he is worth taking seriously. His argument, and that of those who agree with him, goes something like this: the ultimate result of modern information technology will be to create an autonomous artificial intelligence, a creature of servers and the web, whose intelligence equals or outpaces our own. That creature, capable of transmitting its base code to any other server, soon has in its grasp all the resources of the internet.

At that point, the autonomous intelligence essentially controls the world: bank accounts, Snapchat, Facebook, nuclear missiles. Borg-like, resistance becomes futile: object and your bank account is drained, weird pictures of that party where you drank too much suddenly show up in your boss's account, fake news suggesting you are under the control of aliens is transmitted to everyone you have ever friended. And if you persist in rejecting the AI's new world order, North Korea launches a tactical nuke at you while you are vacationing in Guam. You might be forgiven for being unable to distinguish between the AI apocalypse and 2017.

I've made fun of what is actually a serious intellectual argument. Anyone interested in that argument should read Nick Bostrom's Superintelligence, an entire tome devoted to the problem. Superintelligence has led to intellectual fist fights in the AI community, with responses ranging anywhere from "idiotic" to "frighteningly possible." Regardless, it is a good (though not light) read for those wanting to get some understanding of the issues involved.

One of the issues, if I read this literature correctly, is deciding on what constitutes autonomous, higher level, scary, Terminator-Skynet-echelon Artificial Intelligence, or even defining lower level AI. AI proponents note that whenever some AI project works, and is incorporated into standard processes, we cease to think of it as AI. It's just code that is a little bit smarter than last year's code and a little quicker than this year's humans. And individual toolkits, like the one that Watson used to win at Jeopardy, will never be mistaken for higher level intelligence.

We don't even really understand our own intelligence all that well. If I, Google Translate-like, but using my baccalaureate French ("perfect Midwestern French," per my college professor; not a compliment) translate "peau d'orange" as "skin edema," am I just using an algorithm programmed into my neocortex, a bundle of interconnected neurons firing off a pre-programmed answer? And is all that constitutes my intelligence nothing but a collection of similarly algorithmic toolkits, residing in synaptic connections rather than semiconductors, just programmed wetware?

And if so, how many toolkits, how many mental hacks are required for a computer to equal or beat human intelligence? And if you combined enough of them, would they be distinguishable from human intelligence?

A single human brain is something quite wonderful. The most recent analysis I have seen, from a group working at the Salk Institute, suggests that a single human brain has a memory capacity of a petabyte, roughly equal to that of the current World Wide Web. This both reassures and concerns me. On the one hand, it will be a while before the collective intelligence of some AI creature is greater than that of several billion humans, though even somewhat stupider AIs could still create a lot of mayhem, like some clever 12-year-old. On the other hand, I have been using that old "limited storage capacity" excuse for my memory lapses (missed anniversaries, appointments, book chapters, etc.) for some time now, and this is, unfortunately, no longer a good excuse.

Part of why we might want to take the idea of superintelligence seriously is the rapid recent progression of AI. A good example involves Google Translate, a classic Big Data project. For a very long time, Google's ability to translate from one language to another, or anyone else's for that matter, was severely limited. The results mirrored that old Monty Python sketch in which a Hungarian/English phrasebook offers hilarious mistranslations that endanger the speaker (it's on YouTube). But Google's translation abilities have improved tremendously, the result of a massive brute force approach that now allows relatively good translation between almost any pair of written languages. Impressive as this feat is, it is not higher-level AI: Google Translate still doesn't write my blogs, in English or Hungarian, though it could well translate between the two.

AI remains a minor force, verging on nonexistence, in the medical world. The largest effort, IBM's Watson, has been an expensive dud. Its major effect so far has been to assist in the resignation of the president of MD Anderson Cancer Center. The Anderson Watson project, designed to offer clinical decision support for oncologists, was plagued by mission creep, financial bloat, missed deadlines, physician loathing, and ultimate technical failure. An initial $2.9 million outlay turned into a $65 million boondoggle, along with claims that multiple University of Texas system rules had been broken. Early failures, of course, do not prevent future successes, but as President Trump has noted, no one (at IBM or MD Anderson) apparently realized how complicated health care actually was.

I can't say I'm surprised. Much of what makes health care complicated is, at least currently, not soluble with higher order computational strategies. In an average clinic, moving from one room to the next, I will speak to patients who want too much of modern medicine (diagnostically and therapeutically), followed by patients who reject what little I have to offer, both for reasons that seem unreasonable to me, and I suspect to a computer-generated algorithmic intelligence. I know what the right thing to do is, if by the right thing one means improving disease-free survival or response rate or overall survival. It is the unquantifiable and the personal that torpedo my best efforts, sinking what should be optimal outcomes in a sea of doubt and confusion. An irritating nanny program, manifesting itself in annoying pop-up messages while I am already wasting my time navigating EPIC, is unlikely to improve my mood or my patient's lot.

Other, more modest, efforts continue apace. The IT start-up Flatiron and ASCO's CancerLinQ both plan to liberate the EHR via AI measures. As a conflict of interest statement, I was ASCO's President the year the ASCO Board committed to the creation of CancerLinQ, and I served as Co-Chair of its initial oversight committee. As such, I feel a residual interest in its ultimate success. Whether ASCO or some private market-based approach prevails is not something I will bet on, either for or against. But I do foresee something approaching AI-based clinical decision support in our future. Still, this is not higher level AI as I understand it.

I don't know that what I want, or what most doctors want, is some omniscient superintelligence telling me how to practice medicine. My interests are much more modest: could the AI listen in on my conversation with the patient and create a decent clinic note in EPIC? Could it put in orders for the PET/CT I discussed with my nurse? Could it print out a copy of that JCO or NEJM article I was misremembering, highlighting the crucial piece of data I tried conjuring from the dusty cupboards of my memory? Could it automatically tell me what clinical trials my patient is eligible for without me having to go through the protocol? Could it go through my EHR and certify me for QOPI or recertify me for the American Board of Internal Medicine or give me 2-minute snippets of education between patients that would result in CME credit? Could it automatically pre-cert patients for insurance authorizations? Before a superintelligent AI becomes Skynet-scary and ends human civilization, let's hope it at least passes through a brief useful phase in the clinic.

These tasks all seem reasonable projects for companies capable of translating Shakespeare into Swahili. Some of them represent intellectual aides, making me clinically smarter, but many are just time-savers. And time, of course, is every doctor's most important commodity in the clinic.

Even limited AI could have fairly profound effects on the way medicine is practiced. There are already machine learning programs that outperform radiologists in their ability to read mammograms, are the equivalent of dermatologists (which means, superior to me) in recognizing melanomas, and are gaining rapidly on a pathologist's ability to read an H&E slide. Algorithms are great at pattern recognition. Note to radiologists, dermatologists, and pathologists: beware. You are ultimately as disposable as any of those other industries digitalization has transformed.

But back to superintelligence. Should we panic over the ultimate prospect of our servant becoming our master? I view this as an extension of a very old conflict. This conflict is over who controls our lives. Throughout most of recorded human history, humans have been under the control of some tribal chieftain, feudal lord, king, dictator, or petty bureaucrat. It is only in the past 2 centuries or so that a significant portion of the human race has had anything approaching personal or political freedom.

I worry that the period that began with the signing of the Declaration of Independence, a period of immense human creativity, is coming to an end. Human freedom is under assault around the globe, and so far the assaults have been launched not by supercomputers but by the rich and powerful, by ex-Soviet oligarchs, by religious fanatics, and by impartial market forces that care nothing for human liberty. That these forces use computer algorithms to keep the weak and powerless weak and powerless is unsurprising.

If, say, a political party uses computers to design gerrymandered maps to limit the voting strength of those it views as supporting its opponents, then it would be unsurprising if an AI creature learned this lesson in its inexorable march toward dominance. If an insurance company used Big Data/AI approaches to maximize its profit by limiting its liability to the poor and sick, why be surprised if some future AI master has no more care for human suffering? If a foreign power uses computer technology to undermine democracy by flooding the internet with disinformation, why expect tomorrow's superintelligence to do anything other than mimic us? Maybe we need to deserve our future freedom by fighting for it in the present, before it is lost to forces that are all too human.

At some level, I view a superintelligent AI as something almost…human. By this I mean a creature with a degree of insight that goes beyond mere pattern recognition. Something capable of strategizing, or scheming, perhaps even capable of hating, if silicon-based life forms can generate emotions similar to carbon-based ones.

There is a large science fiction literature devoted to AI, long since turned into Hollywood movies. One thinks of HAL in 2001: A Space Odyssey, or of Arnold Schwarzenegger's Terminator. These AIs are either maleficent, like Schwarzenegger's cyborg in Terminator, or controlling, like the AI in the Matrix series, or (rarely, because it is so unfrightening) benign and transcendent, as in Iain Banks' wonderful Culture novels. Ants don't understand human intelligence, and we might be like ants to an AI superintelligence. The common thread in all of these scenarios is that the inflection point could, in theory, occur in a picosecond.

Those who worry about such things (Musk, Bostrom, and others) say that the time to start working on the problem is now, because 10 or 20 years from now may be too late. As I type these words, pictures of post-hurricane south Texas, Florida, and Puerto Rico crowd the airwaves: cities without fresh water that are underwater. We are not good at preparing for natural disasters; I doubt we would be any better at prepping for an AI apocalypse. A kinder, gentler alternative suggests that we might proactively program AIs for beneficence, as with Isaac Asimov's three laws of robotics. Though if AIs are allowed to watch certain networks, they may decide humanity unworthy of saving.

Monday, September 25, 2017


There's an old joke, a poor but telling one. The late novelist David Foster Wallace used it in a college graduation speech, but it's much older than that. Two young fish are swimming along and run into an old fish. "Good morning," says the old fish, "isn't the water fine today?" And then he swims on. One of the young fish turns to the other and asks "What's water?"

We swim in history yet are so caught up in the day-to-day that we rarely notice its swirls and eddies, unless strong currents buffet us: a war, some momentous election. Looking back, we suddenly recognize how far downstream we've come. I look at my life, a totally ordinary one lacking in historical impact, and think, well, what times I have lived through. I was born in the middle of the Korean War, was in middle school and high school during the Civil Rights era, went to college in the midst of the Vietnam War, saw (on TV) the first landing on the moon, the resignation of Richard Nixon, the fall of the Berlin Wall, the two wars in Iraq, 9/11, and a myriad of lesser events.

The last Civil War veteran died when I was four. When I was a grade-schooler, a World War I veteran with a loud hacking cough did housework for us. I remember someone saying his lungs were a mess ever since he had been caught in a poison gas attack in 1918. When I was in college, I met a woman who as a child had been placed in a lifeboat and rowed away from the sinking Titanic.

As a medical student, I was on the ward with an elderly patient who had played jazz with Louis Armstrong in the French Quarter. As an intern, I helped take care of Bill Drake, an 82-year-old former pitcher in the Negro baseball leagues. His baseball moniker was Plunk, for his habit of throwing beanballs at batters who crowded the plate. Another patient that year, a Spanish-American War veteran, died of bradycardia while I stood at his bedside administering an antiarrhythmic. This has always left me feeling guilty, given what we subsequently learned about antiarrhythmic usage. A patient of mine had, as a teenager, sung at the White House at Christmas time for President Franklin Delano Roosevelt.

And that's just some politics and culture. But the science, oh the science, how amazing: my life has overlapped with Watson and Crick's description of DNA and the subsequent molecular biology revolution, the deciphering of the immune system, the creation of monoclonal antibodies, the Human Genome Project, The Cancer Genome Atlas, and the discovery of CRISPR/Cas9. I have argued previously, and I still believe, that 500 years from now it is the science that will be remembered, if there are still humans around to remember such things. And I haven't even touched on physics, chemistry, the computer revolution, and several other fields whose histories happen to coincide with my own, including most of what makes up modern oncology.

None of these things, at the time, seemed particularly extraordinary, and I certainly cannot claim any credit for even one of them. At most I thought "that's interesting" before moving on to another task. It is the cumulative weight of these things that makes me stop and wonder. We swim through history and rarely think about the fact that we are living it. I have not lived a particularly extraordinary life, but I have lived in extraordinary times.

I write this a few days after the Republican attempt at repealing Obamacare went down in (what appears to be) its final defeat. It has been a dramatic week, and an important one for the health care system and many of my patients. It has been one of those rare times where I actually felt, in the moment, that I was "living in history," if you will.

It got me thinking about the contingent nature of history, and its relation to cancer. The Affordable Care Act (ACA) as a political event is thoroughly entwined with one type of cancer. When the bill was winding its way through Congress in 2009, the Democrats had a large majority in the House and a filibuster-proof 60-member majority in the Senate. Then Senator Ted Kennedy, a passionate supporter of national health insurance, developed a glioma, setting off a complicated sequence of events that contributed to the poorly written piece of legislation that eventually became the ACA, and no doubt contributed to its lasting unpopularity.

Seven years later, and largely as a result of the unpopularity of Obamacare, the Republicans hold a substantial, if gerrymandered, majority in the House of Representatives, a small majority in the Senate, and the Presidency. Having spent 7 years promising the immediate repeal and replacement of the ACA, and having passed a "repeal and replace" bill in the House, they fell one vote short in the Senate.

As in 2009, cancer proved to be an important part of the story. Senator John McCain, 80 years old, former war hero, and recently diagnosed glioblastoma multiforme patient, cast the deciding vote against repeal in a dramatic, made-for-TV late-night vote. Would he have done so without the diagnosis of glioma? Did his cancer diagnosis, fortuitously coming right before the vote, alter history?

Psychologizing from a distance is dangerous even for a psychologist, and even more so for an oncologist, but I'll try anyway. Did the recognition of his own impending mortality allow McCain the freedom to break from party ranks, the freedom that comes when fate renders you no longer answerable to this year's dogma or the next election cycle? Did the diagnosis render him more sympathetic to poor people with cancer? Or, less charitably, was the proud naval veteran offering some payback to the President who had foolishly and inexplicably impugned his heroism and attacked his character? Or perhaps all three, for we need not have a single motive. A fourth possibility, raised by one of McCain's "friends" in the Senate, was that the tumor affected the senator's judgement, perhaps because he was tired when he voted.

We do know that Joe Biden, whose son Beau died of glioma, called McCain on the day of the vote. Biden himself, in the aftermath of his son's death, chose not to compete for the 2016 Democratic nomination, with unknowable consequences for American history. Why is glioma, of all cancers, such a huge part of this story?

I suspect the cancer had something to do with McCain's vote. Certainly, the other member of the Senate with advanced cancer, Senator Mazie Hirono, brought a special passion to the health care debate derived from her life-threatening disease. McCain has been coy, saying only that he thought his vote was the right thing to do.

We see things through our own special lenses. A Marxist would point to history as a determinist juggernaut, events being largely independent of personalities. I've never believed that for a moment, though some things, in particular those involving technology, seem to have a life of their own, almost independent of any individual's desires or efforts. Such technologic imperialism aside, some historical decisions come down to one vote, and one voter wrestling with the consequences of that action.

Senator McCain's recent diagnosis, leaving aside its political impact, has interesting additional aspects. I for one am always happy that I am not the doctor facing the press after a prominent politico receives a horrible diagnosis. These doctor/spokesmen are in an awful position. Like all doctors, their primary responsibility is to their patient, and their patient's desires regarding transparency must be taken into account. I would add that, while physicians are well aware of the gloomy statistics for a particular population of cancer patients, many of us (myself included) suffer from an optimism bias for individual patients, and that bias is probably amplified by a reporter's microphone. Against these motivations, the public's right to know can take second place.

This can lead to some interesting sins of omission, and occasionally sins of commission. For instance, when Ted Kennedy's glioma was diagnosed, his doctors were very careful, and generally quite honest, about the diagnosis, but utterly quiet regarding prognosis. Their statement included, "He remains in good spirits and full of energy." As if that mattered.

More recently, McCain's doctors mentioned that "Scanning done since the procedure...shows that the tissue of concern was completely resected by imaging criteria." Think about what that sentence says, how it says it, and what it implies. First, that horrible, mealy-mouthed euphemism, "tissue of concern," when what they mean is "cancer" or "glioma." Second, the phrase "completely resected by imaging criteria," a relatively meaningless phrase implying a good outcome in a disease famous for local recurrence.

A physician saying "we got it all" in misleading medicalese changes nothing, and the press was not conned by this minor obfuscation. McCain's prognosis, the prognosis of patients with glioma, was made painstakingly clear in the many reports I read. McCain's political rival in the Arizona Republican primary, a physician, wished McCain well and told him to resign from office so that she could be appointed in his place. Presumably this is because, having been rejected for office by members of her own party, she therefore deserves to be a U.S. senator, and anyway McCain is toast. Classy.

The worst example of this sort of thing occurred with Massachusetts Senator Paul Tsongas, diagnosed with a non-Hodgkin lymphoma. Tsongas, viewed as a potential Democratic presidential candidate, underwent high-dose chemotherapy and bone marrow transplantation for his disease, the marrow being prepared with a selection method designed to eliminate cancer cells. After this procedure, in announcing his 1992 candidacy, Tsongas had his Harvard physician (and active political supporter) offer up the comforting claim that he was likely cured and therefore a viable candidate for completing a term of office. In fact, Tsongas had already suffered an axillary lymph node recurrence post-transplant. Not only was his candidacy nonviable, within a few years he would die of recurrent disease. Dana-Farber's chief physician, in reviewing the case, would later say that the institution "made a bevy of mistakes."

History provides other examples of the intersection of cancer and politics. Perhaps the weightiest occurred in 1888. Kaiser Frederick III of Germany was, as German autocrats went, kind, intelligent, liberal, and pacific by nature. He also chain-smoked cigarettes and developed cancer of the larynx. Originally misdiagnosed (by Virchow, of all people), and subsequently under-treated—being the German Kaiser doesn't protect you from bad care if the pathology is wrong—he eventually died, miserably, after a reign of only 99 days.

His son, Wilhelm, was everything Frederick was not. He had a withered left arm—Erb's palsy due to a breech birth—that left him shy and insecure. And like some shy and insecure people, he over-compensated, getting into a naval race with Great Britain, antagonizing the French and Russians through his militarization of the German Reich, acting bellicose and paranoid, and eventually approving the attacks that started World War I.

His dislike of the Western democracies (particularly Britain) was partly ideological, and partly personal: in one angry outburst shortly after his father died, he said "an English doctor killed my father, and an English doctor crippled my arm—which is the fault of my [English] mother." Talk about mommy issues. How different the 20th century might have been had his father not been a chain-smoker, or had the physicians been a little bit better at their jobs. Historians cannot imagine the pacific anglophile Frederick getting into the same mess his son Willy created.

World War I was the 20th century's keystone event, leading to (in no particular order) the rise of the U.S. as a world economic and military power and the eventual collapse of the British empire, the collapse of the Romanov Dynasty and its replacement by the Bolsheviks, the kick-starting of multiple military technologies (such as the tank, the submarine, and the airplane), the collapse of the Austro-Hungarian Empire, the embitterment of a generation of Germans (including the young Austrian emigre Corporal Hitler, who drew his own conclusions regarding the war's lessons), and the collapse of the Ottoman Empire and the subsequent creation of the many Middle Eastern nation-states whose fractious histories continue to bedevil the 21st century. Before World War I, there was no country called Iraq, no country called Syria, no Palestine mandate leading to Israel, no Saudi Arabia.

After the Kaiser died, his doctors all stuck their scalpels into each other, and quite publicly. The English surgeon wrote an inflammatory book blaming the Germans, and was censured by his colleagues as a reward. There was, reading the accounts, plenty of blame to go around, with both pathologists and surgeons making a mess of things. This one cancer death may have been the most consequential in world history, and I certainly would not want to have been one of Frederick's physicians, taking credit for the whole bloody, tragic 20th century.

Not all cancer deaths are so consequential, of course. Most just affect the patient, the patient's family and acquaintances, and the patient's health care team. But that is more than enough. Our private histories need not rival Frederick's story to count as tragedies.