Musings of a Cancer Doctor
Wide-ranging views and perspective from George W. Sledge, Jr., MD
Monday, March 03, 2014

I thought that there could not have been any better year for genomics than 2012, but the last year or so has matched it. The falling price of genomics, the development of new analytic techniques, and the inventiveness with which genomic technologies are applied have colluded in the creation of wonderful new insights. There is hardly an area of biology or medicine that hasn't benefited in some way.

Let's look at the study of human origins. We learned, just a few years ago, that most non-sub-Saharan moderns carry a touch of Neanderthal in their genome. This discovery is largely the work of a group led by Svante Pääbo at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany. Pääbo has also found evidence of other admixtures (the mysterious Denisovans, who make up part of the modern genomic complement of Pacific islanders).

 

It would have appalled our Victorian ancestors, as well as the many 20th-century racial ideologues wedded to a belief in the purity of their particular ethnic group, to learn of this unwelcome ancestry. But I found it a charming thought.

 

If 1.5% or so of our genome comes from Neanderthal ancestors, the interesting question is “which 1.5%?” Is it a random set of genes, or, as Darwinian theory might predict, a set that increased our biologic fitness? A first-pass look at our Neanderthal inheritance suggests the latter: definitely non-random, definitely selected for (or against). Two recent publications (in Science and Nature) suggest that there was positive selection for genes involved in skin and hair -- particularly the type II cluster of keratin genes on 12q13 -- suggesting that our “Out of Africa” Homo sapiens ancestors benefitted from Neanderthal cold-climate genes. It turns out that when we say that someone is “thick-skinned” we are implying Neanderthal ancestry.

 

At the same time, there was significant negative selection for X-chromosome genes (a five-fold reduction) and testes-specific genes, as well as a reduction of Neanderthal DNA in a region of the genome containing a gene thought to play an important role in human speech and language. The authors suggest that human-Neanderthal hybrids had reduced fertility, resulting in the elimination of the X-chromosome genes causing the infertility.

 

Well, OK, but so what? Why should anyone really care, other than in the “gee whiz” way in which scientific curiosities of no particular current importance appeal to intelligent onlookers? Because, once again, William Faulkner was right: the past isn’t dead. It isn’t even past. The researchers identified nine variant genes associated with specific disease-linked traits, including (to quote the Nature paper) “alleles of Neanderthal origin that affect lupus, biliary cirrhosis, Crohn’s disease, optic-disk size, smoking behaviour, IL-18 levels and type 2 diabetes.”

 

Many of my breast cancer patients dread the idea that they have passed on a mutation to their children that will doom them at some future point. The sense of guilt can be overwhelming: the fear of a dark cloud hanging over those they hold most dear, aligned with the certainty that they were the cause of their children’s future grief. I tell them that it is certainly not their fault, any more than it was their parents’ or their parents’ parents’. And the good things they have given their children -- life, love, home, and family -- certainly far outweigh the bad. But this view of the things they pass on rarely seems to assuage the guilt, nor the sadness that attends it.

 

BRCA mutations may date back a millennium or two. But to extend the disease-causing mutation story back several tens of thousands of years (the Neanderthals became extinct some 30,000-plus years ago, and the interbreeding is thought to have occurred 40,000-80,000 years ago) is just astonishing.

 

Try explaining to a lung cancer patient that he couldn’t quit smoking because of a gene he inherited from an extinct species: doomed by your inner cave man, if you will. Or, perhaps, some weird cosmic revenge for wiping out the Neanderthals?

 

Human history is rife with examples of one population violating the space of another, but we rarely think of these interactions in genomic terms. A new analytic technique called Globetrotter changes that. Developed by researchers at Oxford University and published recently in Science, Globetrotter uses modern human genomes to identify ancient population migration patterns. The authors call these “admixture events,” a polite term for what was often a very messy, very ugly history. You can take a look at them at the Globetrotter website: http://admixturemap.paintmychromosomes.com

 

The authors reasoned that if someone from population A hooked up with someone from population B, then their offspring would share the genetic history of both parents. And then, over time, genetic recombination would occur, allowing one to clock how long ago the genomic mash-up happened. A fairly simple idea, but an enormous amount of thought and work went into making Globetrotter a reality.
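
To make the clock concrete: recombination introduces roughly one crossover per Morgan per generation, so the chromosome segments inherited from an admixture event are chopped shorter every generation, and the mean length of an ancestry tract shrinks roughly as one over the number of generations elapsed. Here is a minimal sketch of that arithmetic in Python; the tract length, the 29-year generation time, and the single-pulse assumption are mine for illustration, and Globetrotter itself fits something more sophisticated (the decay of ancestry correlations along the chromosome) rather than this bare formula.

# Recombination-clock sketch: for a single admixture pulse T generations
# ago, the mean ancestry-tract length (in Morgans) decays roughly as 1/T.

def generations_since_admixture(mean_tract_morgans):
    """Single-pulse estimate: T ~ 1 / (mean tract length in Morgans)."""
    return 1.0 / mean_tract_morgans

def years_since_admixture(mean_tract_cm, years_per_generation=29.0):
    # centimorgans -> Morgans, then generations -> years
    return generations_since_admixture(mean_tract_cm / 100.0) * years_per_generation

# If tracts from the incoming population average 3.5 centimorgans,
# the admixture pulse dates to ~29 generations, or roughly 830 years, ago.
print(round(years_since_admixture(3.5)))   # -> 829

That, in essence, is how a genome sampled today can put an approximate date on a medieval admixture event.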

 

A decade ago, studies of the Y chromosome suggested that a relatively large percentage of selected Asian populations descend from Genghis Khan. Globetrotter confirmed this analysis, looking not at the Y chromosome but at approximately half a million SNPs throughout the genome in some 1,500 individuals from around the world. The Mongol Horde was not some group of peace-loving picnickers camping out on the Silk Road: Genghis and his kids were very, very bad people, and we have the genetic evidence to prove it.

 

Globetrotter also defined several other admixture events. To quote the paper: “We identified events whose dates and participants suggest they describe genetic impacts of the Mongol empire, Arab slave trade, Bantu expansion, first millennium CE migrations in Eastern Europe, and European colonialism, as well as unrecorded events, revealing admixture to be an almost universal force shaping human populations.” Maybe those unrecorded events were peaceful ones, but the ones we know about were not. Universal force, indeed.

 

These are the first revelations from Globetrotter, and no doubt others will emerge. It seems pretty clear that much of prehistory (and a fair amount of history) may need to be re-written in the next few years. Could anyone have predicted this synchronicity of history and biology a couple of decades ago? Will any legitimate historian, going forward, be able to ignore science? And, as the Neanderthal data suggests, will we MDs be able to ignore the lessons of anthropology?

 

There were a few other recent genome-based history lessons worth mentioning. In 2012 one of the coolest stories involved the discovery of Richard III's bones under a parking lot. I wrote a blog post about it at the time, and the Shakespeare lover in me still finds this ineffably awesome. Now comes the suggestion that the bones of King Alfred the Great, or perhaps his son, have also been found. The evidence here is less compelling (unlike the Richard III story, there are no known modern descendants to compare genomes with) but still intriguing.

 

I note in passing that the Sledge family’s ancestral bones remain to be discovered. I’m not holding my breath. But the news from a 12,600-year-old burial site in Wilsall, Montana, gives me some hope. Looking at the genome derived from the only human burial associated with the Clovis culture (makers of the beautiful Clovis point arrowheads), the investigators (a University of Copenhagen group publishing in a recent issue of Nature) demonstrated that some 80% of all present-day Native Americans are direct descendants of the individual’s closest relatives. Though not, alas, of the boy buried in Wilsall: he died shortly after his first birthday.

 

There’s always something wonderful about these genome-based historical studies. You feel connected, knowing that men and women walking by you on the street today carry genes from the family of a child buried over 12,000 years ago in Montana, or from ancestors who met each other when Homo sapiens met Homo neanderthalensis somewhere in the Levant. And that someday we too will be links in the great chain, passing on (if we are lucky in the genetic game of chance) our own genes to some unimaginable future.

 

In my next post I’ll write about what we’ve learned about cancer genomics in the last year. It may not be quite as much fun as Neanderthals, Globetrotter, and Clovis culture genomics, but it is consequential and interesting. Our own cells pass on things as well: individual microevolution as opposed to global macroevolution.


Tuesday, February 18, 2014

My favorite movie of 2013 will not win an Oscar. Entitled “A Boy and His Atom,” it is directed and produced by IBM scientists, and is a totally charming short celebrating nanotechnology. You can actually watch individual atoms being moved around. Not big on plot, but delightful nevertheless, it can be seen on YouTube at http://www.youtube.com/watch?v=oSCX78-8-q0

 

Nanotechnology is one of those unfulfilled promises, made years ago, that is just so compelling that we keep giving it second (and third, and fourth, and fifth) chances. It has to come true at some point. Just what that point is, and just how it will manifest itself, is uncertain, but virtually all agree that something will come of it.

 

What do we even mean by nanotechnology? A few years ago albumin-bound paclitaxel was introduced to the world as the first nanotechnology drug. If slapping some egg white on Taxol is nanotechnology, then I’m the King of Siam: really quite ridiculous, but “Abraxane is nano” has had an astonishingly long run. “Doxil is nano” as well: fat droplets = nanotechnology. I’m not convinced that putting a drug in a different wrapper really qualifies, but what do I know?

 

Part of the problem is definitional. The Google definition (“the branch of technology that deals with dimensions and tolerances of less than 100 nanometers, esp. the manipulation of individual atoms and molecules.”) is fine, but basically only tells you that really small stuff is nanotech. What does “manipulation” mean? My fingers aren’t 100 nanometers.

 

I think of nanotech in the way that Nobel laureate Richard Feynman did. His foundational talk, titled “There’s Plenty of Room at the Bottom,” was presented in 1959, and still delights. In it he speaks of “a very wild idea, it would be interesting in surgery if you could swallow the surgeon. You put the mechanical surgeon inside the blood vessel and it goes into the heart and ‘looks’ around. (Of course the information has to be fed out.) It finds out which valve is the faulty one and takes a little knife and slices it out. Other small machines might be permanently incorporated in the body to assist some inadequately-functioning organ.”

 

Feynman went even further: “But I am not afraid to consider the final question as to whether, ultimately – in the great future – we can arrange the atoms the way we want; the very atoms, all the way down!” The great future arrived well within the lifetimes of some in his audience, witness to the escalating speed of technology. Witness “A Boy and His Atom.”

 

That’s what I think of when I think of nanotechnology: really little machines. “Swallowing the surgeon” eventually became the concept of nanobots injected to fix molecular defects. The futurist Ray Kurzweil speaks of an era (he predicts 2030) when we are loaded with billions of these tiny machines, which communicate with the Cloud, and are endowed with what he calls “a measure of intelligence.” They will put medical oncologists (and many other specialists) out of business through preventive maintenance. In theory, I suppose, I may live long enough to benefit from these nanobots while still drawing on my pre-nanobot medical pension. Sweet!

 

Kurzweil takes something like 150 nutritional supplements per day in hopes of delaying his death long enough for the nanos to render him immortal, which suggests he either knows a great deal more or a great deal less than I do about nutritional supplements. He recently went to work at Google, which is beginning to get into the immortality biz. You can see Kurzweil interviewed by the Wall Street Journal here: http://blogs.wsj.com/cio/2014/02/04/googles-ray-kurzweil-envisions-new-era-of-search/

 

The best way to know that something is real in medicine, rather than some pipe dream, is when there’s a medical journal devoted to it. And, of course there is: it’s called Cancer Nanotechnology. I went and looked at its table of contents. The first article I espied was “Pharmacokinetics and biodistribution of negatively charged pectin nanoparticles encapsulating paclitaxel.” You know pectin: the agent you add to jam or jelly to help them thicken. Pectin. Really? Nanotechnology? Maybe I am the King of Siam.

After medical journals, your next best bet in seeing whether something is real is to measure government response. The NCI’s Nanotechnology Characterization Laboratory performs nanomaterial safety and toxicity testing in vitro and in vivo. To date it has evaluated over 250 potential nanomedicines, several of which are making their way to the clinic.

 

One likes to think of nanotech as essentially benign. But the long-term safety of nanomedicines is one of the great imponderables: there just haven’t been any studies. When I was an intern, a cardiology fellow shared some folk wisdom with my team: “If you haven’t had any complications, you haven’t done enough procedures.” True then, true now. My bet is that we’ll discover some new side effects.

 

And it’s not just nanotech side effects at the individual level that technologists worry about. Consider the “Gray Goo” scenario much beloved by nanotech Cassandras and science fiction authors. In the Gray Goo scenario, originally described by the nanotechnologist Eric Drexler, self-replicating von Neumann machines consume all of the matter on Earth while dividing uncontrollably. All that’s left of us and everything else is a large ball of Gray Goo. Yuck.

 

Realistic? Maybe yes, maybe no. If it is, get me out of here: some idiot dictator will release it by intent, or some lab tech by mistake. The Feds certainly take it seriously: the Department of Defense had a $426.1 million nanotechnology budget in 2012. The nanotech budget was less in 2013, but that may be due to sequestration.

 

The National Nanotechnology Initiative website says the “DOD considers nanotechnology to have potential to contribute to the warfighting capabilities of the nation… The DOD also invests in nanotechnology for advanced energetic materials, photocatalytic coatings, active microelectronic devices, and a wide array of other promising technologies.” I don’t worry about photocatalytic coatings (pectin, anyone? Egg white on your drone?), but “active microelectronic devices” and “other promising technologies” are probably what we need to look out for.

 

Back to medicine. One of the cooler new technologies, just firing up but already attracting attention, is ADARs (adenosine deaminases acting on RNA). ADARs allow one to edit RNA transcripts, correcting mutations at the transcript level. A recent paper (Montiel-Gonzalez et al. PNAS 2013; 110: 18285-90) showed the potential of this technology to correct a mutation causing cystic fibrosis. It’s early days yet (it works in cell lines, not yet at the organism level), but a star-struck molecular biologist writing in the New England Journal of Medicine referred to the approach as “an RNA-repairing nanomachine within the limits of current technology.”
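
The underlying chemistry is simple enough to show in a toy example. ADAR deaminates adenosine (A) to inosine, and the translation machinery reads inosine as guanosine (G); edit the right adenosine and a premature stop codon becomes a sense codon again. The sketch below is my illustration of that principle only, not the guide-RNA-directed method of the PNAS paper, and the codon chosen is arbitrary.

# A-to-I editing toy: inosine is decoded as G, so editing the adenosine
# of a premature UAG stop codon yields UGG, which encodes tryptophan.

CODON = {"UAG": "STOP", "UGA": "STOP", "UGG": "Trp"}  # just the codons we need

def edit_a_to_i(codon, position):
    """Deaminate the A at `position`; downstream machinery reads it as G."""
    assert codon[position] == "A", "ADARs edit adenosines only"
    return codon[:position] + "G" + codon[position + 1:]

mutant = "UAG"                      # nonsense mutation: premature stop
repaired = edit_a_to_i(mutant, 1)   # -> "UGG"
print(CODON[mutant], "->", CODON[repaired])   # STOP -> Trp

The hard part, of course, is not the chemistry but the targeting: directing the deaminase to one adenosine in one transcript among millions.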

 

So maybe we’re not all that far off from the day when nanotech will correct all our inherited defects. I certainly have my share, and I’ve accumulated a few others along the way. Ultimately, of course, what constitutes an inherited defect will represent a novel problem of definition. Is your nose too small? There’s an ADAR for that!  (Note to my readers: nose = a family-friendly euphemism). I see a thriving industry targeting professional athletes and Hollywood actors, and soon all the rest of us. We’ll all look like Brad Pitt and Angelina Jolie, all run like Jesse Owens, and all live forever.

 

Yeah, right. And egg white and pectin are nanotech.


Wednesday, January 01, 2014

There is something about the end of a year that renders one more thoughtful, makes one look back rather than forward. We see it in all the media. For every article about the coming year, there are ten about what we just experienced:  the year’s best movies, the year’s best books, the year’s best (I kid thee not) reality TV series, the year’s ten most important news stories or sports events or whatever.

 

You get the idea. It is easier to review than to predict, and less embarrassing. Hindsight is 20/20. We are also reminded which important people died during the last year, or at least some person’s idea of who was important. Whenever I read these lists, I say to myself “I didn’t know he was still alive to die” or “who?” or “darn, no more novels from her.”

 

This year’s consensus big departure was Nelson Mandela. Not a bad choice, I would say, though my own thoughts were captured by a number of other passings, some reasonably well known, some not. Let me share them with you:

 

1.  Voyager leaves the solar system. This actually happened on August 25, 2012, though NASA didn’t announce it until September of 2013. I have grown old with Voyager, which was launched the year I graduated from medical school. It is the first man-made object (other than radio waves) to leave the solar system. That it is still transmitting us data, decades after launch and some 18.2 billion kilometers away, is wonderful. In 40,000 years, it will come within shouting distance (1.6 light-years) of a star called AC+79 3888 in the constellation Camelopardalis.

 

2.  Carbon dioxide levels pass 400 ppm. Unlike Voyager, which in penetrating the heliosphere surrounding the solar system passed a real boundary, this boundary is artificial. There is nothing magical about it, any more than a highway sign that says “37 miles to Chicago.” But global warming is real, and our inability to address it as a people (both inside and outside the United States) will be seen by future generations as an act of breathtaking irresponsibility. That a significant proportion of the population, and one of our two great political parties, reject the scientific reality of global warming is truly depressing.

 

3.  Palo Alto outlaws dying. Well, not really, but something like it. In moving to Stanford I have had the opportunity to see Silicon Valley close up, and what an interesting sight it is. The local rag (the Daily Post) had a wonderful story late in October on the closure of the Roller & Hapgood & Tinney funeral home -- Palo Alto’s last -- after 114 years of business. “The property value in Palo Alto is so great it can no longer justify use as a funeral home,” said the funeral home’s president. The property was sold to Yahoo CEO Marissa Mayer, who is rumored to want the land for residential development. There, in just a few sentences, is all you need to know about the Silicon Valley economy.

 

And, perhaps, our larger economy as well. Capitalism’s acts of creative destruction, fueled by scientific discovery, have been the great engine of productivity and growth since the 19th-century takeoff, and have certainly made possible the lifting of hundreds of millions from poverty around the globe. But does it have any limits, any real boundaries? And will its current iteration, the Internet economy, enrich any more than the few, at the expense of vital (or post-vital) services for the many? Will it really be possible to die in Palo Alto in the future?

 

4.  The death of Gloria Bush.  The answer is yes. I thought the funeral home story was funny, until a subsequent article in the Daily Post sobered me right up. A 72-year-old homeless woman was found dead in Palo Alto’s Heritage Park. Bush, who was mentally ill, slept on the stairs of local churches and ate at local food pantry programs. A Palo Alto couple saw her wrapping her feet in plastic the night before her body was found. Her daughter had contacted local authorities, concerned that Gloria would stay outside during a recent cold snap, but they could not bring her indoors against her will.

 

The world is full of small tragedies. Our care of the mentally ill is shameful, as any who have given the problem a minute’s thought must know. You’ve never met Gloria Bush, and if I ever saw her I’m sure I ignored her, as one ignores embarrassing crazy people one passes on the street. But please take a moment to remember Gloria Bush.

 

The story in the Daily Post did not say where she would be buried.

 

5.  Science passings: Len Herzenberg. There is some chance you have heard of Stanford’s Len Herzenberg, who died at the age of 81 on October 27. Herzenberg, along with his wife Lee, created the first flow cytometry machine, which morphed into the modern FACS (fluorescence-activated cell sorter) used in research and clinical laboratories around the globe.

 

The story is that one day Herzenberg was looking through a microscope, counting fluorescent cells until his eyes hurt. He wondered if there might be a better way, thinking, “There’s got to be some kind of a machine that could do this.” With the help of Stanford engineers, he modified a machine developed at Los Alamos National Laboratory, creating what they called “The Whizzer,” the parent of all modern FACS machines. Their first publication was in 1969, the first true FACS following in 1971.

 

We don’t think much about where our scientific machinery comes from, nor do the makers of such machines win the Nobel Prizes: the closest the Herzenbergs came was the Kyoto Prize in 2006. Stockholm doesn’t always get it right. One could easily argue that the impact of the FACS on modern biology and medicine was as great as just about any other biologic breakthrough one can point to in recent decades. Try to imagine AIDS or leukemia research (or treatment) without the FACS machine.

 

I’m reminded of one of my favorite Francis Bacon quotes, from the Novum Organum: “Neither the naked hand nor the understanding left to itself can effect much. It is by instruments and helps that the work is done, which are as much wanted for the understanding as for the hand.”

 

Herzenberg was a famously happy man, loved by all who knew him: the Stanford obit refers to his “constitutive smiling,” a very biologic description for a great biologist.

 

6.  Scientific Passings: Janet Rowley. This is a name known to many in the oncology profession, though certainly not to all. Janet Rowley was a giant in the field of cancer genetics. I heard her speak on several occasions, and once met with her for an hour at the University of Chicago. She was gracious, polite, and razor-sharp, acting highly interested in a not very interesting visitor.

 

Rowley attended medical school at the University of Chicago. The first year she applied, she could not get in: the women’s quota of 3 (in a class of 65) was filled, so she waited until the next year. Almost unimaginable today, but that was how it was done in 1944. For many years after graduation, she worked 3 days a week, while raising her family.

 

One day in 1972, while spreading chromosomal photomicrographs of Giemsa-stained cells on the family dinner table, she realized that CML was a disease of chromosomal rearrangement, with a translocation between chromosomes 9 and 22. It was a true paradigm shift (a widely overused phrase, but appropriate for Rowley’s epiphany), the first of many chromosomal translocations implicated in carcinogenesis, and a crucial step on the road to drugs such as imatinib and crizotinib.

 

She was showered with awards, all well deserved, but like Herzenberg her real achievement was the many people alive today because of her work. I doubt we will see her name on many year-end lists, and the nature of scientific achievement is that it is anonymous to the many and eventually forgotten even by the few who should remember.

 

So let me end with another quote, the beautiful ending to George Eliot’s Middlemarch:

 

“The effect of her being on those around her was incalculably diffusive: for the growing good of the world is partly dependent on unhistoric acts; and that things are not so ill with you and me as they might have been, is half owing to the number who lived faithfully a hidden life, and rest in unvisited tombs.”

 

Janet Rowley, MD, died of ovarian cancer at the age of 88. Until the last few months of her life, she continued to bicycle to work.


Sunday, September 22, 2013

There’s been a great deal of talk lately about the price of cancer drugs: Op-Eds and articles in the New York Times, a slew of interviews of prominent oncologists in every imaginable media outlet, and impassioned editorials in leading medical journals.

 

The sense many of us (myself included) have is that the current system of drug pricing is ultimately unsustainable. How many $10,000-a-month drugs can a health care system sustain before collapsing? We’ll discover the answer in the near future. The prices are already well above what resides in the average uninsured American’s bank account.

 

Outside health care, the introduction of a me-too product (a new smartphone or computer tablet) results in a significant reduction in price, a race to the bottom. In PharmaLand, though, a new imatinib clone for CML results in an increase in price that far outstrips added benefit, accompanied by price increases for older drugs. Drug costs seem effectively outside the modern capitalist economy.

 

The economics of drug development and pricing are complex, and I do not claim to understand them, though I have tried. Bottom line: prices for on-patent drugs always increase.

 

One of my favorite Tolstoy stories is the wonderful “How Much Land Does a Man Need?” Its protagonist, the landowner Pahom, is never satisfied with the amount of land he owns. He moves from place to place in his pursuit of more farmland, finally leaving the settled areas for the wild places at the edge of the Tsarist Empire. There he meets a tribal chieftain who tells him that, for a thousand rubles, he can have as much land as he can circumnavigate on foot in a day.

 

After walking all day he sees the sun setting, and races back to where he began his walk, only to collapse and die at the finish line. The story ends: “His servant picked up the spade and dug a grave long enough for Pahom to lie in, and buried him in it. Six feet from his head to his heels was all he needed.”

 

Indeed. Is the pharmaceutical industry racing, Pahom-like, to an early grave, in its pursuit of wealth and new compounds? And how much “land” does it really need to innovate, to create the next generation of drugs, while satisfying its stockholders? Again, I don’t claim to know the answers. I only know that this can’t go on forever.

 

In Tolstoy’s Russia a man’s worth was based on the amount of land he owned. Today wealth has a different measure, something incomprehensible to our forebears: insubstantial collections of photons hiding out in distant server farms, mirages we have collectively agreed to call “money.”

 

And that wealth, in turn, is capable of buying life, at least in small increments. Debates over drug costs are interesting because they revolve around value judgments. Consider this thought experiment: imagine someone hands you a glass of water and says, “This will cost you a dollar, has no side effects, will palliate your disease-related symptoms, and will prolong your life by 12 days” (the EGFR/pancreatic cancer scenario). Pretty much everyone would drink deeply. We value life when the price is low.

 

But the other knock on our new drugs is that while we pay an immense amount, we often get little in return other than side effects. EGFR inhibition for pancreatic cancer, with its 12 added days of life, is free neither of cost nor of toxicity. The old Woody Allen joke applies: two old ladies are talking about a restaurant they’ve just eaten at. The first woman says, “The food, the food was so awful.” “Yes,” says the second woman, “and the portions, the portions were so small.”

 

And the further one gets from the magical “glass of water” scenario, the fewer are willing to drink the Kool-Aid. Where one draws the line is always a value judgment. Always. And it leads to the bitterest of arguments: remember “Death Panels” in the run-up to the Affordable Care Act? We are, as a society, deeply uncomfortable with telling taxpayers that their lives lack infinite value.

 

QALYs

Cost, life added, toxicity. The health outcomes folks have wrestled with these three for years, with their Cost per Quality-Adjusted Life Year (or QALY, as it is known in the biz). QALYs are an attempt to provide some quantification to an inherently unquantifiable ethical quandary. $50,000 per QALY? $75,000? $100,000? $250,000? Are our QALYs inflation-adjusted?
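
For a sense of scale, here is the back-of-envelope arithmetic in Python. The $10,000-a-month price tag and the 12 added days come from earlier in this post; the six months of treatment and the 0.6 utility weight for time on a toxic therapy are my assumptions, purely for illustration.

# Cost per QALY = incremental cost / (life-years gained x utility weight)

def cost_per_qaly(monthly_cost, months_on_drug, days_gained, utility_weight):
    incremental_cost = monthly_cost * months_on_drug
    qalys_gained = (days_gained / 365.0) * utility_weight
    return incremental_cost / qalys_gained

# Six months of a $10,000-a-month drug buying 12 days of life at utility 0.6:
print(round(cost_per_qaly(10000, 6, 12, 0.6)))   # -> 3041667

Roughly three million dollars per QALY, which makes the $50,000-to-$250,000 menu above look almost quaint.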

 

The Cost per QALY equation basically asks us “How much is a life worth?” Patients, of course, are also interested in the related question, one little studied by health economists: “How much is my life worth living?”

 

My patients have given me many different answers to that question. Some are serene about the length (and end) of life, others terrified by onrushing oblivion, and some concerned about the fallout their passing will create.  Sometimes their feelings overlap with religious beliefs, sometimes with age, sometimes with perceived responsibility. I’ve been a doctor long enough not to generalize about how patients answer the question, or even about whether there is a right answer. Three brief vignettes will suffice:

 

1.   An elderly breast cancer patient had indolent metastatic breast cancer, which routinely shrank with endocrine therapy whenever I could convince her to take it. But she was sad -- sad because most everyone she had cared about was dead, and her children were neglectful and far away, so she took the pills only irregularly. One day, as she left the clinic, I said “See you in three months.” With a withering look directed at me, she said, “God, I hope not.”

 

2.   A woman in her 40s found that her husband was sexually abusing her son. She couldn’t convince the authorities to prosecute; there was not enough evidence to convict. She didn’t want to expose her child to the vagaries of the legal system, so she cut a deal with the monster, divorced, and got sole custody. She came down with a nasty, ugly, estrogen-receptor-negative breast cancer. It was the late 1980s, and I didn’t have much to offer her, but she would do anything to add even a day of life, so I beat her up with toxic and ineffective regimens. She looked exhausted the last time I saw her, but was horrified when I told her I had nothing else to offer. She wasn’t afraid of dying. She was terrified that at her passing her child would fall, once again, into the evil hands of her ex. I don’t have a clue what happened to that child: I still think about him from time to time.

 

3.   An old West Texas small-town city councilman developed non-small cell lung cancer and came to the VA for care. I was a brand-new oncology fellow, and I offered him enrollment in an investigational chemotherapy trial. His tumor responded, quite nicely, for six months or so, but eventually progressed. Back then, like many novice physicians, I felt heroic when my drugs worked, and a failure when they didn’t. I apologized profusely for the progressive disease we saw on his chest x-ray.

      He waved my explanations off: he was one of those stoics the high plains produce in quantity, and uninterested in my apologies. In fact, he ended up consoling me. “Let me tell you something, Dr. Sledge, about those six months you gave me. In my town there’s a poor section [he didn’t say it, but it was understood that he was talking about the Mexican-American barrio] that didn’t have running water. No one gave a damn. I always felt bad about that, but I went along with the rest for years. I knew I didn’t have much time left, so I rammed a water bill through the city council. Those folks will have running water after I’m gone.” He didn’t smile at all when he said this, but had a look of grim satisfaction on his face, the sort one sees when someone has righted a deep and longstanding wrong.

 

We must base public policy on health economics, not on anecdotes, of course. The drugs are obscenely expensive, they don’t do enough, and they often harm quality of life at the end of life. They are used long past any reasonable chance of benefit. Resources are limited, and the decision to use expensive and often ineffectual drugs draws money away from other valid social and personal goals (spend all your money on co-pays for the latest RTKi, and forget sending Johnny to college next year: you’re broke). Drug pricing also distorts oncology practice, a practice bathed in perverse incentives.

 

Yet “how much is a life worth?”, the Cost/QALY question, still leaves me a bit queasy. Queasy, because I don’t have a good answer to the question “how much is a life worth?” and because my patients have so many, and equally valid, answers to the question “how much is my life worth living?”

 

Will a person’s worth ultimately be based on the amount of life he can afford? A fascinating new book (The Book of Immortality, by Adam Gollner) describes how a group of billionaires are funding anti-aging research. We oncologists squabble over the meaning and cost of a few months of life, often earned with appalling toxicity. But these guys don’t care about cost (the server farms are full of their ghostly wealth-photons), nor do they care about a few crappy months: they are shooting for immortality, or something like it. The Big Time, literally.

 

The book mentions Larry Ellison of Oracle, one of our richest software moguls, whose biographer says that he sees death as “just another kind of corporate opponent he can outfox.” Not for Ellison the Bill Gates approach to philanthropy, buying mosquito netting and creating vaccines to prevent the deaths of real people in Sub-Saharan Africa. His Ellison Medical Foundation gives out more than $40 million per year for research dedicated to ending mortality, by “understanding lifespan development processes and age-related diseases and disabilities.” OK?

 

If a drug tripled or quadrupled the normal human life span (forget immortality: there are just too many asteroids, lightning strikes, paranoids with assault rifles, and drunk drivers out there), what would you pay for it? A lot, I suspect. Who would be allowed to use it? My bet would be on the billionaires, and the devil take the hindmost.

 

Science fiction is chock full of stories exploring this theme, but popular culture has a word for wealthy immortals who care nothing for those lesser beings they control even as they feed off them. They are called vampires.

 


Monday, August 19, 2013

There was a time, in my innocent youth, when the follow-up note for the average cancer patient (usually scrawled in quasi-legible doctorese) looked something like this:

 

S:  No new complaints.

O:  Px-unchanged. Responding on CXR. Labs OK.

A:  Doing well

P:  Continue Rx. RTC 3 w.

 

I thought of this recently when, preparatory to starting my new clinic at Stanford, I took Epic training. I had moved from a healthcare system that used Cerner, and of course when you know one electronic health record system, you know one electronic health record system. After spending some time getting access (yet another set of passwords and another user name), and getting “tokens” for my iPhone, my iPad, and my Mac, I spent an hour or so with a lovely lady assigned by the hospital to explain the new system to me, who then handed me a helpful booklet that would remind me when I inevitably forgot the lesson. Which I did, almost immediately.

 

Epic Systems is headquartered in Verona, Wisconsin, a sleepy town on the outskirts of Madison. My dad used to take me there for haircuts when I was a teenager, and so it is fitting that Verona continues to provide me haircuts via Epic. The cuts are now to my time, and my emotional comfort. I’m hoping both will improve as I become more at ease with the system, though I have yet to meet anyone (including hospital administrators) who thinks it will speed me up.

 

Epic is based on 1960s MUMPS software developed at Massachusetts General Hospital. Think about that: a software system that pre-dates MS-DOS. It is (and this is not particular to Epic) loaded with excess clicks, small fonts, long scrollable lists of diagnoses in no particular order, and buried data sets. The company was founded by a former employee of the University of Wisconsin’s Psychiatry department, the sort of fact that is almost too delicious: are we all part of some devious, extreme stress-inducing psychological experiment designed to increase the business of the psychiatrists? Well, that would just be paranoid, wouldn’t it? Really, I’m not crazy. Really.

 

Compare 1983 with 2013: the four lines of meaning mentioned above are now buried somewhere in pages of cloned busywork stored on a server and accessed via a clinic workstation. Has the current electronic health record made us better healthcare providers, or more efficient at getting through the day? That these are still arguable propositions (and they are regularly argued, at least in the precincts I hang out in) says a great deal about our tendency to adopt new technologies without putting them to the test.

 

Modern healthcare record keeping dates back to the 1920s, when the American College of Surgeons created the Association of Record Librarians of North America to “elevate the standards of clinical records in hospitals and other medical institutions.” ARLNA still exists, transmogrified into the American Health Information Management Association (motto: Quality Healthcare through Quality Information), with more than 64,000 members. But the records it shepherds are no longer primarily under the control of physicians, nor written on paper. Like the rest of modern society, they have gone digital.

 

What has driven the digitization of the current medical record is fairly straightforward. First, the need to document, for the benefit of the payers, that we are doing what we say we are doing. This is a function of the American healthcare system's radically dysfunctional payment scheme, with its a la carte menu approach, propagated by government but equally embraced by private insurers.

 

Second was the vision of the electronic health record as sovereign cure for all our woes. This is related to, and at least partially dependent on, the documentation mandate, but was clearly something more. Back in 2005 the RAND Corporation, one of our premier think tanks, predicted that the rapid adoption of EHR technology would save the U.S. healthcare system more than $81 billion annually through improved efficiency. The number was not plucked out of the air: it was a reasonable extrapolation based on information technology’s effects on other aspects of the American economy.

 

Politicians -- both Republicans and Democrats -- were quick to latch on to the latest promise to reduce healthcare costs. George Bush and Barack Obama considered it essential for American physicians to adopt EHR.

 

Earlier this year Arthur Kellermann and Spencer Jones of the RAND Corporation provided an update in Health Affairs. To summarize the healthcare savings benefits achieved through EHR adoption: NOT! Why not? “The disappointing performance of health IT to date can be largely attributed to several factors: sluggish adoption of health IT systems, coupled with the choice of systems that are neither interoperable nor easy to use; and the failure of health care providers and institutions to reengineer care processes to reap the full benefits of health IT.”

 

The cost savings were illusory in part because, in contrast to other industries, EHRs are used by hospitals to help game the system, increased documentation leading to higher levels of billing. So while EHR systems are expensive to purchase and maintain, the return on investment for a hospital corporation can be impressive.

 

I can certainly speak to the “not easy to use” part. There is nothing easy or intuitive about Epic or Cerner or their kindred: they look like a Boeing 747 control panel. Actually, let me take that back: I cannot imagine that they are designed like a Boeing 747 control panel, because if they were, there would be far more fatal airplane crashes than actually occur.

 

If EHR adoption has not improved physician efficiency or reduced healthcare costs, who has benefitted? EHR vendors lead the list, of course. The big EHR firms (Epic, Cerner, Allscripts) have all seen explosive growth in recent years. This growth is driven by federal government incentives (the recession stimulus package) and federal mandates (the 2015 deadline for meaningful use of EHRs). Both carrot and stick were the result of heavy lobbying efforts by EHR companies, which like the pharmaceutical companies are now an integral part of the Washington “you scratch my back, and I’ll scratch yours” reciprocal favor system. Allscripts’ CEO visited the White House on seven occasions after President Obama took office in 2009, and personally made more than $225,000 in political contributions.

 

Epic saw its sales double in just four years to $1.2 billion in 2011. Cerner has revenues of $2.2 billion, and Allscripts comes in somewhere in the same range as Epic. EHR vendors never suffered during the economic downturn. These guys have done quite well for themselves, if not always for us.

 

The basic premise behind EHRs, and the reason no one wants to give up on them quite yet, is still sound: having all the data instantly available at your fingertips should make patient care safer and more effective. If this has not yet happened, it is in part related to the lack of interoperability between EHR systems. Crossing town from one practice or hospital to another is frequently to enter a deep electronic chasm. The best you can hope for is that you live in a place where monopoly prevails. Epic (and I really don’t mean to pick on Epic) has the EHR franchise for Stanford, UCSF, and Kaiser Permanente, among other local concerns, so the Bay Area is relatively integrated from an EHR standpoint.

 

But this is not the case everywhere. And without interoperability (which EHR vendors pretty much all passively or actively oppose) the promise of EHRs will continue to be largely theoretical. Gandhi’s famous response when asked what he thought about Western civilization comes to mind: “It would be a good idea.”

 

ASCO has devoted significant resources (significant for ASCO if trivial for the EHR vendors) to creating CancerLinQ, its rapid learning healthcare system. My sense, in my brief experience with Epic and my older relationship with Cerner, is that the EHR vendors have not thought through the needs of specialties, perhaps because their principal relationship is to hospital corporations.

 

Electronic health records are an important component of any rapid learning system, perhaps the central component. CancerLinQ will ultimately need to interact with a whole slew of EHR vendors. Will it be embraced, or strangled? Or just ignored?  We’ll see. In the meantime, I will continue to plod through my dictations.

About the Author

George W. Sledge, Jr., MD
GEORGE W. SLEDGE, JR., MD, is Chief of Oncology at Stanford University. His OT writing was recognized with an APEX Award for Publication Excellence in the category of “Regular Departments & Columns.”