Musings of a Cancer Doctor
Wide-ranging views and perspective from George W. Sledge, Jr., MD

Monday, April 10, 2017

Usually it happens in some committee meeting, or an all-day conference. You've been there, I'm sure. A speaker gets up, and starts talking about a thoroughly uninteresting subject. Or maybe an interesting subject that he makes thoroughly uninteresting. He drones on, and on, and on, and at some point you cannot take it anymore. You are bored, painfully so, and you want to get up and run screaming out of the room, but of course you were bred for courtesy in the face of monotony. You drift into something approaching catatonia, eventually relieved by the shuffling of papers and the movement of chairs as those around you break for lunch. Life is short, and ennui has claimed more of your precious remaining hours.

What I find fascinating is that the concept of boredom is so new. The word did not exist in English until Charles Dickens published Bleak House (a novel where lawsuits drone on and on) in 1852. He must have loved the neologism, for he used the word six times. It caught on. Most languages now have a word for boredom: in French, ennui; in German, Langeweile; in Dutch, verveling. I am sure boredom existed before 1852, just like the world had color before the invention of the color TV, so why no word for the concept? Did it require the repetitive motions of the Industrial Age, that 19th-century phenomenon? Was the world simply more interesting before? The past is a foreign country; they do things differently there, as L.P. Hartley wrote.

Boredom has a (small, pretty boring) medical literature. There is, for instance, a Boredom Proneness Scale (BPS), which I refuse to bore you with, and which allows one to quantitate one's degree of boredom. The BPS is, I read with no particular enthusiasm, in the process of being replaced by the Multidimensional State Boredom Scale. You might think that confining boredom to just one dimension would be a good thing, but apparently not.

Defining boredom is problematic, though psychologists have tried. The best scientific definition comes down to something like "an unfulfilled desire for satisfying action." Think of a caged lion, pacing back and forth for hours. Psychologists have defined four types of boredom: indifferent, calibrating, searching, and reactant. A recent advance in the field was to add a fifth type, apathetic boredom. Sometimes the researchers' ability to descend into self-parody is impressive: Can we make boredom studies more boring? Answer: yes, let's add apathetic boredom to the list.

Boredom researchers hold an annual conference at the University of Warsaw. I'll keep going to ASCO, thank you very much. There have been remarkably few boredom proneness studies inflicted on cancer patients. For what it's worth, treating terminally ill, bored (terminally bored?) patients with the antidepressant citalopram makes them measurably less bored.

Boredom has been studied in several neurologic and psychologic disorders. Patients with anterograde amnesia, of the sort experienced by Drew Barrymore's character in the movie 50 First Dates, rarely get bored. Everything is always new. Alzheimer's disease, surprisingly, appears associated with staggering amounts of boredom, at least in the early stage of the disease. Boredom is associated with (in no particular order) drug addiction, compulsive gambling, eating disorders, depression, alcoholism, ADHD, and poor grades. Indeed, a drug addict's ability to beat his addiction appears related to his degree of boredom. Traumatic brain injury patients are easily bored. Auto accidents are more common among the bored, which of course leads to brain injury.

Speaking of injury, boredom is particularly a problem in the workplace. While some souls prefer mindless repetitive activity, most of us are not wired that way. Faced with the prospect of placing widget A in hole B for 8 straight hours, an assembly line worker may drift off, with dangerous medical consequences: boredom is a major underlying cause of workplace injury. I don't see this happening often in the medical clinic, though a colleague started snoring in the midst of a particularly boring patient interaction. I'm sure this happens all the time to analysts.

Boredom has been studied, after a fashion, in animals. I don't need an animal psychologist to understand that dogs get bored, as any time spent around a sad-looking basset hound will attest. I know it is unwise to project human emotions onto our canine friends, but dogs are so finely attuned to human emotions that it would be surprising if they did not recognize, and mirror, our boredom. I've seen dogs start yawning with their masters, and even vice versa. Cats go ka-ka if not continuously stimulated by their human servants. There's always a danger in anthropomorphizing animal behavior, of course, but pets and farm animals sure act bored on a regular basis.

From an evolutionary standpoint, boredom seems to require some relatively high level of intellectual functioning. We cannot interrogate Fido with a Boredom Proneness Scale, so there's a paucity of formal animal boredom studies. Researchers at the University of Guelph in Ontario published one interesting experiment in which they housed captive mink in either "enriched" cages or bare, sparse, boring ones. They then approached both groups with new stimuli, including ones that mink normally find scary. The mink from the unenriched cages were quicker to approach new objects, even the scary ones. Objectively speaking, they get bored and seek stimulation.

Boredom clearly has a developmental aspect. Anyone with kids recognizes the one- or two-year-old's delight with the world, the absolute joy and sense of wonder they experience with every new thing. In contrast, teenagers are the world champions of boredom. Particularly around their parents and teachers, whom they seem to think were put on this earth to bore them to death. Older folks (well, my age) report lower levels of boredom. I'm so over that.

"Bored to death," by the way, is literally true. The British Whitehall II study asked civil servants questions about the degree of boredom they experienced. Two decades later, the most bored civil servants experienced the highest death rates. Bored people are meaner, too: if asked to recommend a prison sentence for a criminal, bored pseudo-jurors recommend harsher punishments: "Throw the book at him, your honor, I'm suffering from ennui." And people who are bored at the Department of Motor Vehicles are less likely to register as organ donors. On the other hand, since bored people are more likely to perform risky behaviors, they're more likely to become organ donors. Should transplant surgeons hold dance competitions at their local DMV? Or not? Public policy questions are always so tough.

Psychologists have started to scan the brains of the bored. They first have someone watch a video known to induce boredom, and then perform a functional MRI scan. The bored have specific changes in the default mode network area of the brain. To quote the 2016 paper in Experimental Brain Research, "Boredom represents a failure to engage executive control networks when faced with a monotonous task—in other words, when the task demands some level of engagement, but is so mundane that attempts to do so fail."

Nice to know that my boredom has physiologic correlates. Usually, when I am in the midst of a boring all-day meeting, my mind wanders, often so far afield that new ideas start popping into my head. I scribble down notes: to do lists, ideas for blogs, research projects, you name it. Boredom, I suppose, can be the starting place for creativity. There is, of course, a literature attached to this phenomenon. If you randomize a group either to a boring task (copying phone numbers out of a telephone book) followed by a creative task (finding uses for plastic cups), or to the creative task without any preceding period of boredom, the induced boredom group is more creative.

Anyways, my attention is wandering, so I think I'll stop writing and go see that latest Hollywood big budget blow-'em-up extravaganza. You know, the one where the aliens train a boredom ray on the earth and we're saved by a two-year-old who happens to have gotten ahold of a nuclear missile.


Tuesday, February 28, 2017

Looking through my files, I find that I gave 13 lectures in 2016, or just over one a month. Using that number as a rough annual average, this would imply somewhere over 400 lectures during the course of my career, though that estimate could easily be off by 100 in either direction. Each of those lectures involved several hours of prep time, and a few involved tens of hours of preparation, so I've devoted a fair amount of my life to their creation and presentation.

My colleagues who teach "real" medical school courses would scoff at these efforts, of course, but some of these talks were defining moments of my career, and therefore important to me, even if others barely noticed. Being an academic physician means many things, but one commonality involves flying somewhere, standing behind a lectern, and proceeding through a PowerPoint presentation.

In doing so, we represent part of a long chain that stretches back centuries. The very first medical school, the Schola Medica Salernitana, dates to sometime in the 10th century. While very little is known about its origins, at its peak between the 10th and 12th centuries the Schola Medica Salernitana was the center of learning for medicine in the Western world. Legend has it that the school was founded by a Greek, a Jew, a Latin, and a Muslim. Medical schools have always been cosmopolitan places, and Salerno, located on the Tyrrhenian Sea in the south of Italy, stood at a crossroads between civilizations.

Constantine Africanus, born in North Africa and a convert to Christianity, came to Salerno in 1077 and translated ancient Arab medical texts—themselves translations from even older Greek texts—into Latin, forming the basis of the curriculum in Salerno. In the image below, he is lecturing to 11th-century medical students.


Medical students often don't bother to attend lectures these days, due to the ubiquity of note-takers and the filming of talks. This is nothing new: some of our first manuscripts from Salerno represent notes taken by students and then sold on to others. Skipping class also has a long, if undocumented, lineage.

Some of the most boring hours of my life were spent sitting in lecture halls. There are few things worse on this earth than a turgid, remorselessly dull lecture. Hell, if it exists, is probably full of lecture halls. How much did you learn from years of lectures, compared to the number of hours spent listening to them? The return on investment of time is undoubtedly low, which is why those who study knowledge acquisition rarely have kind things to say about medical pedagogy.

But things could be worse. In fact, they were. Throughout much of the Middle Ages, anatomy classes involved three individuals: a lector, who sat on a pulpit and read from an anatomy text; an ostensor, who pointed to the body parts being dissected; and a sector, who did the actual dissection. The problem, aside from the boring bit about being read to from a Latin anatomy text, was that the text was wrong. Galen, the ancient Greek father of anatomy, had done his work on animals, not humans.

Andreas Vesalius, the first modern anatomist, who taught at Padua, was important not only because he dissected humans, but because he got rid of the tripartite lector-ostensor-sector thing, lecturing as he dissected. In the image below, he is surrounded by his students.


Well, they won't let me bring tumors to my talks. Instead, we use the modern paraphernalia of lectures. The speaker stands behind the lectern (from the Latin lectus, past participle of legere, "to read"), which in turn is situated on a podium (from the Greek pod- for foot, since you are standing on it), a raised surface that allows the speaker to look down at the audience. The lectern serves several purposes: the speaker can place notes there, he can grip the lectern for support, and (if broad enough at the base) it can hide the knocking knees of first-time speakers.

Lecterns have their own dangers. At one ASCO event, the lectern was placed near the back of the stage, for some reason. One speaker fell off the dais, saving the audience (statistically speaking) 15 minutes of boredom.

For those speakers who prefer not to hide behind the lectern, the conference organizers may offer the use of a lavalier mike, a small microphone clipped to the tie or coat, attached to a transmitter.

Why is it called a lavalier microphone? The term comes from the jewelry business, where a lavalier is a pendant suspended from a necklace. And why do jewelers call it a lavalier? In memory of the Duchesse Louise de La Vallière, mistress of King Louis XIV of France. She bore him five children and ended her life in a convent, but before that she made the eponymous jewelry popular. I like to think, as a regular user of the lavalier mike, that Louise did not take a vow of silence when she entered the convent. Or maybe she did, if she lectured the King using PowerPoint presentations.

I am dating myself, but I remember when slides were slides. Slide shows have their own history. The first slide shows date to the 1600s, when hand-painted pictures on glass were projected onto a wall using a so-called "magic lantern." These, in turn, became the "lantern slides" used by 19th-century speakers. By the early 20th century, photographic images had replaced painted images on glass, and then in 1936 the introduction of 35mm Kodachrome film led to the standard 2 x 2-inch slides that I used as an assistant professor.

These were, in contrast to current practice, "real" slide shows. I have difficulty making oncology fellows who grew up on PowerPoint understand what these involved. Today, speakers work on their talks right up until they walk on stage, but in my youth one prepared talks weeks in advance. You had to. One took carefully typed pieces of paper or pictures to the Medical Illustration department, where they were photographed and slides prepared, at an often-substantial price. Towards the end of the 1990s I gave a plenary lecture at ASCO that was projected on five different screens, requiring five sets of slides. The data came in late, and I remember the slides costing $600.

These slide shows could be adventures. If the slide projector overheated, the slides might warp or melt. Or, if the Kodak carousel top came loose, your entire talk might fall out. I saw this happen once. The speaker picked up the slides, placed them back in the carousel in no particular order, and gave his talk in no particular order. Hilarity ensued. Ah, those were the days.

My plenary session talk was virtually the last one I gave using "real" slides. Shortly thereafter PowerPoint presentations became standard, almost simultaneously with the internet. Now I drag images off the web (someone somewhere on this earth has already made the point I want to make), never visit Medical Illustration (if they still exist), and go on my happy way.

PowerPoint has its detractors. PowerPoint templates lead to the dreaded "Death by PowerPoint" talks with their unimaginative, sterile, repetitive slides that pollute Grand Rounds around the world. Microsoft has sold hundreds of millions of copies of PowerPoint software, which in turn have led to trillions of slides.

Was this what Constantine Africanus had in mind when he first got up to speak at the Schola Medica Salernitana? I don't know, but I suspect, given the rate of technologic change, that later 21st century lecturers will tell their junior colleagues that "when I was an Assistant Professor we used 'real' slide shows, something called PowerPoint."

How will they lecture? Some three-dimensional extravaganza? Direct porting of knowledge to embedded cerebral implants? Will lecture halls disappear entirely, turned into office space or new laboratories? Will all the talks be virtual, rather than in-person? But that's a talk for another time.


Tuesday, October 25, 2016

Every now and then I see an article trumpeting "the oldest" this or that—sometimes human, sometimes a non-human vertebrate, sometimes a tree. The "oldest human" competition is one you never actually want to win. It seems something akin to "longest stretch on death row": an achievement, no doubt, but not one that you would feel comfortable owning.

Still, these stories have their own charms. Recently, some Danish biologists anointed the Greenland shark as "the oldest living vertebrate." Greenland sharks, as their name suggests, hug the coast of Greenland in the North Atlantic and Arctic Ocean. Why they do not, like Canadians in January, migrate to the Florida Keys is a mystery. Until quite recently, science knew little about them: they are quite long (up to six meters), they grow at a glacial pace (a few centimeters a year at most), and they swim quite slowly (about 1.22 km/h). Katie Ledecky would have no trouble swimming faster than your average Greenland shark.

In an attempt to get at the age of the Greenland shark, investigators carbon-14 dated the sharks' eye lenses. Why the lens? Because Greenland shark lenses do not change after birth. To their very great astonishment, they came up with an age of 392 +/- 120 years for the oldest shark in their sample. There may be Greenland sharks swimming in the Arctic that were born around the same time as William Shakespeare.
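For orientation only, the textbook decay arithmetic behind any radiocarbon date is sketched below; the published shark study involved calibration and modeling well beyond this single equation, so treat it as the principle rather than their procedure.

```latex
% Radiocarbon age from the fraction of carbon-14 remaining.
% t_{1/2} \approx 5{,}730 years is the carbon-14 half-life;
% N/N_0 is the measured fraction of the original carbon-14 still present.
t \;=\; \frac{t_{1/2}}{\ln 2}\,\ln\!\left(\frac{N_0}{N}\right)
```

A lens that still holds roughly 95 percent of its original carbon-14 works out to an age in the neighborhood of 400 years, which is about where the oldest shark landed.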

An interesting side note: Greenland sharks have high concentrations of trimethylamine oxide and urea in their flesh. The trimethylamine oxide (an adaptation to living in deep waters) is converted to trimethylamine when you eat a Greenland shark, and can make you quite drunk. Kæstur hákarl, an Icelandic delicacy made from the Greenland shark, smells like urine and has been described by Anthony Bourdain as "the single worst, most disgusting and terrible tasting thing" he had ever eaten. Also, Greenland sharks don't reach sexual maturity until they are 150 years old. Or maybe they just can't stand the smell of each other.

So, to summarize: if you are a Greenland shark you live in the darkest, coldest water on the planet, you smell bad, you move very slowly, and you cannot get a date until midway through your second century. Why would you want to live four centuries? Because you can, apparently.

The World's Oldest

The "oldest living" list for animals includes: tortoise (250 years), lizard (117 years), and koi (226 years). Vertebrates don't match up all that well against invertebrates, alas: a clam named Ming lived 507 years, and would have lived longer except that, as a delightful USA Today headline stated, "Scientists accidentally kill world's oldest animal."

Note to old folks: stay away from them scientists.

Trees do even better than animals in "the oldest living" competition. The winner for "oldest tree" is a Norway spruce named "Old Tjikko" that lives in the harsh terrain of Sweden's Dalarna province. It clocks in at 9,550 years, though it is something of a cheat for "world's oldest" in that it is a vegetatively cloned tree. The tree we see is relatively young (the trunk only lives for 600 years or so) and is a sprout off of the ancient root system. Old Tjikko is named after the dead dog of its geographer discoverer, not that it matters.

A Bosnian pine holds the European record for non-clonal trees at ~1,075 years old. The oldest non-clonal tree in the world is said to be a bristlecone pine from California's White Mountains named Methuselah that is around 5,000 years old. There was an even older one named Prometheus that (to quote a Mental Floss article on old trees) "was accidentally felled by a scientist who didn't realize the tree was as old as it was." I'm beginning to see a pattern here: per William Wordsworth, "We murder to dissect."

Forever Young?

What all of these "world's oldest" seem to have in common is that they live in the world's barren places: mountains, deserts, the bottom of the Arctic Ocean. Not much grows there, and what grows does so at a leisurely pace. A Nature paper from 1999, entitled "Ancient stunted trees on Cliffs," found that vertical cliffs around the world "support populations of widely spaced trees that are exceptionally old, deformed and slow growing." These trees have radial growth of less than 1 mm per year.

Stunting your growth, then, is a good thing if you want to live forever. That, at least, seems to be the lesson from natural history studies. But it is also true at the mammalian end of the spectrum. Starve mice of calories and they live significantly longer.

Caloric restriction is probably a good idea for humans as well, though it is unlikely to double one's survival, let alone allow a Methuselah-like existence. I've written, in previous blogs, about anti-senescence drugs (senolytics) and blood transfusions from the young as ways of warding off death. The latter derives from experiments in which a young mouse and a senescent mouse are sewn together, a process known as parabiosis, with resultant de-aging of the old guy. Now, it seems, such approaches are becoming popular among the wealthy.

Peter Thiel, a Silicon Valley tech investor best known for speaking at Donald Trump's Republican National Convention earlier this year, is an investor in Ambrosia. Ambrosia is a Monterey, Calif., company that has started a trial called "Young Donor Plasma Transfusion and Age-Related Biomarkers." As trials go, it is unusual: you pay the company $8,000 to participate. Anyone between the ages of 35 and 80 can enroll, and participation involves receiving plasma from donors between the ages of 16 and 25.

I looked up Ambrosia's trial (NCT02803554, in case you are interested) on ClinicalTrials.gov. The trial design is pretty simple: draw some blood, infuse plasma, draw some more blood a month later.

Lots of blood. The blood drawn is examined for biomarkers of aging, including—and this is only about a third of what is being tested—Interleukin-5, Interleukin-6, Interleukin-7, Interleukin-8, Leptin, Macrophage Inflammatory Protein-1 alpha, Macrophage Inflammatory Protein-1 beta, Macrophage-Derived Chemokine, Matrix Metalloproteinase-2, Matrix Metalloproteinase-3, Matrix Metalloproteinase-9, Monocyte Chemotactic Protein 1, Myeloperoxidase, Myoglobin, Neuron-Specific Enolase, Plasminogen Activator Inhibitor 1, Prostate-Specific Antigen, Free, Pulmonary and Activation-Regulated Chemokine, Serum Amyloid P-Component, Stem Cell Factor, T-Cell-Specific Protein RANTES, Thrombospondin-1, Thyroid-Stimulating Hormone, Thyroxine-Binding Globulin, Tissue Inhibitor of Metalloproteinases 1, Transthyretin, Tumor Necrosis Factor alpha, Tumor Necrosis Factor beta, Tumor necrosis factor receptor 2, Vascular Cell Adhesion Molecule-1, Endothelial Growth Factor, Vitamin D-Binding Protein, von Willebrand Factor.

You pretty much need the plasma infusion to survive the bloodletting. The clinical trialist in me says that the next time I drive to Monterey it will be to visit the aquarium. And would shooting up with a 20-year-old's plasma really be all that useful? What if the plasma you needed was from a 6-month-old? Ambrosia is looking to recruit 600 patients: we'll see how they do.

Ambrosia, you will remember, was the food of the gods in ancient Greece. Our modern gods appear terrified by the prospect that their billions will not suffice to render them similarly immortal, hence their interest in anti-aging technology. Meanwhile, another Silicon Valley company, Alkahest, is performing the PLASMA (PLasma for Alzheimer SymptoM Amelioration) Study with neurologists at Stanford. According to ClinicalTrials.gov, this one has more modest aims, being primarily a feasibility study with a few dementia metrics thrown in as secondary endpoints. Let's hope it works. I'm burning through cortical neurons at an alarming rate.

Anyways, modern science is now in hot pursuit of anti-aging strategies. Whether the pursuit of Greenland sharkdom or Ming the Clam or Methuselah the bristlecone pine is worthwhile I will leave to your philosophical imagination. I tend to side with Gilgamesh of Uruk on this:

What you seek you shall never find.

For when the Gods made man,

They kept immortality to themselves.

Fill your belly.

Day and night make merry.

Let Days be full of joy.

Love the child who holds your hand.

Let your wife delight in your embrace.

For these alone are the concerns of man.


Monday, September 12, 2016

I offer, for the edification of my readers, a few thoughts I've had about cancer research, garnered from 3-plus decades in the lab and the clinic. I cannot claim originality, as some of these are flat out steals, and I suspect the rest probably have been thought of by others. But I offer them in the hope that they may be of some interest.

1. Ideas are cheap. Work is hard.

I worked with Larry Einhorn for a good part of my adult life. Early on, whenever I would go somewhere to give a talk, one of my hosts would claim that he (my host) had originated the idea of using platinum-based therapy for testicular cancer, and that it was only by chance that Larry had gotten there first with his PVB regimen. Later on, I met an engineer who claimed to have come up with the idea for the personal computer before Jobs and Woz, only to be stymied by his corporate bosses. I would add that, whenever I have had a brilliant idea, I have only to open a scientific journal to find it in print. Ideas are cheap, and frequently "in the air" (at least in retrospect and in popular memory), but work is hard. PVB and Apple were the result of a dose of luck, and some smarts, but mostly lots of hard work done at the right time.

2. It's OK to sleep with a hypothesis, but you should never marry one.

This one was shared with me by an astute, if crude, laboratory researcher; it has been attributed to Sir Peter Medawar, the Nobel laureate immunologist. We sometimes think it heroic to battle for a scientific hypothesis that everyone else disagrees with. I call this the "They said Thomas Edison was crazy" phenomenon. But my experience with the few great researchers I have known is that they are thoroughly unsentimental about their hypotheses. They try out one after another, test them as rapidly as possible, and move on if they are wrong. And even when they are right, the great ones are serious pruners, clipping off the unneeded and the extraneous until the truth is revealed in all its glory. A wise variant of this maxim comes from Darwin's bulldog, Thomas Huxley, who called it "the slaying of a beautiful hypothesis by an ugly fact."

3. A good drug makes everyone look smart.

A decade ago, melanoma doctors were considered lower life forms by breast cancer doctors. Now breast cancer doctors have serious melanoma envy. Melanoma doctors have not gotten smarter, nor have breast cancer researchers voluntarily undergone frontal lobotomies. Checkpoint inhibitors and vemurafenib came along, serving as chemoattractants for the fresh talent flooding the field. New England Journal of Medicine papers started popping up every 2 weeks. Lots of great science, too, but a good drug makes everyone look smart.

4. No one is smarter than a phase III trial.

A few years ago, I was chair of the ECOG Breast Cancer Committee. We had just completed E2100, our phase III trial of anti-angiogenic therapy in metastatic breast cancer, which doubled progression-free survival. It being the height of the "wisdom of crowds" fad, I polled the committee, asking them to predict the results of our just-started adjuvant trial. Ninety-four percent of the attendees felt it would be a positive phase III trial. I was part of that crowd. With such an overwhelming vote, why even bother to conduct the study? Well, because…we were not as smart as a phase III trial. We seriously didn't have a clue.

The other common variant on this phenomenon occurs when the phase III trial is positive. Almost immediately, individuals start individualizing: altering doses and schedules, substituting drugs, changing duration of therapy. These alterations lead to increased toxicity, decreased efficacy, or both. Alternatively, we discount the results of the trial because they differ from our personal experience or biases. We think we are smarter than a phase III trial. Repeat after me: no one is smarter than a phase III trial.

5. Scientific papers are the first rough draft of scientific history.

With apologies to The Washington Post's Philip Graham, to whom the original version of this maxim, minus the "scientific," is usually attributed. A great research finding is rarely the end of the story. More often it is the middle, and sometimes only the beginning of a new trail of discovery, a jumping-off point. I have had numerous colleagues, most smarter than I am, who fell into "analysis paralysis," trying to create the perfect scientific paper, one for the ages. They usually get scooped by the guy writing "the first rough draft" of scientific history.

6. Quantity has a quality all of its own.

The scientific literature is mostly crud. This is true both in the lab and the clinic. Lack of reproducibility is endemic. One of the major reasons for this is the curse of small numbers: cell line experiments that use only one cell line (rather like treating a single patient and expecting to discover a universal truth), animal experiments with a few lonely mice, clinical trials with suspicious "descriptive" statistics. Small numbers studies are particularly prone to overcalling results, and we, lemming-like, follow them over the cliff in our ever-recurring credulity.

Last year, I heard a presentation where a new combination therapy was said to have an 89 percent response rate. Translated into English this meant that, in a preliminary, unaudited analysis, eight of nine patients had shown initial signs of response. The next time I heard the study presented (more patients, longer follow-up, external review of results) the response rate had dropped to 54 percent. Still respectable, but not the second coming of the Salk vaccine. Quantity has a quality all of its own.
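To make the curse of small numbers concrete, here is a back-of-the-envelope illustration of my own (not anything from that presentation): an exact Clopper-Pearson confidence interval around "8 responses in 9 patients," computed with SciPy.

```python
# The curse of small numbers, illustrated: the exact (Clopper-Pearson)
# 95% confidence interval around "8 responses in 9 patients."
from scipy.stats import beta


def clopper_pearson(successes: int, n: int, alpha: float = 0.05):
    """Exact two-sided binomial confidence interval for a proportion."""
    lower = beta.ppf(alpha / 2, successes, n - successes + 1) if successes > 0 else 0.0
    upper = beta.ppf(1 - alpha / 2, successes + 1, n - successes) if successes < n else 1.0
    return lower, upper


low, high = clopper_pearson(8, 9)
print(f"Observed response rate: {8 / 9:.0%}")             # 89%
print(f"95% confidence interval: {low:.0%} to {high:.0%}")  # roughly 52% to 100%
```

An interval stretching from the low fifties to essentially 100 percent comfortably contains the 54 percent that emerged with more patients and longer follow-up, which is exactly the point.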

7. Biology is destiny.

Having lived through over 3 decades of research, I have seen many promising therapies come and go. The ones that have stayed have almost always been biology-based. This was not obvious for the first half of my career, a period dominated by the pursuit of ever longer acronyms, ever-higher doses, and application of therapies based on "unmet medical need," a euphemism for "we can sell a lot of drugs for this bad cancer." Though "Toothpaste A + Toothpaste B" empiricism regularly rears its ugly head, the last 15 years have seen the long-overdue triumph of biology in cancer medicine. By the way, this isn't because we were stupid then and we are smart now. We have better tools, and hard-won knowledge. We're still quite capable of stupid.

8. Biology is messy.

If biology is destiny, that doesn't mean it is either easy or clean. Biology is gloriously messy. We regularly get confused. Like the drunk who uses the lamppost for support rather than illumination, we latch on to some biologic story, holding on for dear life, often long after the story is shown to be incomplete or even flat out wrong. Even with a "clean" drug like trastuzumab for HER2-positive breast cancer, we spent the better part of two decades arguing over exactly how it works. This doesn't bother me a bit. I love messy. We're cancer biologists, not physicists, after all. Not everything has a single, simple explanation. Remember Walt Whitman: "Do I contradict myself? Very well, then I contradict myself, I am large, I contain multitudes."

9. Knowledge isn't power.

I know, Francis Bacon is rolling over in his grave right now. But knowing isn't enough unless you know enough. We've known that KRAS is really important in pancreatic cancer for decades, and as far as I can tell it hasn't improved anyone's overall survival, because we haven't been able to shut KRAS down outside the laboratory. Knowledge, let me repeat, is not power: successful application of knowledge is power. Take BRAF in melanoma. Remember sorafenib? The drug that was going to change the face of metastatic melanoma? But vemurafenib isn't half bad (well, only 49% bad, since it doesn't work all that long, but who's counting?). Or PARP inhibition for breast and ovarian cancer: remember BiPar's iniparib, which wasn't really a PARP inhibitor? Bad tools won't test the hypothesis, but that doesn't mean the hypothesis is wrong. Another Bacon quote is preferable: "Nature to be commanded must be obeyed."

10. Nice drugs get beat up in bad neighborhoods.

I cannot tell you how many times, over the years, I went to a cooperative group meeting, or an advisory board, and heard a proposal to try a new drug in fifth-line therapy for this disease, or the same drug in that disease, where absolutely nothing ever works. In the former case, the usual explanation is that we have all of these (lousy, inadequate, toxic, borderline useless FDA-approved historical accidents of) drugs that we have to try first. In the latter case, "unmet medical need" is the excuse for hoping that a drug will magically transform a disease for no other reason than we want it to do so. Hope is not a strategy. Hard targets are hard for a reason. Nice drugs get beat up in bad neighborhoods.

I could go on, but 10 is a nice number, as Moses understood. And I've run out of fingers.


Friday, July 8, 2016

One of the more amazing results of the genomic revolution has been the window it opened on our past. It is only a few years since the discovery of Neanderthal genes in the modern Eurasian genome, as well as the mysterious Denisovan genes present in modern Melanesian islanders. For most of my life, Neanderthals were hulking cave men, long extinct, their very name a term of abuse. Now, every morning when I shave, a Neanderthal face stares back at me from the mirror. My lack of photogenicity finally has a rational explanation.

But it turns out there is a great deal more. The last year has seen an explosion in genomic studies illuminating our past, all quite fascinating, and certainly unexpected.

Let's start with an obvious Charles Darwin sort of question. Neanderthals and our main line ancestors split apart half a million years or so ago. They met again on several occasions, and those happy reunions resulted in our current 2 percent or so of Neanderthalism. The 2 percent, by the way, appears to be down from 6 percent back in the day, if we look at ancient human genomes.

Are our legacy Neanderthal genes randomly distributed throughout our genome, or have they been subject to selection? No question, and no surprise: selected, both for and against.

A History of Genes

Let's start with the Y chromosome, that puny little thing, that living rebuke to Freud's concept of penis envy. It has an important tale to tell, for like the maternally inherited mitochondrial genome, it provides an easy sex-specific snapshot, in this case of the male line.

The modern human genome has no Neanderthal Y chromosome genes. None whatsoever, as far as we can observe. They were eliminated. This could represent random loss over time, but a recent deep dive into the Neanderthal Y chromosome suggests a probable culprit for this ethnic cleansing. The Neanderthal Y chromosome contains three minor histocompatibility antigens capable of triggering an immune response known to contribute to miscarriages.

This work adds to earlier data suggesting so-called "genomic deserts" for Neanderthal genes on the X chromosome and for genes governing testicular function, painting a picture of what is known as "hybrid infertility." Humans and Neanderthals lived right on the edge when it came to prospects for interbreeding.

What did we get from our Neanderthal ancestors? A nasty bunch of disease predispositions, for one thing. If one looks at single nucleotide polymorphisms (SNPs) derived from them, and then crosses those SNPs with the electronic health records of 28,000 adults of European descent, one sees sizeable increased risks for depression, actinic keratosis, hypercoagulable states, and tobacco addiction. I have this weird vision of a moody, cigar-smoking, warfarin-ingesting, bearskin-clad hulk with a bad complexion hunting woolly mammoths on the Eurasian steppes.

By the way, the paper that showed these things is quite wonderful, linking the electronic health records of 28,000 adults across numerous medical institutions to large SNP databases for those individuals to the Neanderthal genome. It is CancerLINQ for Neanderthals. The ready transportability of health data and its cross-linking with genome datasets for research purposes suggests they were not using the dominant electronic health record system found in the San Francisco Bay area.

On the plus side, Neanderthal genes appear to provide us protection against parasites. Our modern genome gets several members of the Toll-Like Receptor family (TLR1, TLR6, and TLR10) from our Neanderthal ancestors, and these immune receptors are valuable activators of our adaptive immune responses. There's a growing literature on the role of Toll-Like Receptors in antitumor immunity, so perhaps one day soon we'll enroll Neanderthal cell surface receptors in the fight against cancer. Neanderthal genes also affect our skin: if someone says you are "thick-skinned," your Neanderthal traits are being praised.

Speaking of the Y chromosome, one recent study looked at the modern Y chromosome in 1,200 men from 26 modern populations. All living men descend from a man living 190,000 years ago, the male equivalent of "mitochondrial Eve." This is unsurprising. What was of interest was evidence for male population explosions on five continents at different times over the past 100,000 years, with rapid expansion of specific male lineages in just a few generations.

Half of all men of Western European origin, for instance, descend from just one guy living 4,000 years ago. The authors of the Nature Genetics paper tried to imagine why this might have occurred: introduction of a new technology related to the agricultural revolution, perhaps. I'm doubtful that my ancestor proto-George (from the Greek Γεωργιος (Georgios), which was derived from γεωργος (georgos), meaning "farmer, earthworker") was some peaceful, humble farm boy who invented the plow. Far more likely that he was some prolific thug with a gang, or whatever passed for a gang in 2000 BC in France or Germany. Genghis Khan before Genghis Khan, lost to history but remembered in our genes.

Tracking Viral DNA

Neanderthals aren't the only interlopers in the modern human genome. Let's look at our viral genome. One of the great discoveries associated with the Human Genome Project was the large amount of viral DNA incorporated into human DNA, fully 8 percent of the whole.

For quite some time, this book was shelved under "Junk DNA" in the genome library. But no more. Work published in Science shows that the viral DNA (from so-called "endogenous viruses") is not randomly inserted in the human genome. Rather, it clusters, and clusters around those parts of the genome associated with immune function. Here's where the story gets really interesting. Our genomes appear to have co-opted the viral invaders for the purpose of—you guessed it—fighting off other viruses.

When the authors of the paper used CRISPR/Cas9 technology to snip out endogenous viruses close to the AIM2 immune gene, it no longer responded effectively to the interferon "alarm signal" released in response to a viral infection. Immune function suffered. Endogenous viruses are the family dog who protects villagers from marauding wolves, barking loudly whenever their genetic cousins come near.

I find this thought pleasant, and somehow comforting: we've been assimilating unwanted immigrants into our chromosomal society far longer than humans have existed. Not only are endogenous viruses not "junk DNA", they are valuable, well-behaved, even life-saving first responders.

What really amazes me about all this is how much we can learn about our history while sitting in a laboratory. Our DNA is a time machine, traveling to places no historian can reach. Dig up some old bones from a cave in Spain or Siberia and wonderful mysteries are revealed. Those old bones still matter, eons after our ancestors shuffled off this mortal plane.

One last look at the Neanderthals before we go. Trying to get inside the heads of extinct ancestors is a dangerous enterprise, but enjoyable nevertheless. Recently, far inside Bruniquel Cave in southwestern France, investigators came across a fascinating human structure. The cave's occupants had broken off some 400 stalagmites and then arranged them into circles. The circles show signs of fire use, with soot-blackened, heat-fractured calcite and bone remnants. Uranium-thorium dating puts the circles at about 176,500 years old, long before Homo sapiens entered Europe. Neanderthal spelunkers created the structures.

What were they doing there? Did the stalagmite circles have some ritual significance, some religious meaning? Or did the Neanderthals, like modern humans gathered around a flat screen TV, just like to have a family room where they could kick back and enjoy themselves after a long day hunting bison and woolly mammoths? We'll never know, but it makes me think we might have understood each other.