Oncology Times

Musings of a Cancer Doctor
Wide-ranging views and perspective from George W. Sledge, Jr., MD
Tuesday, October 25, 2016

Every now and then I see an article trumpeting "the oldest" this or that—sometimes human, sometimes a non-human vertebrate, sometimes a tree. The "oldest human" competition is one you never actually want to win. It seems something akin to "longest stretch on death row": an achievement, no doubt, but not one that you would feel comfortable owning.

Still, these stories have their own charms. Recently, some Danish biologists anointed the Greenland shark as "the oldest living vertebrate." Greenland sharks, as their name suggests, hug the coast of Greenland in the North Atlantic and Arctic Ocean. Why they do not, like Canadians in January, migrate to the Florida Keys is a mystery. Until quite recently, science knew little about them: they are quite long (up to six meters), they grow at a glacial pace (a few centimeters at most per year), and they swim quite slowly (1.22 km/h). Katie Ledecky would have no trouble swimming faster than your average Greenland shark.

In an attempt to get at the age of the Greenland shark, investigators carbon-14 dated the sharks' eye lenses. Why the lens? Because the Greenland shark's lens nucleus does not change after birth. To their very great astonishment, they came up with an estimated age of 392 +/- 120 years for the oldest shark. There may be Greenland sharks swimming in the Arctic that were born around the same time as William Shakespeare.
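The arithmetic behind a radiocarbon date is just exponential decay. A back-of-the-envelope sketch in Python (a textbook illustration only; the investigators actually used calibrated radiocarbon methods on the lens nucleus, and the half-life below is simply the standard value for carbon-14):

```python
import math

C14_HALF_LIFE_YEARS = 5730  # standard half-life of carbon-14

def radiocarbon_age(fraction_remaining: float) -> float:
    """Years elapsed, given the fraction of the original C-14 still present.

    Inverts N(t) = N0 * (1/2) ** (t / half_life).
    """
    return -C14_HALF_LIFE_YEARS * math.log2(fraction_remaining)

# A lens retaining about 95 percent of its original carbon-14
# dates to roughly four centuries:
print(round(radiocarbon_age(0.95)), "years")
```

Four centuries corresponds to only a few percent of carbon-14 decay, which is one reason the error bars on the shark estimate are so wide.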

An interesting side note: Greenland sharks have high concentrations of trimethylamine oxide and urea in their flesh. The trimethylamine oxide (an adaptation to living in deep waters) is converted to trimethylamine when you eat a Greenland shark, and can make you quite drunk. Kæstur hákarl, an Icelandic delicacy made from the Greenland shark, smells like urine and has been described by Anthony Bourdain as "the single worst, most disgusting and terrible tasting thing" he had ever eaten. Also, Greenland sharks don't reach sexual maturity until they are 150 years old. Or maybe they just can't stand the smell of each other.

So, to summarize: if you are a Greenland shark you live in the darkest, coldest water on the planet, you smell bad, you move very slowly, and you cannot get a date until midway through your second century. Why would you want to live four centuries? Because you can, apparently.

The World's Oldest

The "oldest living" list for animals includes: tortoise (250 years), lizard (117 years), and koi (226 years). Vertebrates don't match up all that well against invertebrates, alas: a clam named Ming lived 507 years, and would have lived longer except that, as a delightful USA Today headline stated, "Scientists accidentally kill world's oldest animal."

Note to old folks: stay away from them scientists.

Trees do even better than animals in "the oldest living" competition. The winner for "oldest tree" is a Norway spruce named "Old Tjikko" that lives in the harsh terrain of Sweden's Dalarna province. It clocks in at 9,550 years, though it is something of a cheat for "world's oldest" in that it is a vegetatively cloned tree. The tree we see is relatively young (the tree trunk only lives for 600 years or so) and is a sprout off of the ancient root system. Old Tjikko is named after the dead dog of its geographer discoverer, not that it matters.

A Bosnian pine holds the European record for non-clonal trees at ~1,075 years old. The oldest non-clonal tree in the world is said to be a bristlecone pine tree from California's White Mountains named Methuselah that is around 5,000 years old. There was an even older one named Prometheus that (to quote a Mental Floss article on old trees) "was accidentally felled by a scientist who didn't realize the tree was as old as it was." I'm beginning to see a pattern here: per William Blake, "We murder to dissect."

Forever Young?

What all of these "world's oldest" seem to have in common is that they live in the world's barren places: mountains, deserts, the bottom of the Arctic Ocean. Not much grows there, and what grows does so at a leisurely pace. A Nature paper from 1999, entitled "Ancient stunted trees on cliffs," found that vertical cliffs around the world "support populations of widely spaced trees that are exceptionally old, deformed and slow growing." These trees have radial growth of less than 1 mm per year.

Stunting your growth, then, is a good thing if you want to live forever. That, at least, seems to be the lesson from natural history studies. But it is also true at the mammalian end of the spectrum. Starve mice of calories and they live significantly longer.

Caloric restriction is probably a good idea for humans as well, though it is unlikely to double one's survival, let alone allow a Methuselah-like existence. I've written, in previous blogs, about anti-senescence drugs (senolytics) and blood transfusions from the young as ways of warding off death. The latter derives from experiments in which a young mouse and a senescent mouse are sewn together, a process known as parabiosis, with resultant de-aging of the old guy. Now, it seems, such approaches are becoming popular among the wealthy.

Peter Thiel, a Silicon Valley tech investor best known for speaking at Donald Trump's Republican National Convention earlier this year, is an investor in Ambrosia, a Monterey, Calif., company that has started a trial called "Young Donor Plasma Transfusion and Age-Related Biomarkers." As trials go, it is unusual: you pay the company $8,000 to participate. Participants between the ages of 35 and 80 receive plasma from donors ages 16 to 25.

I looked up Ambrosia's trial (NCT02803554, in case you are interested) on ClinicalTrials.gov. The trial design is pretty simple: draw some blood, infuse plasma, draw some more blood a month later.

Lots of blood. The blood drawn is examined for biomarkers of aging, including—and this is only about a third of what is being tested—Interleukin-5, Interleukin-6, Interleukin-7, Interleukin-8, Leptin, Macrophage Inflammatory Protein-1 alpha, Macrophage Inflammatory Protein-1 beta, Macrophage-Derived Chemokine, Matrix Metalloproteinase-2, Matrix Metalloproteinase-3, Matrix Metalloproteinase-9, Monocyte Chemotactic Protein 1, Myeloperoxidase, Myoglobin, Neuron-Specific Enolase, Plasminogen Activator Inhibitor 1, Prostate-Specific Antigen, Free, Pulmonary and Activation-Regulated Chemokine, Serum Amyloid P-Component, Stem Cell Factor, T-Cell-Specific Protein RANTES, Thrombospondin-1, Thyroid-Stimulating Hormone, Thyroxine-Binding Globulin, Tissue Inhibitor of Metalloproteinases 1, Transthyretin, Tumor Necrosis Factor alpha, Tumor Necrosis Factor beta, Tumor necrosis factor receptor 2, Vascular Cell Adhesion Molecule-1, Endothelial Growth Factor, Vitamin D-Binding Protein, von Willebrand Factor.

You pretty much need the plasma infusion to survive the bloodletting. The clinical trialist in me says that the next time I drive to Monterey it will be to visit the aquarium. And would shooting up with a 20-year-old's plasma really be all that useful? What if the plasma you needed was from a 6-month-old? Ambrosia is looking to recruit 600 patients; we'll see how they do.

Ambrosia, you will remember, was the nectar of the gods in ancient Greece. Our modern gods appear terrified by the prospect that their billions will not suffice to render them similarly immortal, hence their interest in anti-aging technology. Meanwhile, another Silicon Valley company, Alkahest, is performing the PLASMA (PLasma for Alzheimer SymptoM Amelioration) Study with neurologists at Stanford. According to ClinicalTrials.gov, this one has more modest aims, being primarily a feasibility study with a few dementia metrics thrown in as secondary endpoints. Let's hope it works. I'm burning through cortical neurons at an alarming rate.

Anyway, modern science is now in hot pursuit of anti-aging strategies. Whether the pursuit of Greenland sharkdom or Ming the Clam or Methuselah the bristlecone pine is worthwhile I will leave to your philosophical imagination. I tend to side with Gilgamesh of Uruk on this:

What you seek you shall never find.

For when the Gods made man,

They kept immortality to themselves.

Fill your belly.

Day and night make merry.

Let Days be full of joy.

Love the child who holds your hand.

Let your wife delight in your embrace.

For these alone are the concerns of man.

Monday, September 12, 2016

I offer, for the edification of my readers, a few thoughts I've had about cancer research, garnered from 3-plus decades in the lab and the clinic. I cannot claim originality, as some of these are flat out steals, and I suspect the rest probably have been thought of by others. But I offer them in the hope that they may be of some interest.

1. Ideas are cheap. Work is hard.

I worked with Larry Einhorn for a good part of my adult life. Early on, whenever I would go somewhere to give a talk, one of my hosts would claim that he (my host) had originated the idea of using platinum-based therapy for testicular cancer, and it was only by chance that Larry had gotten there first with his PVB regimen. Later on, I met an engineer who claimed to have come up with the idea for the personal computer before Jobs and Woz, only to be stymied by his corporate bosses. I would add that, whenever I have had a brilliant idea, I have only to open a scientific journal to find it in print. Ideas are cheap, and frequently "in the air"—at least in retrospect and in popular memory—but work is hard. PVB and Apple were the result of a dose of luck, and some smarts, but mostly lots of hard work done at the right time.

2. It's OK to sleep with a hypothesis, but you should never marry one.

This one was shared with me by an astute, if crude, laboratory researcher; it has been attributed to Sir Peter Medawar, the Nobel laureate immunologist. We sometimes think it heroic to battle for a scientific hypothesis that everyone else disagrees with. I call this the "They said Thomas Edison was crazy" phenomenon. But my experience with the few great researchers I have known is that they are thoroughly unsentimental about their hypotheses. They try out one after another, test them as rapidly as possible, and move on if they are wrong. And even when they are right, the great ones are serious pruners, clipping off the unneeded and the extraneous until the truth is revealed in all its glory. A wise variant of this maxim comes from Darwin's bulldog, Thomas Huxley: "Another beautiful hypothesis murdered by a nasty, brutish fact."

3. A good drug makes everyone look smart.

A decade ago, melanoma doctors were considered lower life forms by breast cancer doctors. Now breast cancer doctors have serious melanoma envy. Melanoma doctors have not gotten smarter, nor have breast cancer researchers voluntarily undergone frontal lobotomies. Checkpoint inhibitors and vemurafenib came along, serving as chemoattractants for the fresh talent flooding the field. New England Journal of Medicine papers started popping up every 2 weeks. Lots of great science, too, but a good drug makes everyone look smart.

4. No one is smarter than a phase III trial.

A few years ago, I was chair of the ECOG Breast Cancer Committee. We had just completed E2103, our phase III metastatic breast cancer trial of anti-angiogenic therapy, with a doubling of progression-free survival in the metastatic setting. It being the height of the "wisdom of crowds" fad, I polled the committee, asking them to predict the results of our just-started adjuvant trial. Ninety-four percent of the attendees felt it would be a positive phase III trial. I was part of that crowd. With such an overwhelming vote, why even bother to conduct the study? Well, because…we were not as smart as a phase III trial. We seriously didn't have a clue.

The other common variant on this phenomenon occurs when the phase III trial is positive. Almost immediately, individuals start individualizing: altering doses and schedules, substituting drugs, changing duration of therapy. These alterations lead to either increased toxicity or decreased efficacy or both. Alternatively, we discount the results of the trial because they differ from our personal experience or biases. We think we are smarter than a phase III trial. Repeat after me: no one is smarter than a phase III trial.

5. Scientific papers are the first rough draft of scientific history.

With apologies to The Washington Post's Ben Bradlee, of Watergate fame, to whom the original version of this maxim is attributed, minus the "scientific." A great research finding is rarely the end of the story.

More often it is the middle, and sometimes only the beginning of a new trail of discovery, a jumping-off point. I have had numerous colleagues, most smarter than I am, who fell into "analysis paralysis," trying to create the perfect scientific paper, one for the ages. They usually get scooped by the guy writing "the first rough draft" of scientific history.

6. Quantity has a quality all of its own.

The scientific literature is mostly crud. This is true both in the lab and the clinic. Lack of reproducibility is endemic. One of the major reasons for this is the curse of small numbers: cell line experiments that use only one cell line (rather like treating a single patient and expecting to discover a universal truth), animal experiments with a few lonely mice, clinical trials with suspicious "descriptive" statistics. Small numbers studies are particularly prone to overcalling results, and we, lemming-like, follow them over the cliff in our ever-recurring credulity.

Last year, I heard a presentation where a new combination therapy was said to have an 89 percent response rate. Translated into English this meant that, in a preliminary, unaudited analysis, eight of nine patients had shown initial signs of response. The next time I heard the study presented (more patients, longer follow-up, external review of results) the response rate had dropped to 54 percent. Still respectable, but not the second coming of the Salk vaccine. Quantity has a quality all of its own.
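The jump from 89 percent to 54 percent is exactly what a confidence interval would have warned about. A minimal sketch using the Wilson score interval (my own illustration; the trial in question reported no such numbers):

```python
import math

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z ** 2 / n
    center = (p + z ** 2 / (2 * n)) / denom
    half_width = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return center - half_width, center + half_width

# 8 of 9 responders: the point estimate is 89 percent, but the interval
# stretches from the mid-50s to the high 90s.
low, high = wilson_interval(8, 9)
print(f"{low:.2f} to {high:.2f}")
```

An interval that wide comfortably contains the 54 percent seen on longer follow-up; with 100 patients instead of 9, the same calculation tightens considerably.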

7. Biology is destiny.

Having lived through over 3 decades of research, I have seen many promising therapies come and go. The ones that have stayed have almost always been biology-based. This was not obvious for the first half of my career, a period dominated by the pursuit of ever longer acronyms, ever-higher doses, and application of therapies based on "unmet medical need," a euphemism for "we can sell a lot of drugs for this bad cancer." Though "Toothpaste A + Toothpaste B" empiricism regularly rears its ugly head, the last 15 years have seen the long-overdue triumph of biology in cancer medicine. By the way, this isn't because we were stupid then and we are smart now. We have better tools, and hard-won knowledge. We're still quite capable of stupid.

8. Biology is messy.

If biology is destiny, that doesn't mean it is either easy or clean. Biology is gloriously messy. We regularly get confused. Like the drunk who uses the lamppost for support rather than illumination, we latch on to some biologic story, holding on for dear life, often long after the story is shown to be incomplete or even flat out wrong. Even with a "clean" drug like trastuzumab for HER2-positive breast cancer, we spent the better part of 2 decades arguing over exactly how it works. This doesn't bother me a bit. I love messy. We're cancer biologists, not physicists, after all. Not everything has a single, simple explanation. Remember Walt Whitman: "Do I contradict myself? Very well, then I contradict myself, I am large, I contain multitudes."

9. Knowledge isn't power.

I know, Francis Bacon is rolling over in his grave right now. But knowing isn't enough unless you know enough. We've known that KRAS is really important in pancreatic cancer for decades, and as far as I can tell it hasn't improved anyone's overall survival, because we haven't been able to shut KRAS down outside the laboratory. Knowledge, let me repeat, is not power: successful application of knowledge is power. Take BRAF in melanoma. Remember sorafenib? The drug that was going to change the face of metastatic melanoma? But vemurafenib isn't half bad (well, only 49% bad, since it doesn't work all that long, but who's counting?). Or PARP inhibition for breast and ovarian cancer: remember BiPar's iniparib, which wasn't really a PARP inhibitor? Bad tools won't test the hypothesis, but that doesn't mean the hypothesis is wrong. Another Bacon quote is preferable: "Nature to be commanded must be obeyed."

10. Nice drugs get beat up in bad neighborhoods.

I cannot tell you how many times, over the years, I went to a cooperative group meeting, or an advisory board, and heard a proposal to try a new drug in fifth-line therapy for this disease, or the same drug in that disease, where absolutely nothing ever works. In the former case, the usual explanation is that we have all of these (lousy, inadequate, toxic, borderline useless FDA-approved historical accidents of) drugs that we have to try first. In the latter case, "unmet medical need" is the excuse for hoping that a drug will magically transform a disease for no other reason than we want it to do so. Hope is not a strategy. Hard targets are hard for a reason. Nice drugs get beat up in bad neighborhoods.

I could go on, but 10 is a nice number, as Moses understood. And I've run out of fingers.

Friday, July 8, 2016

One of the more amazing results of the genomic revolution has been the window it opened on our past. It is only a few years since the discovery of Neanderthal genes in the modern Eurasian genome, as well as the mysterious Denisovan genes present in modern Melanesian islanders. For most of my life, Neanderthals were hulking cave men, long extinct, their very name a term of abuse. Now, every morning when I shave, a Neanderthal face stares back at me from the mirror. My lack of photogenicity finally has a rational explanation.

But it turns out there is a great deal more. The last year has seen an explosion in genomic studies illuminating our past, all quite fascinating, and certainly unexpected.

Let's start with an obvious Charles Darwin sort of question. Neanderthals and our main line ancestors split apart half a million years or so ago. They met again on several occasions, and those happy reunions resulted in our current 2 percent or so of Neanderthalism. The 2 percent, by the way, appears to be down from 6 percent back in the day, if we look at ancient human genomes.

Are our legacy Neanderthal genes randomly distributed throughout our genome, or have they been selected for? No question, and no surprise: selected for.

A History of Genes

Let's start with the Y chromosome, that puny little thing, that living rebuke to Freud's concept of penis envy. It has an important tale to tell, for like the mitochondrial genome in women, it provides an easy gender-specific snapshot for men.

The modern human genome has no Neanderthal Y chromosome genes. None whatsoever, as far as we can observe. They were eliminated. This could represent random loss over time, but a recent deep dive into the Neanderthal Y chromosome suggests a probable culprit for this ethnic cleansing. The Neanderthal Y chromosome contains three minor histocompatibility antigens capable of triggering an immune response known to contribute to miscarriages.

This work adds to earlier data suggesting so-called "genomic deserts" for Neanderthal genes on the X chromosome and for genes governing testicular function, painting a picture of what is known as "hybrid infertility." Humans and Neanderthals lived right on the edge when it came to prospects for interbreeding.

What did we get from our Neanderthal ancestors? A nasty bunch of disease predispositions, for one thing. If one looks at single nucleotide polymorphisms (SNPs) derived from them, and then crosses those SNPs with the electronic health records of 28,000 adults of European descent, one sees sizeable increased risks for depression, actinic keratosis, hypercoagulable states, and tobacco addiction. I have this weird vision of a moody, cigar-smoking, warfarin-ingesting, bearskin-clad hulk with a bad complexion hunting wooly mammoths on the Eurasian steppes.

By the way, the paper that showed these things is quite wonderful, linking the electronic health records of 28,000 adults across numerous medical institutions to large SNP databases for those individuals to the Neanderthal genome. It is CancerLINQ for Neanderthals. The ready transportability of health data and its cross-linking with genome datasets for research purposes suggests they were not using the dominant electronic health record system found in the San Francisco Bay area.

On the plus side, Neanderthal genes appear to provide us protection against parasites. Our modern genome gets several members of the Toll-Like Receptor family (TLR1, TLR6 and TLR9) from our Neanderthal ancestors, and these immune receptors are valuable activators of our adaptive immune responses. There's a growing literature on Toll-Like Receptors' roles in antitumor immunity, so perhaps one day soon we'll enroll Neanderthal cell surface receptors in the fight against cancer. Neanderthal genes also affect our skin: if someone says you are "thick-skinned," your Neanderthal traits are being praised.

Speaking of the Y chromosome, one recent study looked at the modern Y chromosome in 1,200 men from 26 modern populations. All living men descend from a man living 190,000 years ago, the male equivalent of "mitochondrial Eve." This is unsurprising. What was of interest was evidence for male population explosions on five continents at different times over the past 100,000 years, with rapid expansion of specific male lineages in just a few generations.

Half of all men of Western European origin, for instance, descend from just one guy living 4,000 years ago. The authors of the Nature Genetics paper tried to imagine why this might have occurred: introduction of a new technology related to the agricultural revolution, perhaps. I'm doubtful that my ancestor proto-George (from the Greek Γεωργιος (Georgios) which was derived from γεωργος (georgos) meaning "farmer, earthworker") was some peaceful, humble farm boy who invented the plow. Far more likely that he was some prolific thug with a gang, or whatever passed for a gang in 2,000 BC in France or Germany. Genghis Khan before Genghis Khan, lost to history but remembered in our genes. ​

Tracking Viral DNA

Neanderthals aren't the only interlopers in the modern human genome. Let's look at our viral genome. One of the great discoveries associated with the Human Genome Project was the large amount of viral DNA incorporated into human DNA, fully 8 percent of the whole.

For quite some time, this book was shelved under "Junk DNA" in the genome library. But no more. Work published in Science shows that the viral DNA (from so-called "endogenous viruses") is not randomly inserted in the human genome. Rather, it clusters, and clusters around those parts of the genome associated with immune function. Here's where the story gets really interesting. Our genomes appear to have co-opted the viral invaders for the purpose of—you guessed it—fighting off other viruses.

When the authors of the paper used CRISPR/Cas9 technology to snip out endogenous viruses close to the AIM2 immune gene, it no longer responded effectively to the interferon "alarm signal" released in response to a viral infection. Immune function suffered. Endogenous viruses are the family dog who protects villagers from marauding wolves, barking loudly whenever their genetic cousins come near.

I find this thought pleasant, and somehow comforting: we've been assimilating unwanted immigrants into our chromosomal society far longer than humans have existed. Not only are endogenous viruses not "junk DNA", they are valuable, well-behaved, even life-saving first responders.

What really amazes about all this is how much we learn about our history sitting in a laboratory. Our DNA is a time machine, traveling to places no historian can reach. Dig up some old bones from a cave in Spain or Siberia and wonderful mysteries are revealed. Those old bones still matter, eons after our ancestors shuffled off this mortal plane.

One last look at the Neanderthals before we go. Trying to get inside the heads of extinct ancestors is a dangerous enterprise, but enjoyable nevertheless. Recently, far inside Bruniquel Cave in southwestern France, investigators came across a fascinating human structure. The cave's occupants had broken off some 400 stalagmites and then rearranged them in a circle. These circles show signs of fire use, with soot-blackened, heat-fractured calcite and bone remnants. Using uranium-thorium dating techniques, the circles date out to 176,500 years, long before Homo sapiens entered Europe. Neanderthal spelunkers created the structures.

What were they doing there? Did the stalagmite circles have some ritual significance, some religious meaning? Or did the Neanderthals, like modern humans gathered around a flat screen TV, just like to have a family room where they could kick back and enjoy themselves after a long day hunting bison and woolly mammoths? We'll never know, but it makes me think we might have understood each other.

Friday, May 27, 2016

One of the delights of science is how captivating the results can be. About 1.3 billion years ago, and incredibly far away, two black holes spiraled together, collided at half the speed of light, coalesced into a single black hole, and in that joining three suns worth of mass were turned into energy in the form of gravitational waves. In September of last year the sound of that collision (literally a sound: you can hear it at https://www.youtube.com/watch?v=TWqhUANNFXw) was recorded by the Advanced Laser Interferometer Gravitational-Wave Observatory's (LIGO) parallel scientific instruments in Washington and Louisiana. The waves reached Louisiana 7 milliseconds before Washington, suggesting a location in the Southern hemisphere.

The result, predicted 100 years ago by Albert Einstein, simultaneously proves the existence of black holes and launches gravitational astronomy as a new field of scientific endeavor.

How wonderful.

Gravity is both incredibly weak and impressively strong. It is the weakest of the four fundamental physical forces in nature, some 38 orders of magnitude weaker than the strong force, 36 orders of magnitude weaker than the electromagnetic force, and 29 orders of magnitude weaker than the weak force. But at the macroscopic scale it dominates, affecting the trajectory of heavenly bodies and of humans walking around here on earth. Try jumping into the sky and see how far you get.

What we know about gravity results largely from the scientific work of the two smartest humans who ever lived, Isaac Newton and Albert Einstein. Newton discovered the inverse-square law of universal gravitation, summarized in this equation:

F = G(m₁m₂)/r²
Put another way, Newton's law of universal gravitation states (and here I quote Wikipedia) that any two bodies in the universe attract each other with a force that is directly proportional to the product of their masses and inversely proportional to the square of the distance between them.

Before Newton, gravity meant something different: something serious. But the term derives from the Latin "gravitas," which means weight, or heaviness, and this was the meaning Newton called upon. Newton's law does quite well for predicting most of what we experience, and its application to our solar system led to the discovery of Neptune in the 19th century.
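In code, the inverse-square law is a one-liner. A toy sketch (the constant and the masses below are rounded textbook values, not anything from the article):

```python
G = 6.674e-11  # gravitational constant, N·m²/kg²

def gravitational_force(m1_kg: float, m2_kg: float, r_m: float) -> float:
    """Newton's law of universal gravitation: F = G * m1 * m2 / r**2, in newtons."""
    return G * m1_kg * m2_kg / r_m ** 2

# A 70 kg person standing on the Earth (mass 5.97e24 kg, radius 6.371e6 m):
# the force is about 687 N, i.e., their everyday weight.
print(round(gravitational_force(5.97e24, 70.0, 6.371e6)), "N")
```

Halve the distance and the force quadruples; that inverse square is what let astronomers back Neptune out of the wobbles in Uranus's orbit.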

This worked well enough for two centuries, until astronomers found it could not explain the orbit of Mercury. This discrepancy between theory and fact led Einstein to think about gravity, and to his field equations, principal among them:

Gμν + Λgμν = (8πG/c⁴)Tμν
I do not remember any of this from my sole semester of college physics in the early 1970s, but Wikipedia assures me these are true equations. I do remember the thought experiments that represented the beginning of the two great theories attached to these great discoveries: Newton's apple, Einstein's elevator.

One important consequence predicted by Einstein's equations is the existence of gravity waves, caused by distortions in the space-time continuum around large masses. Einstein himself was uncertain that such waves would ever be detected, or even whether they existed. But now, thanks to LIGO, we are certain they are real.

Gravity & Health

I don't usually think about gravity when I see my patients in clinic, though like many things I don't think about, gravity is always there in the background. It weighs us down.

Recently, astronaut Scott Kelly came back from a year spent at the International Space Station some two inches taller than when he had left. His spinal disks, not as weighed down, expanded, though they reverted as gravity reasserted itself. Kelly is one of a pair of twins, allowing NASA scientists a mini-randomized controlled trial of low-G effects on human physiology.

Kelly might end up shorter than when he left Earth, and certainly weaker. Astronauts lose muscle mass and bone density. Spend 6 months off-planet and you will lose 10 percent of your bone mass. Even the heart shrinks a bit while in orbit, leaving the returning astronaut faint with exertion. We evolved living at the bottom of a gravity well. Sit on top of the well and molecular adjustments occur. NASA worries about these things, as it would like to send astronauts to Mars some day.

Low gravity's molecular effects aren't perfectly well understood. That old biological workhorse (workworm?) Caenorhabditis elegans has been compared on ground and in space. One study saw down-regulation of numerous genes, whose cumulative effects prolong C. elegans lifespan. Another saw decreased accumulation of toxic protein aggregates in ageing worms' muscles. Space was made for worms searching for immortality.

There is a fascinating plastic surgery literature on gravity. Here's a fun little experiment reported by two groups (Aesthet Surg J 2014;34:809-822, and Int J Cosmet Sci 2015;37:291-7). Take patients who show signs of midface aging and have them either stand up or lie down. Take pictures while smiling and in repose. Compare "brow position, tear trough length and depth, steatoblepharon, cheek volume, malar bags/festoons, and nasolabial folds." (Steatoblepharon, by the way, is "the prolapse of fat from the orbit of the eye below the eyelid." A festoon is a chain or garland of flowers, hung in a curve as a decoration, unless you are a plastic surgeon, when it refers to significantly damaged skin in the lower eyelids.)

Then, because perception is as important as measurement if you are a plastic surgeon, have someone estimate their age by looking at the pictures. What happens?

The supine patients measure younger, and look younger. Indeed, if you are over the age of 55, and you want to look younger, never get out of bed. The simple act of lying down makes you look six years younger, on average. Gravity redistributes interstitial fluids, subtly contorting our faces. Remember that the next time you visit a funeral home and hear the magical words "doesn't he look natural?" No, he doesn't. It's just gravity. We sag when we stand. I'd rather sag, thank you very much, as long as I can keep standing.

This redistribution of interstitial fluids occurs in space as well. Fluids no longer pool in the legs, but in the upper body. The body's blood volume sensors are in the upper body, so red blood cell production declines. Anemia is yet another reason to faint when you get back to Earth.

Cancer in Space

Gravity is a boon to plastic surgeons. What about cancer?

When I treat a postmenopausal breast cancer patient with aromatase inhibitor therapy, I routinely discuss the bone loss that results from estrogen deprivation. How does one prevent bone loss? I tell my patients bone is laid down along lines of stress, and that weight-bearing exercise is among the best things they can do to prevent osteoporotic fractures. Though I never mention gravity, I silently invoke its inexorable force.

NASA has spent a small fortune performing experiments with cancer cells in space (examined in Nature Reviews Cancer 2013;13:315-27). Under microgravity conditions, cancer cells form spheroids, sometimes quite large: in one case, golf ball-size organoids of prostate cancer cells that far outpaced their earthbound cousins. Gliomas, in contrast, roll over and die in space, for reasons no one understands.

Virtually all cancer cell types undergo major structural changes in microgravity. Tubulin is assembled into microtubules by gravity-dependent reaction-diffusion processes: get rid of gravity, and the microtubules change their orientation. Would Taxol work differently in space?

So far, I've seen no rush to set up an orbital outpatient oncology clinic. Astronauts do not appear to have a higher rate of cancer incidence or mortality than the rest of us, though the numbers are small and the comparisons highly confounded from a statistical standpoint. But there is, apparently, a thriving trade in pharma-driven space experiments, with a start-up company called Space Tango contracting with around 50 customers this year to perform contract microgravity experiments in tissue-box-size automated labs.

Meanwhile, back on Earth, I am listening for the next echo of two black holes colliding in some distant galaxy. Can you hear it? Let's listen together.

Monday, January 11, 2016

Recently I learned of the death of Frances Kelsey. I was somewhat surprised, not that she had died, but in the “I didn’t know she had lived that long” sort of way one has sometimes in reading the obits. Kelsey had become a national figure when I was in my early teens, at a time when John Kennedy was president, and I had not given her a second’s thought in decades.


Kelsey’s story is inextricably linked to that of thalidomide. It is an old story, and its outlines are familiar to most, but let me share it with you in case you missed it or misplaced the memory. In the 1950s thalidomide was developed as a sleeping pill and was used as such in Europe. It was particularly valued for its benefits in pregnant women. Rights to thalidomide had been sold in the United States to the William S. Merrell Company.

Glass Ceiling

Kelsey was, at the time, a brand-new staffer at the Food and Drug Administration. Prior to this she had performed research at the University of Chicago, where she had earned both a PhD and an MD degree. When she applied to the PhD program at the University of Chicago, the acceptance letter was addressed to “Mr. Oldham” (she had been born Frances Oldham). “When a woman took a job in those days, she was made to feel as if she was depriving a man of the ability to support his wife and child,” she later told an interviewer. Fortunately, her McGill University thesis professor said, “Don’t be stupid. Accept the job, sign your name and put ‘Miss’ in brackets afterward.” She did.


The First Task

Arriving in Chicago in 1936, she was set to work identifying the toxic agent in a batch of sulfanilamide. Initially marketed in tablet form, the antimicrobial had been converted to a liquid elixir, dissolved in diethylene glycol, a sweet-tasting industrial solvent used in antifreeze. The new formulation promptly killed 107 patients, many of them children. The modern FDA is a stickler regarding drug formulation, and this dates to the Elixir Sulfanilamide tragedy. In 1938 Congress passed the Food, Drug and Cosmetic Act in response.


As a new FDA staffer, Kelsey was tasked with evaluating thalidomide. She would later say that she had been assigned the evaluation not because it was considered a difficult case, but because it was a simple one—a drug already on the market, with thousands of patient-years of experience…a slam-dunk approval, a no-brainer.


Except that Kelsey was meticulous, and had a brain. In poring through the documentation provided by the company, she was struck by the paucity of information Merrell had supplied. The clinical data involved little more than doctor testimonials, which Kelsey later characterized as “too glowing for the support in the way of clinical back-up.”


While the evaluation was ongoing, she noticed a publication in the British Medical Journal suggesting that thalidomide was associated with peripheral neuropathy. Merrell’s packet to the FDA had failed to mention peripheral neuropathy, immediately alerting Kelsey that something might be wrong. In those days, astonishingly, drugs were automatically approved by a certain date unless the FDA specifically disapproved them. Such approval by default seems amazing today, but those were different times. Then, as now, no one at the FDA wanted to disapprove a drug without sufficient cause.


Standing Firm

The out was that the FDA could put an approval on hold if it requested additional information. Kelsey invoked this rule. The Merrell representative, after much grumbling, admitted that yes, there had been some reports of peripheral neuropathy, but the company felt the drug was not to blame, that it was probably related to poor nutrition, and it wasn’t all that big a deal in any event.


Kelsey still demurred and demanded real data. She and her colleagues at the FDA were particularly interested in the drug’s effects on the fetus, given that a pregnant woman might take it for months. Merrell reported that it knew of no problems with the drug in pregnancy, but had conducted no study. The company was anxious to get the drug on the market, and hounded her for a quick approval. It also pressured her superiors and threatened to go to the Commissioner of the FDA. Kelsey stood firm.


But peripheral neuropathy was the least of thalidomide’s problems. While Kelsey awaited more data, reports began to emerge from Europe tying the agent to birth defects in the children of women who had taken it during pregnancy. The famous Hopkins cardiologist Dr. Helen Taussig (the heroes in this story are women) heard of the issue and traveled to Europe to investigate. Babies there were being born, in the thousands, with flipper-like arms and legs, a condition known as phocomelia. The drug also significantly increased the rate of miscarriage. When Taussig came home she met with Kelsey, and later testified about thalidomide before a House committee considering strengthening drug approval laws.


In 1962 then-president John F. Kennedy awarded Kelsey the President's Award for Distinguished Federal Civilian Service. At the award ceremony he said, “Her exceptional judgment in evaluating a new drug for safety for human use has prevented a major tragedy of birth deformities in the United States.” Kelsey, ever modest, consistently maintained that she received the award on behalf of a team. She did not think she had done anything exceptional.


More to the Story

Kelsey’s story did not end there, though that is all most people have ever heard of her. The outcry over thalidomide, once the story broke, led to passage of the Kefauver-Harris Amendment in 1962 to strengthen drug regulations. The bill had languished in Congress for six years prior to the thalidomide revelations.


To get some sense of how different a time 1962 was from today, bear in mind that at the Senate hearing for the amendment, Senator Jacob Javits could ask the question, “Do people know they are getting investigational drugs?” The answer was “no.” Many of the U.S. participants in thalidomide trials were unaware they were receiving a drug that had not been approved by the FDA. Thalidomide changed that. The requirement for informed consent dates to Frances Kelsey. She became the first chief of the Investigational Drug Branch and created the modern process for new drug testing.


Fifteen years ago I sat on the FDA’s Oncology Drugs Advisory Committee, or ODAC. There I developed a deep and abiding respect for FDA staffers as dedicated public servants with an essentially thankless job. If the FDA approves a drug and later has to pull it from the market for reasons of some unexpected and rare but serious toxicity (think: COX-2 inhibitors), they are pilloried for not having done due diligence during the regulatory process and are accused of being stooges for the pharmaceutical industry. If, on the other hand, they delay drug approval, or reject a drug, the Wall Street Journal publishes an editorial titled—and this actually happened—“FDA to Patients: Drop Dead!” Basically, they cannot win.


I have had my own gripes with them over the years: lack of consistency and (on occasion) a seemingly poor understanding of biology. But that does nothing to diminish my respect for the FDA medical officers I have met, serious professionals who could be earning multiples of their government salary by crossing the street into industry. They are proud of Kelsey’s legacy, as they should be.


I suspect—though I do not know this to be the case—that they may sometimes invoke Kelsey’s memory to justify obstructionist behavior, just as some newspaper reporters consider themselves Woodward and Bernstein’s progeny whenever they publish some minor exposé. But I would far rather live in a world where public servants protect the public good, just as I want a world full of investigative journalists to counter the influence of the rich and powerful.


Frances Kelsey, a tireless public servant, worked until she was 90 and died at the age of 101. She had two children; an accurate count only if one omits the thousands of babies born with two arms and two legs because of her courage.

About the Author

George W. Sledge, Jr., MD
GEORGE W. SLEDGE, JR., MD, is Chief of Oncology at Stanford University. His OT writing was recognized with an APEX Award for Publication Excellence in the category of “Regular Departments & Columns.”