Musings of a Cancer Doctor
Wide-ranging views and perspective from George W. Sledge, Jr., MD
Friday, July 8, 2016
One of the more amazing results of the genomic revolution has been the window it opened on our past. It is only a few years since the discovery of Neanderthal genes in the modern Eurasian genome, as well as the mysterious Denisovan genes present in modern Melanesian islanders. For most of my life, Neanderthals were hulking cave men, long extinct, their very name a term of abuse. Now, every morning when I shave, a Neanderthal face stares back at me from the mirror. My lack of photogenicity finally has a rational explanation.
But it turns out there is a great deal more. The last year has seen an explosion in genomic studies illuminating our past, all quite fascinating, and certainly unexpected.
Let's start with an obvious Charles Darwin sort of question. Neanderthals and our main line ancestors split apart half a million years or so ago. They met again on several occasions, and those happy reunions resulted in our current 2 percent or so of Neanderthalism. The 2 percent, by the way, appears to be down from 6 percent back in the day, if we look at ancient human genomes.
Are our legacy Neanderthal genes randomly distributed throughout our genome, or have they been selected for? No question, and no surprise: selected for.
A History of Genes
Let's start with the Y chromosome, that puny little thing, that living rebuke to Freud's concept of penis envy. It has an important tale to tell, for like the mitochondrial genome in women, it provides an easy gender-specific snapshot for men.
The modern human genome has no Neanderthal Y chromosome genes. None whatsoever, as far as we can observe. They were eliminated. This could represent random loss over time, but a recent deep dive into the Neanderthal Y chromosome suggests a probable culprit for this ethnic cleansing. The Neanderthal Y chromosome contains three minor histocompatibility antigens capable of triggering an immune response known to contribute to miscarriages.
This work adds to earlier data suggesting so-called "genomic deserts" for Neanderthal genes on the X chromosome and for genes governing testicular function, painting a picture of what is known as "hybrid infertility." Humans and Neanderthals lived right on the edge when it came to prospects for interbreeding.
What did we get from our Neanderthal ancestors? A nasty bunch of disease predispositions, for one thing. If one looks at single nucleotide polymorphisms (SNPs) derived from them, and then crosses those SNPs with the electronic health records of 28,000 adults of European descent, one sees sizeable increased risks for depression, actinic keratosis, hypercoagulable states, and tobacco addiction. I have this weird vision of a moody, cigar-smoking, warfarin-ingesting, bearskin-clad hulk with a bad complexion hunting wooly mammoths on the Eurasian steppes.
By the way, the paper that showed these things is quite wonderful, linking the electronic health records of 28,000 adults across numerous medical institutions to large SNP databases for those individuals to the Neanderthal genome. It is CancerLINQ for Neanderthals. The ready transportability of health data and its cross-linking with genome datasets for research purposes suggests they were not using the dominant electronic health record system found in the San Francisco Bay area.
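The design described above is essentially a phenome-wide association scan: split patients by whether they carry a given Neanderthal-derived allele, then cross-tabulate carriers against each diagnosis in the record. Here is a toy sketch of that cross-tabulation with entirely invented counts (not data from the paper), just to show the shape of the calculation:

```python
from math import log, sqrt, exp

def odds_ratio_ci(carriers_with, carriers_without,
                  noncarriers_with, noncarriers_without):
    """Odds ratio and approximate 95% CI for a 2x2 carrier-vs-phenotype table."""
    or_ = (carriers_with / carriers_without) / (noncarriers_with / noncarriers_without)
    # Woolf's method: standard error of log(OR) from the four cell counts
    se = sqrt(1/carriers_with + 1/carriers_without +
              1/noncarriers_with + 1/noncarriers_without)
    lo, hi = exp(log(or_) - 1.96 * se), exp(log(or_) + 1.96 * se)
    return or_, lo, hi

# Invented numbers: 1,000 carriers of a Neanderthal SNP and 27,000 non-carriers,
# crossed against a depression diagnosis in the health record.
or_, lo, hi = odds_ratio_ci(150, 850, 2700, 24300)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A real analysis would repeat this across thousands of phenotype codes and correct for multiple testing, but the core of each association is a table this simple.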
On the plus side, Neanderthal genes appear to provide us protection against parasites. Our modern genome gets several members of the Toll-like receptor family (TLR1, TLR6, and TLR10) from our Neanderthal ancestors, and these immune receptors are valuable activators of our adaptive immune responses. There's a growing literature on Toll-like receptors' roles in antitumor immunity, so perhaps one day soon we'll enroll Neanderthal cell surface receptors in the fight against cancer. Neanderthal genes also affect our skin: if someone says you are "thick-skinned," your Neanderthal traits are being praised.
Speaking of the Y chromosome, one recent study looked at the modern Y chromosome in 1,200 men from 26 modern populations. All living men descend from a man living 190,000 years ago, the male equivalent of "mitochondrial Eve." This is unsurprising. What was of interest was evidence for male population explosions on five continents at different times over the past 100,000 years, with rapid expansion of specific male lineages in just a few generations.
Half of all men of Western European origin, for instance, descend from just one guy living 4,000 years ago. The authors of the Nature Genetics paper tried to imagine why this might have occurred: introduction of a new technology related to the agricultural revolution, perhaps. I'm doubtful that my ancestor proto-George (from the Greek Γεωργιος (Georgios) which was derived from γεωργος (georgos) meaning "farmer, earthworker") was some peaceful, humble farm boy who invented the plow. Far more likely that he was some prolific thug with a gang, or whatever passed for a gang in 2,000 BC in France or Germany. Genghis Khan before Genghis Khan, lost to history but remembered in our genes.
Tracking Viral DNA
Neanderthals aren't the only interlopers in the modern human genome. Let's look at our viral genome. One of the great discoveries associated with the Human Genome Project was the large amount of viral DNA incorporated into human DNA, fully 8 percent of the whole.
For quite some time, this book was shelved under "Junk DNA" in the genome library. But no more. Work published in Science shows that the viral DNA (from so-called "endogenous viruses") is not randomly inserted in the human genome. Rather, it clusters, and clusters around those parts of the genome associated with immune function. Here's where the story gets really interesting. Our genomes appear to have co-opted the viral invaders for the purpose of—you guessed it—fighting off other viruses.
When the authors of the paper used CRISPR/Cas9 technology to snip out endogenous viruses close to the AIM2 immune gene, it no longer responded effectively to the interferon "alarm signal" released in response to a viral infection. Immune function suffered. Endogenous viruses are the family dog who protects villagers from marauding wolves, barking loudly whenever their genetic cousins come near.
I find this thought pleasant, and somehow comforting: we've been assimilating unwanted immigrants into our chromosomal society far longer than humans have existed. Not only are endogenous viruses not "junk DNA", they are valuable, well-behaved, even life-saving first responders.
What really amazes about all this is how much we learn about our history sitting in a laboratory. Our DNA is a time machine, traveling to places no historian can reach. Dig up some old bones from a cave in Spain or Siberia and wonderful mysteries are revealed. Those old bones still matter, eons after our ancestors shuffled off this mortal coil.
One last look at the Neanderthals before we go. Trying to get inside the heads of extinct ancestors is a dangerous enterprise, but enjoyable nevertheless. Recently, far inside Bruniquel Cave in southwestern France, investigators came across a fascinating human structure. The cave's occupants had broken off some 400 stalagmites and then rearranged them in a circle. These circles show signs of fire use, with soot-blackened, heat-fractured calcite and bone remnants. Uranium-thorium dating places the circles at 176,500 years old, long before Homo sapiens entered Europe. Neanderthal spelunkers created the structures.
What were they doing there? Did the stalagmite circles have some ritual significance, some religious meaning? Or did the Neanderthals, like modern humans gathered around a flat screen TV, just like to have a family room where they could kick back and enjoy themselves after a long day hunting bison and woolly mammoths? We'll never know, but it makes me think we might have understood each other.
Friday, May 27, 2016
One of the delights of science is how captivating the results can be. About 1.3 billion years ago, and incredibly far away, two black holes spiraled together, collided at half the speed of light, and coalesced into a single black hole, and in that joining three suns' worth of mass was turned into energy in the form of gravitational waves. In September of last year the sound of that collision (literally a sound: you can hear it at https://www.youtube.com/watch?v=TWqhUANNFXw) was recorded by the Advanced Laser Interferometer Gravitational-Wave Observatory's (LIGO) parallel scientific instruments in Washington and Louisiana. The waves reached Louisiana 7 milliseconds before Washington, suggesting a source in the southern sky.
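The inference behind that last sentence is simple time-of-flight arithmetic. The two detectors sit roughly 3,002 km apart (a commonly quoted figure, not taken from the discovery paper), so a wave front moving at light speed can arrive at the two sites at most about 10 milliseconds apart; a 7 millisecond lag then fixes the angle between the source direction and the line joining the detectors. A back-of-the-envelope check:

```python
from math import acos, degrees

C = 299_792_458        # speed of light, m/s
BASELINE_M = 3.002e6   # approximate Hanford-Livingston separation, ~3,002 km

# Largest possible arrival-time gap: wave traveling straight down the baseline
max_delay_ms = BASELINE_M / C * 1000

# Delay reported for the September 2015 event
observed_ms = 7.0

# Angle between the source direction and the inter-detector axis
angle = degrees(acos(observed_ms / max_delay_ms))

print(f"max delay = {max_delay_ms:.1f} ms; source about {angle:.0f} degrees off the detector axis")
```

A single time delay only constrains the source to a ring on the sky at that angle, which is why two detectors localize so coarsely; still, combined with the signal amplitudes, it pointed south.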
The result, predicted 100 years ago by Albert Einstein, simultaneously proves the existence of black holes and launches gravitational astronomy as a new field of scientific endeavor.
Gravity is both incredibly weak and impressively strong. It is the weakest of the four fundamental physical forces in nature, some 38 orders of magnitude weaker than the strong force, 36 orders of magnitude weaker than the electromagnetic force, and 29 orders of magnitude weaker than the weak force. But at the macroscopic scale it dominates, affecting the trajectory of heavenly bodies and of humans walking around here on Earth. Try jumping into the sky and see how far you get.
What we know about gravity results largely from the scientific work of the two smartest humans who ever lived, Isaac Newton and Albert Einstein. Newton discovered the inverse-square law of universal gravitation, summarized in this equation:
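With $F$ the attractive force, $G$ the gravitational constant, $m_1$ and $m_2$ the two masses, and $r$ the distance between them:

```latex
F = G \frac{m_1 m_2}{r^2}
```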
Put another way, Newton's law of universal gravitation states (and here I quote Wikipedia) that any two bodies in the universe attract each other with a force that is directly proportional to the product of their masses and inversely proportional to the square of the distance between them.
Before Newton, gravity meant something different: something serious. But the term derives from the Latin "gravitas," which means weight, or heaviness, and this was the meaning Newton called upon. Newton's law does quite well for predicting most of what we experience, and its application to our solar system led to the discovery of Neptune in the 19th century.
This worked well enough for two centuries, until astronomers found it could not explain the orbit of Mercury. This discrepancy between theory and fact led Einstein to think about gravity, and to his field equations, principal among them:
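In modern notation, with $G_{\mu\nu}$ the Einstein curvature tensor, $g_{\mu\nu}$ the metric, $\Lambda$ the cosmological constant, and $T_{\mu\nu}$ the stress-energy tensor, the field equation reads:

```latex
G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu}
```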
I do not remember any of this from my sole semester of college physics in the early 1970s, but Wikipedia assures me these are true equations. I do remember the thought experiments that represented the beginning of the two great theories attached to these great discoveries: Newton's apple, Einstein's elevator.
One important consequence predicted by Einstein's equations is the existence of gravitational waves, ripples in the space-time continuum generated by accelerating masses. Einstein himself was uncertain that such waves would ever be detected, or even whether they existed. But now, thanks to LIGO, we are certain they are real.
Gravity & Health
I don't usually think about gravity when I see my patients in clinic, though like many things I don't think about, gravity is always there in the background. It weighs us down.
Recently, astronaut Scott Kelly came back from a year spent at the International Space Station some two inches taller than when he had left. His spinal disks, not as weighed down, expanded, though they reverted as gravity reasserted itself. Scott is one of a pair of twins, allowing NASA scientists a mini-randomized controlled trial of low-G effects on human physiology.
Scott might end up shorter than when he left Earth, and certainly weaker. Astronauts lose muscle mass and bone density. Spend 6 months off-planet and you will lose 10 percent of your bone mass. Even the heart shrinks a bit while in orbit, leaving the returning astronaut faint with exertion. We evolved living at the bottom of a gravity well. Sit on top of the well and molecular adjustments occur. NASA worries about these things, as it would like to send astronauts to Mars some day.
Low gravity's molecular effects aren't well understood. That old biological workhorse (workworm?) Caenorhabditis elegans has been compared on ground and in space. One study saw down-regulation of numerous genes, whose cumulative effects prolong C. elegans lifespan. Another saw decreased accumulation of toxic protein aggregates in aging worms' muscles. Space was made for worms searching for immortality.
There is a fascinating plastic surgery literature on gravity. Here's a fun little experiment reported by two groups (Aesthet Surg J 2014;34:809-822, and Int J Cosmet Sci 2015;37:291-7). Take patients who show signs of midface aging and have them either stand up or lie down. Take pictures while smiling and in repose. Compare "brow position, tear trough length and depth, steatoblepharon, cheek volume, malar bags/festoons, and nasolabial folds." (Steatoblepharon, by the way, is "the prolapse of fat from the orbit of the eye below the eyelid." A festoon is a chain or garland of flowers, hung in a curve as a decoration, unless you are a plastic surgeon, when it refers to significantly damaged skin in the lower eyelids.)
Then, because perception is as important as measurement if you are a plastic surgeon, have someone estimate their age by looking at the pictures. What happens?
The supine patients measure younger, and look younger. Indeed, if you are over the age of 55, and you want to look younger, never get out of bed. The simple act of lying down makes you look six years younger, on average. Gravity redistributes interstitial fluids, subtly contorting our faces. Remember that the next time you visit a funeral home and hear the magical words "doesn't he look natural?" No, he doesn't. It's just gravity. We sag when we stand. I'd rather sag, thank you very much, as long as I can keep standing.
This redistribution of interstitial fluids occurs in space as well. Fluids no longer pool in the legs, but in the upper body. The body's blood volume sensors are in the upper body, so red blood cell production declines. Anemia is yet another reason to faint when you get back to Earth.
Cancer in Space
Gravity is a boon to plastic surgeons. What about cancer?
When I treat a postmenopausal breast cancer patient with aromatase inhibitor therapy, I routinely discuss the bone loss that results from estrogen deprivation. How does one prevent bone loss? I tell my patients bone is laid down along lines of stress, and that weight-bearing exercise is among the best things they can do to prevent osteoporotic fractures. Though I never mention gravity, I silently invoke its inexorable force.
NASA has spent some small fortune performing experiments with cancer cells in space (examined in Nature Reviews Cancer 2013;13:315-27). Under microgravity conditions, cancer cells form spheroids, sometimes quite large: in one case, golf ball-size organoids of prostate cancer cells that far outpaced their earthbound cousins. Gliomas, in contrast, roll over and die in space, for reasons no one understands.
Virtually all cancer cell types undergo major structural changes in microgravity. Tubulin is assembled into microtubules by gravity-dependent reaction-diffusion processes: get rid of gravity, and the microtubules change their orientation. Would Taxol work differently in space?
So far, I've seen no rush to set up an orbital outpatient oncology clinic. Astronauts do not appear to have a higher rate of cancer incidence or mortality than the rest of us, though the numbers are small and the comparisons highly confounded from a statistical standpoint. But there is, apparently, a thriving trade in pharma-driven space experiments, with a start-up company called Space Tango contracting with around 50 customers this year to perform contract microgravity experiments in tissue-box-size automated labs.
Meanwhile, back on Earth, I'm keeping an ear out for the next echo of two black holes colliding in some distant galaxy. Can you hear it? Let's listen together.
Monday, January 11, 2016
Recently I learned of the death of Frances Kelsey. I was somewhat surprised, not that she had died, but in the “I didn’t know she had lived that long” sort of way one has sometimes in reading the obits. Kelsey had become a national figure when I was in my early teens, at a time when John Kennedy was president, and I had not given her a second’s thought in decades.
Kelsey’s story is inextricably linked to that of thalidomide. It is an old story, and its outlines are familiar to most, but let me share it with you in case you missed it or misplaced the memory. In the 1950s thalidomide was developed as a sleeping pill and was used as such in Europe. It was particularly valued for its benefits in pregnant women. Rights to thalidomide had been sold in the United States to the William S. Merrell Company.
FRANCES KELSEY, MD, PHD
Kelsey was, at the time, a brand-new staffer at the Food and Drug Administration. Prior to this she had performed research at the University of Chicago, where she had earned both a PhD and an MD degree. When she applied to the PhD program there, the acceptance letter was addressed to “Mr. Oldham” (she had been born Frances Oldham). “When a woman took a job in those days, she was made to feel as if she was depriving a man of the ability to support his wife and child,” she later told an interviewer. Fortunately, her McGill University thesis professor said, “Don’t be stupid. Accept the job, sign your name and put ‘Miss’ in brackets afterward.” She did.
The First Task
Arriving in Chicago in 1936, she was set to work identifying the toxic agent in a batch of sulfanilamide. Initially marketed in tablet form, the antimicrobial had been converted to a liquid elixir. The new formulation promptly killed 107 patients, many of them children. The modern FDA is a stickler regarding drug formulation, and this dates to the antifreeze-like industrial chemical, diethylene glycol, that had been used as the elixir's solvent. In 1938 Congress passed the Food, Drug and Cosmetic Act in response to the tragedy.
As a new FDA staffer, Kelsey was tasked with evaluating thalidomide. She would later say that she had been assigned the evaluation not because it was considered a difficult case, but because it was a simple one—a drug already on the market, with thousands of patient-years of experience…a slam-dunk approval, a no-brainer.
Except that Kelsey was meticulous, and had a brain. In poring through the documentation Merrell submitted, she was struck by how little real information it contained. The clinical data supplied involved little more than doctor testimonials, which Kelsey later characterized as “too glowing for the support in the way of clinical back-up.”
While the evaluation was ongoing, she noticed a publication in the British Medical Journal suggesting that thalidomide was associated with peripheral neuropathy. Merrell’s packet to the FDA had failed to mention peripheral neuropathy, immediately alerting Kelsey that something might be wrong. In those days, astonishingly, drugs were automatically approved by a certain date unless the FDA specifically disapproved them. Such approval by default seems amazing today, but those were different times. Then, as now, no one at the FDA wanted to disapprove a drug without sufficient cause.
The out was that the FDA could put an approval on hold if it requested additional information. Kelsey invoked this rule. The Merrell representative, after much grumbling, admitted that yes, there had been some reports of peripheral neuropathy, but the company felt the drug was not to blame, that it was probably related to poor nutrition, and it wasn’t all that big a deal in any event.
Kelsey still demurred and demanded real data. She and her colleagues at the FDA were particularly concerned about the effects of the drug on the fetus, given that a pregnant woman might take the drug for months. Merrell reported that it did not know of any problems with the drug in pregnancy, but had not conducted a study. The company was anxious to get the drug on the market, and hounded her for a quick approval. It also pressured her superiors and threatened to go to the Commissioner of the FDA. Kelsey stood firm.
But peripheral neuropathy was the least of thalidomide’s problems. While Kelsey awaited more data, reports began to emerge from Europe tying the agent to birth defects in pregnant women. The famous Hopkins cardiologist Dr. Helen Taussig (the heroes in this story are women) heard of the issue and traveled to Europe to investigate. Babies there were being born, in the thousands, with flipper-like arms and legs, a condition known as phocomelia. The drug also significantly increased the rate of miscarriage. When Taussig came home she met with Kelsey, and later testified about thalidomide in front of a House committee considering strengthening drug approval laws.
In 1962 then-president John F. Kennedy awarded Kelsey the President's Award for Distinguished Federal Civilian Service. At the award ceremony he said, “Her exceptional judgment in evaluating a new drug for safety for human use has prevented a major tragedy of birth deformities in the United States.” Kelsey, ever modest, consistently maintained that she received the award on behalf of a team. She did not think she had done anything exceptional.
More to the Story
Kelsey’s story did not end there, though that is about as much as most people have ever heard of her. The outcry over thalidomide, once the story broke, led to passage of the Kefauver-Harris Amendment in 1962 to strengthen drug regulations. The bill had languished in Congress for six years prior to the thalidomide revelations.
To get some sense of how different a time 1962 was from today bear in mind that at the Senate hearing for the amendment, Senator Jacob Javits could ask the question, “Do people know they are getting investigational drugs?” The answer was “no.” Many of the U.S. participants in thalidomide trials were unaware they were receiving a drug that had not been approved by the FDA. Thalidomide changed that. The requirement for informed consent dates to Frances Kelsey. She became the first chief of the Investigational Drug Branch and created the modern process for new drug testing.
Fifteen years ago I sat on the FDA’s Oncology Drugs Advisory Committee, or ODAC. There I developed a deep and abiding respect for FDA staffers as dedicated public servants with an essentially thankless job. If the FDA approves a drug and later has to pull it from the market for reasons of some unexpected and rare but serious toxicity (think: COX-2 inhibitors), they are pilloried for not having done due diligence during the regulatory process and are accused of being stooges for the pharmaceutical industry. If, on the other hand, they delay drug approval, or reject a drug, the Wall Street Journal publishes an editorial titled—and this actually happened—“FDA to Patients: Drop Dead!” Basically, they cannot win.
I have had my own gripes with them over the years: lack of consistency and (on occasion) a seemingly poor understanding of biology. But that does nothing to diminish my respect for the FDA medical officers I have met, serious professionals who could be earning multiples of their government salary by crossing the street into industry. They are proud of Kelsey’s legacy, as they should be.
I suspect—though I do not know this to be the case—that they may sometimes invoke Kelsey’s memory to justify obstructionist behavior, just as some newspaper reporters consider themselves to be Woodward and Bernstein’s progeny whenever they publish some minor expose. But I would far rather live in a world where public servants protect the public good, just as I want a world full of investigative journalists to counter the influence of the rich and powerful.
Frances Kelsey, a tireless public servant, worked until she was 90 and died at the age of 101. She had two children, an accurate count only if one omits the thousands of babies born with two arms and two legs because of her courage.
Tuesday, December 15, 2015
Nobel Prize season finished last week with the Awards Ceremony on Thursday, and once again the Swedes have failed to recognize my many contributions to peace, medicine, literature or physics (OK, the last one is a bit of a stretch). I watched my cell phone for days, but no phone calls from country code +46. What is with those guys?
But as happens every year, a Nobel Prize has an interesting story, and a back-story behind that story. And this story has to do with malaria and war.
The prize for Physiology or Medicine (its quaint name) went to three investigators, with the common theme of tropical disease. Satoshi Ōmura and William C. Campbell received half of the prize "for their discoveries concerning a novel therapy against infections caused by roundworm parasites" and the delightfully named Dr. Youyou Tu received the other half for her discovery of the anti-malarial drug artemisinin.
It was the malaria story that caught my attention. During the Vietnam War, our Vietnamese opponents suffered horrendously from malaria. Chairman Mao, a fan of traditional Chinese medicine, encouraged studies of Chinese herbs as treatment for the disease. Dr. Tu examined the Chinese herbal tradition and found that compounds derived from the sweet wormwood plant could treat the disease in mice. She then isolated artemisinin from the herb, and an exceptionally important antimalarial was born. Today over half of malaria worldwide is chloroquine-resistant, and artemisinin is the lifeline for patients suffering from resistant malaria.
As an undergraduate I had a classmate who had served as an infantryman in Vietnam. He had come home with what I would now recognize as quartan malaria, with recurrent fevers and chills and just the most miserable feeling on earth. He had that far-away look of someone who had spent too many days seeing things 19-year-olds shouldn't see. But it was the malaria that dragged him down.
Patients often ask what I think of complementary and alternative medicines. My answer is always the same: whatever works. I prescribe drugs derived from mold, tree bark, and sea sponges. It isn't the origin of the drug that matters. If it passes rigorous scientific tests (so-called "Western medicine") demonstrating clinical utility, I am happy to prescribe it. The scientific method doesn't care if you are Chinese or American, nor whether your drug is a natural product or a synthetic chemical. It doesn't even care if it is a drug.
But the artemisinin story interested me for other reasons. The first was the long history of tropical medicine, particularly malaria, and the Nobel prize. One of the very first Nobel prizes for Medicine (in 1902) was given to Dr. Ronald Ross for his discovery of malaria's transmission via the mosquito. Five years later Dr. Alphonse Laveran received the prize for his discovery of the malarial protozoan. Both the Englishman and the French doctor served their respective overseas colonial empires. In 1897, the year Ross published his work, approximately one-third of British troops in the Indian Raj were incapacitated by malaria.
Malaria represented an endemic disease in the American South for much of the first half of the 20th century. While common wisdom holds that the problem was solved through liberal dousing with DDT (another outcome of war), in fact it began its retreat during World War I. With large numbers of American soldiers receiving basic training in Southern cities, the U.S. Government was concerned that they would contract malaria before shipping overseas to France.
The government's response was to set up so-called extra-cantonment zones, areas within which it assumed responsibility for public health. Mosquito eradication, the techniques for which had been pioneered in the Spanish-American War and its aftermath, was used in a widespread fashion for the first time in the American South, considerably reducing malaria incidence and setting the stage for the subsequent New Deal-era near-eradication (pre-DDT) of malaria in the United States.
For anyone interested in this story, read the definitive account by Daniel Sledge, my political scientist son, of whom I am very proud: “War, Tropical Disease, and the Emergence of National Public Health Capacity in the United States”, Studies in American Political Development 26:125-62, 2012.
DDT also earned its discoverer, the Swiss Dr. Paul Müller, a Nobel prize in 1948. The Swiss shared Müller’s discovery with the United States in the midst of World War II, and the U.S. military rapidly introduced it into war zones, dramatically reducing deaths due to malaria, typhus, and the panoply of other insect-borne diseases, not just for soldiers but for civilians.
But the larger back-story is the connection between war and scientific progress in general, and medical progress in particular. Alfred Nobel himself personifies this connection. Nobel, as is well known, made his riches through the creation of dynamite, which he hoped would be used for peaceful purposes only. His subsequent recognition of, and his horror over, the co-option of his discovery for lethal military purposes led to his creation of the Peace Prize.
Alexander Fleming's discovery of penicillin languished until World War II, when two British scientists rediscovered it and brought it forward to treat wounded soldiers, the work supported by infusions of cash from the British and American governments.
One can make too much of these connections, of course. No malaria patient cares that artemisinin came out of a nasty jungle war. No one in a doctor's office getting any of penicillin's follow-on antibiotics cares that the antibiotic revolution was a byproduct of history's bloodiest war. They just want to be treated and cured, and not re-infected.
But it seems inescapable that the link between medical progress and war is a real one. Trauma medicine had its origins in military surgeons, and even the DaVinci robot so prized by urologic surgeons caring for prostate cancer patients had its beginnings in a DARPA contest to create a battlefield-ready automatic surgeon.
And cancer patients, as we all learn early in our training, owe a debt to war: the first effective chemotherapy agents were discovered as a result of a ship full of poison gas blowing up in Bari harbor in 1943. In short order the lymphopenic sailors served as the model for treating lymphoma.
Some day, one hopes, such expensive advances, paid in the blood of innocents as well as the gold of governments, will no longer prove necessary. But for the moment, the essential tension embodied by the Nobel prizes, with their celebration of life paid for by the archetypal merchant of death, continues to vex us.
Friday, November 6, 2015
Recently our Hematology/Oncology group held its annual retreat at Asilomar. Asilomar is a California state park and conference center on the Pacific coast, and a delightful place to hold a meeting. We’ve been going there for a quarter century or so, and it is a great place for professionals to meet away from the hurly-burly of daily existence.
Faculty, fellows and post-docs mix in a pleasant environment, learn from each other and create new collaborations. We also take long walks on the beach watching the sun set over the Pacific, or play golf at the nearby public links during our mandatory afternoon break.
We were not the only professional group there that week. Because everyone eats in a common dining hall, one gets to mingle with folks from other organizations. That weekend we were one of four groups, including the IRLSSG, the CCRH, and the OPA. If you are not up on your organizational acronyms, the CCRH is the California Coalition for Rural Housing. The IRLSSG is the International Restless Leg Syndrome Study Group, busy holding its Science Summit. “I certainly hope none of them is upstairs from me,” said one of my colleagues. “Those guys will keep you awake with all that tapping.”
The OPA, it turned out, was the Organization for Professional Astrology. They looked quite a bit different from the other groups. The women wore long, flamboyant scarves and multi-colored coats, and the men tended towards equally flamboyant facial hair. Our group was full of post-docs wearing blue jeans and T-shirts, and let me tell you, we looked pretty shabby by comparison.
I so wanted to crash their party. But duty prevailed, and I hung around with my Onco-homies instead, learning about circulating tumor DNA and novel organoid culture methods and oncogene addiction’s effects on the immune system. All very interesting, of course, but nothing to match the astrologers’ agenda for sheer pizzazz.
I found that agenda online at the OPA website: Day one was devoted to the theme of “Transition from the Cardinal Cross to the Mutable Cross.” I do not have a clue what this means, but then I suspect they might find oncogene addiction’s effects on the immune system equally mysterious. We are all fairly ignorant outside our specialties.
The breakout sessions were fascinating, and included “The Astrology of Twins,” “Astrology and Kabbalah,” “The Death Chart,” and “Unaspected Planets.” A planet is unaspected when it isn’t connected to other planets by a major aspect (Conjunction, Square, Opposition, Trine, or Sextile, as my readers will immediately recognize). Astrologers really dig astronomy. They hug it like a lamprey hugs a sturgeon, sucking it dry of meaning.
The Astrology of Twins breakout forthrightly faced a problem area for astrologers. If you seriously believe (and I have not a clue whether most astrologers are frauds or merely seriously delusional) that the location of the planets at the moment of one’s birth determines your fate—who you marry, your financial success, the date of your death—then identical twins pose a problem. They are born with the same star chart, yet demonstrably differ in terms of marital, financial, and actuarial success. They represent, one would think, a pretty telling prima facie argument against accepting astrology.
Astrologers have thought mightily about this problem. They have decided that it is not really a problem after all. First, the few minutes between one birth and the next are all you need, apparently, to affect your astrologic destiny. Plus, the astrologers say, our astrologic charts map potential, and I may choose to act out one part of my chart’s potential while my feckless twin Fred (that scoundrel) acts out another part of the chart. They believe in free will of a sort. And that explains why Fred and I marry differently and die differently.
If this all sounds like absolute nonsense, well, then, that’s because it is. Karl Popper’s definition of science comes to mind: real science doesn’t prove, it falsifies. That is to say, it sets up crucial tests of a hypothesis, and if the tests fail you move on. Popper was skeptical of Freudian psychiatry for exactly this reason: it could explain everything. There was never any serious attempt at falsification. Astrology is Freudian psychiatry on speed.
Here’s a Popperian falsification experiment, carried out in England and reported in 2003 in the Journal of Consciousness Studies (of whose existence I was previously unconscious—way too many scientific journals out there). Take 2000 Brits born in March of 1958, many within minutes of each other (so-called “time twins”). Follow them a long time, and measure over 100 characteristics, including “occupation, anxiety levels, marital status, aggressiveness, sociability, IQ levels, and ability in art, sports, mathematics and reading.”
The hypothesis would be that “time twins” should be more alike than those born further apart, under different star charts.
Surprise! No correlation. Absolute falsification. But you knew that already. The President of the Astrological Association of Great Britain, asked to comment, said the work should be treated with “extreme caution” and accused the authors of attempting to “discredit astrology.” As if that was even possible.
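The logic of the time-twins test is simple enough to simulate. Here is a minimal sketch (with entirely made-up data, not the actual 1958 cohort): if birth time carries no information about personality, then pairs born minutes apart should be no more alike on any trait than randomly matched pairs.

```python
import random

random.seed(42)

# Hypothetical simulation of the time-twins null hypothesis.
# Subjects are indexed in order of birth time; each gets one
# standardized trait score drawn independently of that order.
N = 2000
traits = [random.gauss(0, 1) for _ in range(N)]

# Mean absolute trait difference for adjacent births ("time twins")...
adjacent = sum(abs(traits[i] - traits[i + 1]) for i in range(N - 1)) / (N - 1)

# ...versus the same statistic after shuffling away birth order,
# i.e., randomly matched pairs.
shuffled = traits[:]
random.shuffle(shuffled)
random_pairs = sum(abs(shuffled[i] - shuffled[i + 1]) for i in range(N - 1)) / (N - 1)

print(f"time twins: {adjacent:.2f}  random pairs: {random_pairs:.2f}")
```

Under the null, both averages converge on the same value, which is exactly what the real study found: time twins were no more alike than strangers. Astrology's prediction—that the first number should be noticeably smaller than the second—is what Popper would call a crucial test, and it failed.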
The OPA had a session devoted to this issue as well, entitled “Predictive Astrology.” I quote from the précis for this session: “Astrology is very good at predicting the types of experiences you will have in a lifetime but the actual events can be more difficult to pinpoint. In this workshop we will synthesize five major factors in determining experiences/events in a person’s life. The five factors are the natal promise, secondary progressions, solar arc progressions, transits, and eclipses. It may seem confusing and complicated to put all of these factors together in a birth chart but in this workshop we will breakdown each factor and look at it individually and as part of the collective whole.”
Well, OK, who doesn’t want to know more about their natal promise, and who hasn’t stayed up all night worrying about their solar arc progression? It gives me the heebie-jeebies whenever I think about it, but somehow I manage to cope.
I would have loved to have been a fly on the wall at another session. The conference organizers gave the same solar chart to four different astrologers, had them cast their charts independently, and then presented them. Howlingly entertaining, I suspect, but I was stuck learning about boring science-y stuff.
Astrology once was science-y. Johannes Kepler, in addition to his great contributions to astronomy, was imperial mathematician for the Holy Roman Emperor, and made his salary composing star charts for the Emperor Rudolf II. What’s more, he actually appears to have believed it. He wrote a friend, "Regard this as certain, Mars never crosses my path without involving me in disputes and putting me myself in a quarrelsome mood." I can relate: that happens to all of us. His De Stella Nova includes a diagram tracing a series of Jupiter-Saturn conjunctions.
But science moved on, in part because Kepler and others were ultimately unable to reconcile astrology and astronomy. The astrologers, though, love to reference Kepler, and there is even a Kepler College, which supplies online education in pursuit of its “mission of offering the best astrological education available online.”
It’s easy to laugh about the OPA. It is so nonsensical, so farcical, so divorced from reality that one cannot think of OPA and Kepler College without breaking into a smile. But just as the OPA and the Stanford Heme-Onc Retreat co-existed peacefully at Asilomar, so too do we co-exist in the wider society. And not always so peacefully.
Each of those folks has the same number of votes as those of us who exist in the reality-based universe. In living memory, a President and his First Lady (the Reagans) invited an astrologer to the White House--indeed kept her on a monthly retainer.
Who knows, maybe the astrologer gave the President better advice than the Secretary of State or the National Security Advisor. But still, it gives you pause. Donald Regan, the President’s Chief of Staff, wrote in his memoirs that the astrologer chose the dates for summit meetings, presidential debates, and (here my oncology antennae perk up) the date for the President’s 1985 cancer surgery.
The astrologer, Miss Joan Quigley of San Francisco, titled her memoir What Does Joan Say?--supposedly Ronald Reagan’s habitual question to Nancy whenever the fate of the world was at stake.
That Americans are shamefully ignorant of modern science--and not just ignorant but on occasion actively opposed to its teachings--is pretty obvious, and pretty scary. Scary, too, because scientists are now being viewed by parts of our political class as just another special interest group, to be ignored when our findings differ from the biases and interests of larger voting populations. I worry about the future of a country whose leaders glory in their ignorance.
Anyways, enough griping by an old fuss-budget scientist and physician. Our meeting broke up on Friday, but the OPA hung around another day. Saturday night, according to their website, they had a celebration, with music, a raffle with “incredible” prizes (Astrology software, which I suspect made “incredible” factually correct), free readings, and “a trip to Neptune... and much more!”
Never been to Neptune, other than via NASA’s Voyager 2 interplanetary probe. The trip isn’t in my division’s budget. And “much more”—what was that? They did look like they knew how to party. Maybe I should have hung around.