Musings of a Cancer Doctor
Wide-ranging views and perspective from George W. Sledge, Jr., MD

Thursday, June 22, 2017

In 1455, Johannes Gutenberg, a German goldsmith, published his version of the Bible. His first print run was not large—only 180 copies—but it changed the world. Prior to Gutenberg, and it is important to recognize this, there were few copies of anything. Great literary works from the ancient world might depend for their survival on some literate monk toiling away in a remote Alpine monastery. If the monk decides not to copy, say, a play by Sophocles (and we have only seven of his 123 plays), and the monastery suffers a fire, that play is gone. Gone forever. We know this happened frequently because we have the names of many of the missing works: Pliny the Elder's History of the German Wars, 107 of the 142 books that make up Livy's History of Rome, Aristarchus of Samos' astronomy book outlining (long before Copernicus) his heliocentric theory. And then there is Suetonius' Lives of Famous Whores; no particular surprise that the monks passed on copying that.

That all changed with Gutenberg. By the end of the 15th century, within the lifetime of someone born the same year as the printing press, it is estimated that 20 million books had been printed, and over 200 European cities had printing presses in operation. A century later the number is 200 million; by the 18th century, a billion books have been printed. More books meant greater literacy. More books meant more controversy. Translate the Bible into German and publish it, as Martin Luther did, and you have the Protestant Reformation; when anyone can read the Bible, anyone can form an independent opinion and priesthoods lose their monopoly on specialized knowledge. Power dynamics change dramatically when the plebes can buy newspapers.

Prior to Gutenberg, there are few scientists and they barely communicate; knowledge is arcane, hidden, and easily lost when the scientist dies. After Gutenberg, scientists start publishing their work, in Latin for easy transmission, and the Scientific Revolution takes off. The world is suddenly a very different place, and all because Gutenberg combined moveable type with a wine press.

And people changed as well. Or, more to the point, their brains change. Consider this recent experiment, conducted in India and published in Science Advances: take an illiterate 30-year-old rural Indian woman and teach her to read. Perform sophisticated brain imaging pre- and post-literacy. What does one see on the scans?

Something quite interesting. The colliculi superiores, a part of the brainstem, and the pulvinar, located in the thalamus, shift the timing of their activity patterns to match that of the visual cortex. And the more closely aligned the timing of brainstem and thalamus, the better one reads. Don Quixote and Pride and Prejudice hijack something very old, something reptilian, in our brains. 2D printing made population-wide literacy possible, but it also reprogrammed the brains of millions of people.

Gutenberg lived in the German city of Mainz. If you were living in Mainz during Gutenberg's life, the big news was not the creation of the printing press. Instead, you would have been obsessed with the Mainz Diocesan Feud, a conflict over who would assume the throne of the Electorate of Mainz. Totally obscure today, the Diocesan feud resulted in the sack of Mainz by Adolph of Nassau (one of the contenders) and his troops. Gutenberg, by now an old man and failed businessman, was exiled along with 800 of his fellow citizens. He was one of the lucky ones: sacking a city rarely went well for its inhabitants, and hundreds of his fellow citizens were murdered.

I imagine Gutenberg trudging out of his home town, just one among many refugees caught up in the bloody politics of his time. Was he thinking about the Diocesan feud, or was his mind leaping ahead to the revolution he created, to the billion books, to the free flow of information across a world hungry for knowledge? Adolph of Nassau, or Martin Luther and Isaac Newton?

Today the verdict is easy: no one remembers Adolph of Nassau, and Gutenberg is one of civilization's greats. But it never looks that way when you are living through it. We rarely, in real time, understand what is important. Four centuries from now, I suspect the exploits of current potentates will fade, and what we will remember is the way science has transformed the world.

Gutenberg was performing two-dimensional printing. Now we have 3D printing, and it is on everyone's short list of world-changing technologic advances.

My old colleague Dr. Wikipedia defines 3D printing (AKA "additive manufacturing") as "processes used to create a three-dimensional object in which layers of material are formed under computer control to create an object." Basically, one creates a computer model (in what is called an STL file) of a three-dimensional artifact. The STL file is then processed by software called a slicer—which does just what it sounds like—and converted into a series of thin layers. The printer then lays down these layers, one atop another, until the object is complete. The layers themselves are now down to 100-micron resolution, a number that continues to drop.
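
For the technically curious, here is a toy sketch in Python of what a slicer does conceptually: intersect each triangle of the mesh with a stack of horizontal planes and collect the resulting line segments as the outline of each layer. The mesh, the layer height, and the function names are invented for illustration; real slicers, and real STL files, handle far more than this, from part orientation to infill.

    # Minimal, illustrative slicer sketch (a toy, not any real slicer's code).
    # Assumes a mesh given as a list of triangles, each a tuple of three (x, y, z) points.

    def slice_mesh(triangles, layer_height=0.1):
        """Return, for each z layer, the line segments where triangles cross that plane."""
        z_min = min(p[2] for tri in triangles for p in tri)
        z_max = max(p[2] for tri in triangles for p in tri)
        layers = []
        z = z_min + layer_height / 2  # slice through the middle of each layer
        while z < z_max:
            segments = []
            for tri in triangles:
                pts = []
                # Check each edge of the triangle for a crossing of the plane at height z.
                for (x1, y1, z1), (x2, y2, z2) in zip(tri, tri[1:] + tri[:1]):
                    if (z1 - z) * (z2 - z) < 0:  # this edge straddles the plane
                        t = (z - z1) / (z2 - z1)  # linear interpolation along the edge
                        pts.append((x1 + t * (x2 - x1), y1 + t * (y2 - y1)))
                if len(pts) == 2:
                    segments.append(tuple(pts))  # one cut segment per crossed triangle
            layers.append((round(z, 6), segments))
            z += layer_height
        return layers

    # A single tetrahedron as a stand-in for an STL mesh, sliced into 0.25-unit layers.
    tetra = [
        ((0, 0, 0), (1, 0, 0), (0, 1, 0)),
        ((0, 0, 0), (1, 0, 0), (0, 0, 1)),
        ((0, 0, 0), (0, 1, 0), (0, 0, 1)),
        ((1, 0, 0), (0, 1, 0), (0, 0, 1)),
    ]
    for z, segs in slice_mesh(tetra, layer_height=0.25):
        print(f"layer z={z}: {len(segs)} segments")

Run on the toy tetrahedron, this prints a handful of layers, each a small set of segments for the printer to trace; an actual machine would then add tool paths, infill, and supports.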

3D printers have found their way into medicine, particularly with the development of cheap, individualized prosthetics: do a CT scan or MRI, use it as the template for the 3D printer, and roll out a prosthetic jaw or hip. After the 3D printer is paid for—and the price of these is collapsing—the prosthetic becomes quite inexpensive, developing-world cheap. Break a bone? You can now 3D print a personalized cast. Do you want a urologist to practice on "you" before the actual kidney surgery for a renal cell cancer? Perform a CT scan, 3D print the kidney model, and use it to plan surgery, minimizing normal tissue loss and preserving kidney function.

But creating new knees or prepping for a difficult operation, as important as they may be, is just the beginning. 3D bioprinting is now coming along. And this is genuinely wonderful. Start with a gel or sugar matrix, layer mixtures of cells on the matrix in repetitive, sequential fashion, and before you know it you have a functioning organ. It's already been used to create cartilage for worn-out joints, synthetic bone filler to support bone regeneration, and patches of heart muscle to assist recovery after a heart attack. (For the last see this cool video of 3D-printed muscle: https://www.youtube.com/watch?v=4VqIiqZtkU&feature=youtu.be). While the technology is still mostly at the preclinical stage, biotech startups are beginning to get into the business, and clinical trials are not too far off. Regenerative medicine will be quite different for the next generation of doctors and patients. Do you have heart failure? Just order a new one using your own stem cells printed into a cardiac matrix.

I've emphasized the medical aspects of 3D printing because, well, I'm a doctor. But 3D printing has so many uses, potential and real, that its only real limits are those of the human imagination. Suppose, for instance, you want to set up a human colony on the moon or Mars. A major barrier is the cost of moving things from here to there: a gravity well is not the space-traveler's friend. You can't ship replacement parts for everything, and you can't even know what you will need 6 months from now. Things happen, unpredictable things that require some weird widget. But ship up a 3D printer along with the specs for just about any device, toss in some lunar dust, and you are in business. I'm not making this up: see Jakus, AE et al. Robust and Elastic Lunar and Martian Structures from 3D-Printed Regolith Inks. Scientific Reports 2017;7:44931.

And remember how we stopped losing ancient books once Gutenberg came along? We live in a world where art survives at the mercy of religious fanatics armed with AK-47s. 3D printing is still fairly new, and the polymers being used are crude simulacra of the real thing, but we are probably not that far away from a day when every home can have an exact replica of the Mona Lisa on the wall, and when ISIS can no longer destroy Palmyra's Temple of Bel because a hundred identical copies are scattered around the world. So maybe they aren't the same thing, exactly, but wouldn't it be wonderful to have the Met's Temple of Dendur in your back yard? Sound crazy? Large, industrial-scale 3D printers are already being used to make houses in China.

I've described a fairly rosy picture for 3D printing: cheap knee replacements, smarter surgery, new hearts, better-equipped Moon colonies, my own copy of Monet. But there's another side as well. One of the iron laws of new technology is that it will always be used for purposes of porn and violence. 3D-printed sex toys—I'll leave the products to your imagination—are now available on the Internet. And 3D-printed firearms, essentially invisible to airport security, are available to any zealot with a 3D printer. Illegal in most localities, including the U.S., but when did that ever stop anyone? And would you want your teenage neighbor to have the 3D specs for a nuclear device?

There are other social and economic implications. If I can download a file that allows my home 3D printer to replicate the most sophisticated of devices at negligible cost, whole industries are at risk. As a teenager, I read a prescient science fiction story (you can still find it on the Web) called "Business as Usual During Alterations." The story's premise was that aliens introduce technology allowing replication of virtually any product, in an attempt to destroy Earth's scarcity-based economy. This assault fails because the new technology, while eliminating the economies of scale underlying the 20th century industrial economy, unleashes human creativity and emphasizes diversity over uniformity. In the 21st century, we are the aliens, and the old industrial economy may well vanish, indeed is vanishing in front of our eyes. Already, for instance, there is a 3D shoe company in San Diego that produces individualized shoes—an exact fit, no more corns—on demand.

It's reasonable to ask whether 2D and 3D printing bear any real relationship to each other. One, after all, is all about words, the other about physical objects. But the written word and hand-made objects are both uniquely human constructs, claims on behalf of our tool-making, quasi-talking animal relatives notwithstanding. And remember the illiterate Indian peasant woman? If 2D printing reprograms the brain, what will 3D printing do? We may soon find out. Imagine a continent—let's call it North America—where every kindergartner is taught 3D programming along with reading and writing. Will that child's brain function differently than yours or mine? I imagine our view of the spatial environment changing a great deal.

Back to Gutenberg. The story has something of a happy ending. Three years after the sack of Mainz, Adolph von Nassau allowed Gutenberg to return to Mainz. Gutenberg was given the title of Hofmann (gentleman of the court) along with appropriate court dress, a stipend, and 2,000 liters of wine. Perhaps our friend Johannes died a happy man, if not a sober one. Perhaps—and I hope that this is the case for at least some contemporary leaders as well—von Nassau was not a flaming narcissist and actually understood who was the real big deal in 15th century Mainz. But that may be a very two-dimensional, overly optimistic, way of looking at things.


Thursday, May 25, 2017

Wrangel Island is a small, miserable place in the Arctic Ocean, a land where the temperature stays below freezing for 9 months per year. It has few permanent residents (a Russian weather station and a few park rangers), though the odd scientist drops in every now and then. Some 3,700 years ago, had you visited the place, you would have seen a strange sight. This is where the last woolly mammoths roamed, and died.

Their forebears had long since died out on the mainland, likely the result of an invasive species that had spread from Africa to Northern Eurasia and the Americas. Wherever that species (we call it "wise man" in its Latin formulation) went, large animals died with frightening suddenness. But Wrangel Island, separated from the mainland by rising waters some 12,000 years ago, and not exactly what one would consider prime real estate, remained a refuge for those last mammoths.

Elsewhere, human beings had already created the first civilizations and were writing epics in Akkadian. But on Wrangel Island, the woolly mammoths were declining. Species confined to islands frequently undergo reduction in size over time, a phenomenon known as insular dwarfism. Think of Shetland ponies, or the "Hobbits" found on Flores Island in Indonesia, or the dwarf mammoths of the Channel Islands off the coast of California. The Wrangel Island mammoths do not appear to have been insular dwarfs, but they were genetically stressed.

We've learned this as a result of genomic analyses comparing the Wrangel Island mammoths with their mainland ancestors, as recently published in PLoS Genetics. Small in number—perhaps as few as 300 breeding members in that sad remnant—they were severely inbred and had accumulated numerous genetic defects, with significant loss of heterozygosity, increased deletions affecting gene sequences, and increased premature stop codons. The paper's authors call it "genomic meltdown."

Their hair, for instance, was not the coarse dark hair of their woolly mammoth ancestors, but rather a soft cream-colored, satiny coat lacking an inner core and therefore deficient as an insulator. One can imagine them standing there shivering as Arctic blasts bore down on Wrangel Island. Did this lead to their eventual extinction, or did our ancestors, visiting in boats, do them in? We don't know for certain, though harpoons and other human artifacts on the island have been dated to ~1700 BC. For whatever reason, the Wrangel Island woolly mammoths went extinct. We will never see their like again.

Or maybe we will. George Church at Harvard has inserted woolly mammoth genes into Indian elephant cells using CRISPR technology. The list of edits is up to 45 genes, genes that would allow a tropical elephant to survive on the Siberian plains. Similar efforts are underway in Russia and Korea. Perhaps elephant-mammoth hybrids will walk the Earth in a few years. The Russians have already established a Pleistocene Park.

I feel conflicted about this. Part of me, the eternally curious 15-year-old part of me that loves gee-whiz science, finds this attempt at de-extinction (a neologism of the CRISPR age) a thoroughly enthralling prospect. The other part of me, rather like Jeff Goldblum's character in Jurassic Park, finds all this quite frightening.

And frightening, not for the reason stated by Goldblum's character ("they had their time, and their time is over" was the gist of the argument), but because of the technology itself. For with CRISPR technology, we are at an inflection point in human history, that point where humans intentionally, directly alter the genomes of insects, mammals, and fellow humans.

CRISPR technology has advanced rapidly in the past few years. I wrote about this in a previous blog, but let me briefly revisit the technology. CRISPR is a bacterial defense system that snips apart the DNA of invading viruses. The CRISPR/Cas9 system has been adapted to mammalian cells and allows one to subtract or insert highly specific gene sequences. If one does this with stem cells, one can permanently change the genetic makeup of an organism. It works, sort of, in human ova.
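
To give a flavor of what "highly specific" means, here is a toy Python sketch of the targeting rule an SpCas9-style system follows: a roughly 20-nucleotide guide sequence must match the DNA immediately adjacent to an "NGG" PAM motif, and the cut lands about three bases upstream of that PAM. The sequences and the function are made up for illustration; real guide design also scans the opposite strand and scores off-target matches.

    import re

    def find_cas9_cut_sites(genome, guide):
        """Toy SpCas9 target search: a 20-nt guide match followed by an NGG PAM.

        Returns predicted cut positions, ~3 bp upstream of the PAM.
        Illustration only; real tools check both strands and score off-targets.
        """
        sites = []
        for m in re.finditer(guide + r"[ACGT]GG", genome):
            pam_start = m.start() + len(guide)
            sites.append(pam_start - 3)  # Cas9 cuts ~3 bp 5' of the PAM
        return sites

    guide = "GATTACACATGATTACATGC"                  # 20-nt protospacer (made up)
    genome = "TTACG" + guide + "TGG" + "AAGTCCTGA"  # toy sequence with one target plus its PAM
    print(find_cas9_cut_sites(genome, guide))       # -> [22], the predicted cut position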

CRISPR/Cas9 has gone from obscurity to ubiquity in an astonishingly short time. Since the Doudna and Charpentier breakthrough paper in 2012, here are the yearly numbers of PubMed citations: three in 2012; 78 in 2013; 315 in 2014; 715 in 2015; and 1,465 in 2016. As I write this in mid-April, 689 CRISPR/Cas9 papers have already been published in 2017. The technology is cheap, easy to learn, and powerful to use. It is a true new paradigm, an overused term but appropriate for once.
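
Those counts translate into a startling growth curve. A few lines of arithmetic make the point; the 2017 figure is simply the mid-April count annualized, my own crude extrapolation rather than anything published.

    # Year-over-year growth in CRISPR/Cas9 PubMed papers, from the counts quoted above.
    counts = {2012: 3, 2013: 78, 2014: 315, 2015: 715, 2016: 1465}
    for year in range(2013, 2017):
        print(f"{year}: {counts[year] / counts[year - 1]:.1f}x the previous year")

    # Mid-April is roughly 29 percent of the way through the year, so a crude annualization:
    print(f"2017 pace: roughly {689 / 0.29:.0f} papers")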

The big questions right now are:

  • When will CRISPR's discoverers/inventors fly to Stockholm?
  • Which discoverers/inventors will get the Nobel?
  • Who will win the patent wars over the use of CRISPR/Cas9 in mammalian cells?

I don't know the answer to the first question, other than "soon," nor the second question (several folks have a claim), though the answer to the third question, according to a recent U.S. Patent and Trademark Office decision, is "the Broad Institute will get very rich for adapting other people's ideas." The lawsuits will probably end up at the Supreme Court.

Lots has been happening in this space of late that has nothing to do with patents. I've mentioned the plan to resurrect the woolly mammoth, but consider the mosquito. I hate mosquitoes, and I've never met anyone who speaks well of them. They seem to exist only to render us miserable. It is not enough that they ruin warm summer nights with their incessant buzz and their pernicious bites. They transmit malaria, a disease that afflicts tens of millions and kills many of those. Humans evolved, in Africa and the Mediterranean, to resist the ravages of malaria. But in a "the cure is worse than the disease" sense, this evolution left us with sickle cell disease and its hematologic relatives.

So why not get rid of malaria-transmitting mosquitoes? One can use CRISPR/Cas9 technology to create mosquitoes that are resistant to P. falciparum, the malarial parasite. That would be pretty cool. The problem is that if you dump a bunch of them into Sub-Saharan Africa, they will be outcompeted by the huge native mosquito population. In no time at all, the beneficial effects of the engineered population will dissipate and eventually disappear.

The scientific solution is called "gene drive" (a fine TED Talk explanation can be seen at https://www.youtube.com/watch?v=OI_OhvOumT0). Gene drive involves the use of natural "selfish" homing endonuclease genes to promote the inheritance of specific genes. And it turns out you can use CRISPR/Cas9-based gene drive technology to engineer in genes that render mosquitoes resistant to malaria. Or—and this has also been done—you can engineer in a gene that renders female mosquitoes sterile.
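
The arithmetic behind the "drive" is worth a moment. Under ordinary Mendelian inheritance a heterozygous insect passes an engineered gene to about half its offspring, so a rare allele stays rare; a gene drive copies itself onto the partner chromosome, so nearly every offspring inherits it. The toy Python simulation below makes the difference visible; the release fraction, conversion efficiency, and bare-bones population model are my own illustrative assumptions, not anyone's published calculations.

    def drive_frequency(p0, conversion, generations):
        """Deterministic allele-frequency recursion for a gene drive in a random-mating population.

        p0         -- starting frequency of the drive allele (released insects / total)
        conversion -- probability that a heterozygote converts its wild-type copy (0 = Mendelian)
        Ignores fitness costs, resistance alleles, and population structure: a sketch, not a model.
        """
        p = p0
        history = [p]
        for _ in range(generations):
            # Heterozygotes transmit the drive allele with probability (1 + conversion) / 2.
            p = p * p + 2 * p * (1 - p) * (1 + conversion) / 2
            history.append(p)
        return history

    release = 0.01  # assume engineered mosquitoes are 1 percent of the population at release
    for label, conv in [("Mendelian (no drive)", 0.0), ("Gene drive, 95% conversion", 0.95)]:
        freqs = drive_frequency(release, conv, generations=12)
        print(f"{label:28s}" + " ".join(f"{f:.2f}" for f in freqs))

With the drive switched on, the engineered allele climbs from 1 percent to essentially all of the population in about a dozen generations, and a dozen mosquito generations fit comfortably inside a single year.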

Released into the wild, these engineered mosquitoes could spread throughout the world, eradicating the parent mosquito stock. One calculation suggests that were you to dump enough of these engineered mosquitoes into Sub-Saharan Africa, you could wipe out virtually all of the malaria-transmitting mosquitoes in the course of a year. An inflection point in human history: using science to drive an infection-transmitting, if unloved, species to extinction and replacing them with kinder, gentler relatives, albeit ones that still bite you at your neighborhood barbecue.

The engineered mosquitoes are waiting patiently in their inventor's lab. They've not been released into the wild. Lots of ethics committees lie ahead. Others are studying ways in which we might use gene drive to eliminate invasive species and return entire continents to their pre-invasion ecologic purity.

Meanwhile, those who guard us against things that go bump in the night are cogitating madly away on this. Last year, the latest Worldwide Threat Assessment of the U.S. Intelligence Community listed genome editing as a potential bioterror weapon of mass destruction.

If you can engineer a mosquito tribe out of existence, what mad science CRISPR experiment might be used to, say, drive New England Patriot fans to the brink of extinction? I know, I'm sorry, bad example, but I'm sure you can concoct a more morally dubious use for the technology without great difficulty.


Does this sound too science-fictiony to you? Too much like the movie Gattaca, where the genetically engineered abuse the remnant normal human population? Too improbable? I'd like to think so, but this year's Intel Science Talent Search second-place awardee (with a $75,000 prize and an acceptance to Harvard) is an 18-year-old kid named Michael Zhang whose prize-winning science fair project is called "CRISPR-Cas9-based Viruslike Particles for Orthogonal and Programmable Genetic Engineering in Mammalian Cells." Just wait until next flu season. And I don't even want to know what the Talent Search's First Prize was for; Second Prize sounded scary enough.

The National Academy of Sciences (NAS) and the National Academy of Medicine have just released a thoughtful report on genome editing (you can, and should, see it at https://www.nap.edu/catalog/24623/humangenome-editing-science-ethicsand-governance). Not 2 years ago, a group of scientists that included CRISPR co-inventor Jennifer Doudna strongly discouraged the use of CRISPR technology on humans. The new report now gives a green light to genome editing for serious heritable medical conditions, under properly controlled, rigorous criteria. The same committee argued against genome editing for what they termed "enhancement," which of course is what I really want.

While we are still a bit away from human genome editing, from both a technical and a regulatory standpoint, the NAS report opens the door on a new era in human history. The Israeli author Yuval Noah Harari recently published a fascinating book called Homo Deus: A Brief History of Tomorrow. Homo Deus makes the case that humanism has become the new religion of mankind, and that for the first time in our history as a species we have almost God-like powers to alter our own existence. Homo sapiens, if it does not succeed in cleansing the Earth of itself through stupidity, may transform into something unimaginable. And if that happens it will look back on this decade as the turning point.

I'd still like to meet up with a woolly mammoth.


Monday, April 10, 2017

Usually it happens in some committee meeting, or an all-day conference. You've been there, I'm sure. A speaker gets up, and starts talking about a thoroughly uninteresting subject. Or maybe an interesting subject that he makes thoroughly uninteresting. He drones on, and on, and on, and at some point you cannot take it anymore. You are bored, painfully so, and you want to get up and run screaming out of the room, but of course you were bred for courtesy in the face of monotony. You drift into something approaching catatonia, eventually relieved by the shuffling of papers and the movement of chairs as those around you break for lunch. Life is short, and ennui has claimed more of your precious remaining hours.

What I find fascinating is that the concept of boredom is so new. The word did not exist in English until Charles Dickens published Bleak House (a novel where lawsuits drone on and on) in 1852. He must have loved the neologism, for he used the word six times. It caught on. But most languages now have a word for boredom: in French, ennui; in German, Langeweile; in Dutch, verveling. I am sure boredom existed before 1852, just like the world had color before the invention of the color TV, but why no word for the concept? Did it require the repetitive motions of the Industrial Age, that 19th century phenomenon? Was the world more interesting then? The past is a foreign country; they do things differently there, as L.P. Hartley wrote.

Boredom has a (small, pretty boring) medical literature. There is, for instance, a Boredom Proneness Scale (BPS), which I refuse to bore you with, but which allows one to quantitate one's degree of boredom. The BPS is, I read with no particular enthusiasm, in the process of being replaced by the Multidimensional State Boredom Scale. You might think that confining boredom to just one dimension would be a good thing, but apparently not.

Defining boredom is problematic, though psychologists have tried. The best scientific definition comes down to something like "an unfulfilled desire for satisfying action." Think of a caged lion, pacing back and forth for hours. Psychologists have defined four types of boredom: indifferent, calibrating, searching, and reactant. A recent advance in the field was to add a fifth type, apathetic boredom. Sometimes the researchers' ability to descend into self-parody is impressive: Can we make boredom studies more boring? Answer: yes, let's add apathetic boredom to the list.

Boredom researchers hold an annual conference at the University of Warsaw. I'll keep going to ASCO, thank you very much. There have been remarkably few boredom proneness studies inflicted on cancer patients. For what it's worth, treating terminally ill, bored patients (terminally bored?) with the antidepressant citalopram makes them measurably less bored.

Boredom has been studied in several neurologic and psychologic disorders. Patients with anterograde amnesia, of the sort experienced by Drew Barrymore's character in the movie 50 First Dates, rarely get bored. Everything is always new. Alzheimer's disease, surprisingly, appears associated with staggering amounts of boredom, at least in the early stage of the disease. Boredom is associated with (in no particular order) drug addiction, compulsive gambling, eating disorders, depression, alcoholism, ADHD, and poor grades. Indeed, a drug addict's ability to beat his addiction appears related to his degree of boredom. Traumatic brain injury patients are easily bored. Auto accidents are more common among the bored, which of course leads to brain injury.

Speaking of injury, boredom is particularly a problem in the workplace. While some souls prefer mindless repetitive activity, most of us are not wired that way. Faced with the prospect of placing widget A in hole B for 8 straight hours, an assembly line worker may drift off, with dangerous medical consequences: boredom is a major underlying cause of workplace injury. I don't see this happening often in the medical clinic, though a colleague started snoring in the midst of a particularly boring patient interaction. I'm sure this happens all the time to analysts.

Boredom has been studied, after a fashion, in animals. I don't need an animal psychologist to understand that dogs get bored, as any time spent around a sad-looking basset hound will attest. I know it is unwise to project human emotions onto our canine friends, but dogs are so finely attuned to human emotions that it would be surprising if they did not recognize, and mirror, our boredom. I've seen dogs start yawning with their masters, and even vice versa. Cats go ka-ka if not continuously stimulated by their human servants. There's always a danger in anthropomorphizing animal behavior, of course, but pets and farm animals sure act bored on a regular basis.

From an evolutionary standpoint, boredom seems to require some relatively high level of intellectual functioning. We cannot interrogate Fido with a Boredom Proneness Scale, so there's a paucity of formal animal boredom studies. Researchers at the University of Guelph in Ontario published one interesting experiment in which they housed two groups of captive mink in either "enriched" or bare, sparse, boring cages. They then approached both groups with new stimuli, including ones that mink normally find scary. The mink from the unenriched cages were quicker to approach new objects, even the scary ones. Objectively speaking, they get bored and seek stimulation.

Boredom clearly has a developmental aspect. Anyone with kids recognizes the one- or two-year-old's delight with the world, the absolute joy and sense of wonder they experience with every new thing. In contrast, teenagers are the world champions of boredom. Particularly around their parents and teachers, whom they seem to think were put on this earth to bore them to death. Older folks (well, my age) report lower levels of boredom. I'm so over that.

"Bored to death," by the way, is literally true. The British Whitehall II study asked civil servants questions about the degree of boredom they experienced. Two decades later, the most bored civil servants experienced the highest death rates. Bored people are meaner, too: if asked to recommend a prison sentence for a criminal, bored pseudo-jurors recommend harsher punishments: "Throw the book at him, your honor, I'm suffering from ennui." And people who are bored at the Department of Motor Vehicles are less likely to register as organ donors. On the other hand, since bored people are more likely to perform risky behaviors, they're more likely to become organ donors. Should transplant surgeons hold dance competitions at their local DMV? Or not? Public policy questions are always so tough.

Psychologists have started to scan the brains of the bored. They first have someone watch a video known to induce boredom, and then perform a functional MRI scan. The bored have specific changes in the default mode network area of the brain. To quote the 2016 paper in Experimental Brain Research, "Boredom represents a failure to engage executive control networks when faced with a monotonous task—in other words, when the task demands some level of engagement, but is so mundane that attempts to do so fail."

Nice to know that my boredom has physiologic correlates. Usually, when I am in the midst of a boring all-day meeting, my mind wanders, often so far afield that new ideas start popping into my head. I scribble down notes: to do lists, ideas for blogs, research projects, you name it. Boredom, I suppose, can be the starting place for creativity. There is, of course, a literature attached to this phenomenon. If you randomize a group either to a boring task (copying phone numbers out of a telephone book) followed by a creative task (finding uses for plastic cups), or to the creative task without any preceding period of boredom, the induced boredom group is more creative.

Anyways, my attention is wandering, so I think I'll stop writing and go see that latest Hollywood big-budget blow-'em-up extravaganza. You know, the one where the aliens train a boredom ray on the earth and we're saved by a two-year-old who happens to have gotten ahold of a nuclear missile.


Tuesday, February 28, 2017

Looking through my files, I find that I gave 13 lectures in 2016, or just over one a month. Using that number as a rough annual average, this would imply somewhere over 400 lectures during the course of my career, though this could be off by 100 or so, plus or minus. Each of those lectures involved several hours of prep time, and some few involved tens of hours of preparation, so I've devoted a fair amount of my life to their creation and presentation.               

My colleagues who teach "real" medical school courses would scoff at these efforts, of course, but some of these talks were defining moments of my career, and therefore important to me, even if others barely noticed. Being an academic physician means many things, but one commonality involves flying somewhere, standing behind a lectern, and proceeding through a PowerPoint presentation.

In doing so, we represent part of a long chain that stretches back centuries. The very first medical school, the Schola Medica Salernitana, dates to sometime in the 10th century. While very little is known about its origins, at its peak between the 10th and 12th centuries the Schola Medica Salernitana was the center of learning for medicine in the Western world. Legend has it that the school was founded by a Greek, a Jew, a Latin, and a Muslim. Medical schools have always been cosmopolitan places, and Salerno, located on the Tyrrhenian Sea in the south of Italy, stood at a crossroads between civilizations.

Constantine Africanus, born in North Africa and a convert to Christianity, came to Salerno in 1077 and translated ancient Arab medical texts—themselves translations from even older Greek texts—into Latin, forming the basis of the curriculum in Salerno. In the image below, he is lecturing to 11th-century medical students.


Medical students often don't bother to attend lectures these days, due to the ubiquity of note-takers and the filming of talks. This is nothing new: some of our first manuscripts from Salerno represent notes taken by students and then sold on to others. Skipping class also has a long, if undocumented, lineage.

Some of the most boring hours of my life were spent sitting in lecture halls. There are few things worse on this earth than a turgid, remorselessly dull lecture. Hell, if it exists, is probably full of lecture halls. How much did you learn from years of lectures, compared to the number of hours spent listening to them? The return on investment of time is undoubtedly low, which is why those who study knowledge acquisition rarely have kind things to say about medical pedagogy.

But things could be worse. In fact, they were. Throughout much of the Middle Ages, anatomy classes involved three individuals: a lector, who sat on a pulpit and read from an anatomy text; an ostensor, who pointed to the body parts being dissected; and a sector, who did the actual dissections. The problem, aside from the boring bit about being read to from a Latin anatomy text, was that the text was wrong. Galen, the ancient Roman father of anatomy, had done his work on animals, not humans.

Andreas Vesalius, the first modern anatomist, who taught at Padua, was important not only because he dissected humans, but because he got rid of the tripartite lector-ostensor-sector thing, lecturing as he dissected. In the image below, he is surrounded by his students.


Well, they won't let me bring tumors to my talks. Instead, we use the modern paraphernalia of lectures. The speaker stands behind the lectern (from the Latin lectus, past participle of legere, "to read"), which in turn is situated on a podium (from the Greek pod- for foot, since you are standing on it), a raised surface that allows the speaker to look down at the audience. The lectern serves several purposes: the speaker can place notes there, he can grip the lectern for support, and (if broad enough at the base) it can hide the knocking knees of first-time speakers.

Lecterns have their own dangers. At one ASCO event, the lectern was placed near the back of the stage, for some reason. One speaker fell off the dais, saving the audience (statistically speaking) 15 minutes of boredom.

For those speakers who prefer not to hide behind the lectern, the conference organizers may offer the use of a lavalier mike, a small microphone clipped to the tie or coat, attached to a transmitter.

Why is it called a lavalier microphone? The term comes from the jewelry business, where a lavalier is a pendant suspended from a necklace. And why do jewelers call it a lavalier? In memory of the Duchesse Louise de la Valliere, mistress of King Louis XIV of France. She bore him five children and ended her life in a convent, but before that she made the eponymous jewelry popular. I like to think, as a regular user of the lavalier mike, that Louise did not take a vow of silence when she entered the convent. Or maybe she did, if she lectured the King using PowerPoint presentations.

I am dating myself, but I remember when slides were slides. Slide shows have their own history. The first slide shows were in the 1600s when hand-painted pictures on glass were projected onto a wall using a so-called "magic lantern". These, in turn, became the "lantern slides" used by 19th century speakers. By the early 20th century photographic images replaced painted images on glass, and then in 1936 the introduction of 35mm Kodachrome film led to the standard 2 x 2-inch slides that I used as an assistant professor.

These were, in contrast to current practice, "real" slide shows. I have difficulty making oncology fellows who grew up on PowerPoint understand what these involved. Today, speakers work on their talks right up until they walk on stage, but in my youth one prepared talks weeks in advance. You had to. One took carefully typed pieces of paper or pictures to the Medical Illustration department, where they were photographed and slides prepared, at an often-substantial price. Towards the end of the 1990s I gave a plenary lecture at ASCO that was projected on five different screens, requiring five sets of slides. The data came in late. I remember it costing $600.

These slide shows could be adventures. If the slide projector overheated the slides might warp or melt. Or, if the Kodak carousel top came loose, your entire talk might fall out. I saw this happen once. The speaker picked up the slides, placed them back in the slide carousel in no particular order, and gave his talk in no particular order. Hilarity ensued. Ah, those were the days.

My plenary session talk was virtually the last one I gave using "real" slides. Shortly thereafter PowerPoint presentations became standard, almost simultaneously with the internet. Now I drag images off the web (someone somewhere on this earth has already made the point I want to make), never visit Medical Illustration (if they still exist), and go on my happy way.

PowerPoint has its detractors. PowerPoint templates lead to the dreaded "Death by PowerPoint" talks with their unimaginative, sterile, repetitive slides that pollute Grand Rounds around the world. Microsoft has sold hundreds of millions of copies of PowerPoint software, which in turn have led to trillions of slides.

Was this what Constantine Africanus had in mind when he first got up to speak at the Schola Medica Salernitana? I don't know, but I suspect, given the rate of technologic change, that later 21st century lecturers will tell their junior colleagues that "when I was an Assistant Professor we used 'real' slide shows, something called PowerPoint."

How will they lecture? Some three-dimensional extravaganza? Direct porting of knowledge to embedded cerebral implants? Will lecture halls disappear entirely, turned into office space or new laboratories? Will all the talks be virtual, rather than in-person? But that's a talk for another time.


Tuesday, October 25, 2016

Every now and then I see an article trumpeting "the oldest" this or that—sometimes human, sometimes a non-human vertebrate, sometimes a tree. The "oldest human" competition is one you never actually want to win. It seems something akin to "longest stretch on death row": an achievement, no doubt, but not one that you would feel comfortable owning.

Still, these stories have their own charms. Recently, some Danish biologists anointed the Greenland shark as "the oldest living vertebrate." Greenland sharks, as their name suggests, hug the coast of Greenland in the North Atlantic and Arctic Ocean. Why they do not, like Canadians in January, migrate to the Florida Keys is a mystery. Until quite recently, science knew little about them: they are quite long (up to six meters), they grow at a glacial pace (a few centimeters per year at most), and they swim quite slowly (1.22 km/h). Katie Ledecky would have no trouble swimming faster than your average Greenland shark.

In an attempt to get at the age of the Greenland shark, investigators carbon-14 dated the sharks' eye lenses. Why the lens? Because Greenland shark lenses do not change after birth. To their very great astonishment, the oldest of the sharks came in at an estimated 392 +/- 120 years. There may be Greenland sharks swimming in the Arctic that were born around the same time as William Shakespeare.

An interesting side note: Greenland sharks have high concentrations of trimethylamine oxide and urea in their flesh. The trimethylamine oxide (an adaptation to living in deep waters) is converted to trimethylamine when you eat a Greenland shark, and can make you quite drunk. Kæstur hákarl, an Icelandic delicacy made from the Greenland shark, smells like urine and has been described by Anthony Bourdain as "the single worst, most disgusting and terrible tasting thing" he had ever eaten. Also, Greenland sharks don't reach sexual maturity until they are 150 years old. Or maybe they just can't stand the smell of each other.

So, to summarize: if you are a Greenland shark you live in the darkest, coldest water on the planet, you smell bad, you move very slowly, and you cannot get a date until midway through your second century. Why would you want to live four centuries? Because you can, apparently.

The World's Oldest

The "oldest living" list for animals includes: tortoise (250 years), lizard (117 years), and koi (226 years). Vertebrates don't match up all that well against invertebrates, alas: a clam named Ming lived 507 years, and would have lived longer except that, as a delightful USA Today headline stated, "Scientists accidentally kill world's oldest animal."

Note to old folks: stay away from them scientists.

Trees do even better than animals in "the oldest living" competition. The winner for "oldest tree" is a Norway spruce named "Old Tjikko" that lives in the harsh terrain of Sweden's Dalarna province. It clocks in at 9,550 years, though it is something of a cheat for "world's oldest" in that it is a vegetatively cloned tree. The tree we see is relatively young (the tree trunk only lives for 600 years or so) and is a sprout off the ancient root system. Old Tjikko is named after the dead dog of its geographer discoverer, not that it matters.

A Bosnian pine holds the European record for non-clonal trees at ~1,075 years old. The oldest non-clonal tree in the world is said to be a bristlecone pine tree from California's White Mountains named Methuselah that is around 5,000 years old. There was an even older one named Prometheus that (to quote a Mental Floss article on old trees) "was accidentally felled by a scientist who didn't realize the tree was as old as it was." I'm beginning to see a pattern here: per William Wordsworth, "We murder to dissect."

Forever Young?

What all of these "world's oldest" seem to have in common is that they live in the world's barren places: mountains, deserts, the bottom of the Arctic Ocean. Not much grows there, and what grows does so at a leisurely pace. A Nature paper from 1999, entitled "Ancient stunted trees on Cliffs," found that vertical cliffs around the world "support populations of widely spaced trees that are exceptionally old, deformed and slow growing." These trees have radial growth of less than 1 mm per year.

Stunting your growth, then, is a good thing if you want to live forever. That, at least, seems to be the lesson from natural history studies. But it is also true at the mammalian end of the spectrum. Starve mice of calories and they live significantly longer.

Caloric restriction is probably a good idea for humans as well, though it is unlikely to double one's survival, let alone allow a Methuselah-like existence. I've written, in previous blogs, about anti-senescence drugs (senolytics) and blood transfusions from the young as ways of warding off death. The latter derives from experiments in which a young mouse and a senescent mouse are sewn together, a process known as parabiosis, with resultant de-aging of the old guy. Now, it seems, such approaches are becoming popular among the wealthy.

Peter Thiel, a Silicon Valley tech investor best known for speaking at Donald Trump's Republican National Convention earlier this year, is an investor in Ambrosia. Ambrosia is a Monterey, Calif., company that has started a trial called "Young Donor Plasma Transfusion and Age-Related Biomarkers." As trials go, it is unusual: you pay the company $8,000 to participate. For anyone between the ages of 35 and 80, participation involves receiving plasma from donors ages 16 to 25.

I looked up Ambrosia's trial (NCT02803554, in case you are interested) on ClinicalTrials.gov. The trial design is pretty simple: draw some blood, infuse plasma, draw some more blood a month later.

Lots of blood. The blood drawn is examined for biomarkers of aging, including—and this is only about a third of what is being tested—Interleukin-5, Interleukin-6, Interleukin-7, Interleukin-8, Leptin, Macrophage Inflammatory Protein-1 alpha, Macrophage Inflammatory Protein-1 beta, Macrophage-Derived Chemokine, Matrix Metalloproteinase-2, Matrix Metalloproteinase-3, Matrix Metalloproteinase-9, Monocyte Chemotactic Protein 1, Myeloperoxidase, Myoglobin, Neuron-Specific Enolase, Plasminogen Activator Inhibitor 1, Prostate-Specific Antigen, Free, Pulmonary and Activation-Regulated Chemokine, Serum Amyloid P-Component, Stem Cell Factor, T-Cell-Specific Protein RANTES, Thrombospondin-1, Thyroid-Stimulating Hormone, Thyroxine-Binding Globulin, Tissue Inhibitor of Metalloproteinases 1, Transthyretin, Tumor Necrosis Factor alpha, Tumor Necrosis Factor beta, Tumor necrosis factor receptor 2, Vascular Cell Adhesion Molecule-1, Endothelial Growth Factor, Vitamin D-Binding Protein, von Willebrand Factor.

You pretty much need the plasma infusion to survive the bloodletting. The clinical trialist in me says that the next time I drive to Monterey it will be to visit the aquarium. And would shooting up with a 20-year-old's plasma really be all that useful? What if the plasma you needed was from a 6-month-old? Ambrosia is looking to recruit 600 patients: we'll see how they do.

Ambrosia, you will remember, was the food of the gods in ancient Greece. Our modern gods appear terrified by the prospect that their billions will not suffice to render them similarly immortal, hence their interest in anti-aging technology. Meanwhile, another Silicon Valley company, Alkahest, is performing the PLASMA (PLasma for Alzheimer SymptoM Amelioration) Study with neurologists at Stanford. According to ClinicalTrials.gov, this one has more modest aims, being primarily a feasibility study with a few dementia metrics thrown in as secondary endpoints. Let's hope it works. I'm burning through cortical neurons at an alarming rate.

Anyways, modern science is now in hot pursuit of anti-aging strategies. Whether the pursuit of Greenland sharkdom or Ming the Clam or Methuselah the bristlecone pine is worthwhile I will leave to your philosophical imagination. I tend to side with Gilgamesh of Uruk on this:

What you seek you shall never find.

For when the Gods made man,

They kept immortality to themselves.

Fill your belly.

Day and night make merry.

Let Days be full of joy.

Love the child who holds your hand.

Let your wife delight in your embrace.

For these alone are the concerns of man.