Musings of a Cancer Doctor
Wide-ranging views and perspective from George W. Sledge, Jr., MD
Monday, June 23, 2014

I recently read Dave Eggers’ brilliant and frightening novel The Circle, his modern take on Orwell. In 1984 the totalitarian state was the enemy of freedom, a hard tyranny motivated by the state’s desire to crush all opposition. Eggers thinks we are headed for a much softer tyranny, one dominated by do-gooder Silicon Valley business types determined to eliminate—indeed, undermine the justification for—any attempt at privacy. 1984 has come and gone; the world of The Circle careens towards us at high speed.

 

The sphere of privacy continues to shrink. If it even is a sphere anymore: sometimes it seems shriveled to a point, no longer three-dimensional, almost a bodiless mathematical concept. Hardly a day goes by without some new assault, or potential assault, finding its way into the news. Some examples come from the scientific literature, and some from current politics, but the vector is always the same: towards an era where privacy ceases to exist.

 

Much of this is technology-driven. That technology affects privacy rights is nothing new. Previous generations had to deal with wiretapping and listening devices, after all. But this seems both quantitatively and qualitatively different.

 

Here are a few I’ve collected in the last year. What impresses is the sheer variety of attacks on privacy, coming from every degree of the compass:

 

·    An article in PNAS demonstrated that your Facebook "likes" can do a pretty good job of defining your sexual orientation (85% predictive ability for male homosexuality), race, and political identity (a toy sketch of how such prediction works follows this list).

 

·    The FBI recently announced that it had deployed drones on American soil four times in recent years. Police departments are considering doing the same in American cities, supplementing the now ubiquitous CCTV cameras.

 

·    A group of Harvard engineers recently created a quarter-sized “fly”, a robotic insect capable of controlled flight (for a first look at the future, see http://www.youtube.com/watch?feature=player_embedded&v=cyjKOJhIiuU). Give it some “eyes”, deploy a few hundred, and then imagine some hovering outside your bedroom window. Flutter, flutter, buzz, buzz. Is that a mosquito I just slapped, or some investigator's eyeballs?
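For the technically inclined, here is a toy of how “likes” become predictions. As I understand the PNAS group’s methods (Kosinski et al.), they reduced the user-by-like matrix with singular value decomposition and fed the components to logistic regression; everything below -- the users, pages, and trait -- is invented for illustration, a sketch rather than their pipeline.

```python
# Toy trait prediction from "likes": SVD components + logistic regression.
# All data here are synthetic; this only illustrates the shape of the approach.
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
likes = rng.integers(0, 2, size=(200, 50))          # 200 users x 50 pages; 1 = "liked"
trait = (likes[:, :5].sum(axis=1) > 2).astype(int)  # fake trait correlated with 5 pages

components = TruncatedSVD(n_components=10, random_state=0).fit_transform(likes)
model = LogisticRegression().fit(components, trait)
print(f"training accuracy: {model.score(components, trait):.2f}")
```

Scaled up to millions of real users and tens of thousands of pages, something like this recipe produced the accuracies the paper reports.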

 

And while I could not exist anymore without Google, the company has been the great destroyer of privacy. If you want to know someone's life history, it is probably out there somewhere on a server farm. Try a little experiment: Starting with a colleague's name, see how much personal information you can find in an hour. Next, imagine how much of that information you could have found in an hour a decade or two ago. You would be amazed.

 

Maybe that's a good thing: it depends on who's searching, and why. If you don’t want to date a convicted felon, going online is a good way to avoid that particular mishap. But it is, of course, much more than just that. Every little thing about you can end up out there. When I was a teenager we used to laugh at school administrators who claimed that some minor infraction would “become part of your permanent record.” Not anymore: the record is not only permanent, it is global. In some important way there are no second chances anymore.

 

Google (don't be evil) is responsible for other transgressions. We now know that Google StreetView collected more than pictures of your front door: they went behind the door to collect passwords, email, and medical and financial records, as the company admitted in court. The court fined the company a half-day of its profits, which I suppose tells you how much the courts value your privacy. Google also promised to go and sin no more, which I know will reassure us all. And they will now require employees to take the computer industry equivalent of a HIPAA course. All right, I admit, cruel and unusual punishment.

 

These Internet assaults are layered on top of other privacy intrusions: ubiquitous CCTVs following you down city streets, your cell phone movements describing where you have travelled with great precision, your credit card history demonstrating a financial phenotype, your checkout at the grocery store generating targeted advertising on the back of the receipt, and your web history now the technological equivalent of a trash can, to be sorted through by private investigators digging for dirt. Sometime in the last decade privacy became a fossil word. The expectation of privacy is disappearing from public discourse.

 

About the only data that doesn't want to be free is personal health information, and that only because accessing it is a felony. How long before technology effectively eliminates medical privacy? For the moment, at least, medicine remains Privacy Island. Sort of, anyway: all around us the sharks circle, waiting to strike. Don’t wade out too far from shore.

·    An article in Science showed that one could, by merging publicly available “de-identified” data from the 1000 Genomes project with popular online genealogy data, re-identify the source individuals.

·    If a patient accesses a free online health website for information on “cancer” or “herpes” or “depression”, according to Marco Huesch of USC, that information will, on average, be shared with six or seven third-party elements.

 

I used to think that conspiracy theorists were crackpots, with their ideas of the government tracking us via RFID-encoded greenbacks through airport security checkpoints. It turns out that the crackpots were right, though wrong about the purveyor (the TSA? Seriously? The dysfunctional guys who still can't stop people from carrying box cutters through security checkpoints?).

 

It’s much closer to home than that, as I’ve recently discovered. Hospitals are taking a deep dive into something called RTLS, or Real-Time Location Services. I found out about this technology (as usual, I am the last to know) through service on a committee overseeing the building of a new hospital. RTLS places RFID tags on every large object (in RTLS-speak, “assets”) as well as on patients and hospital employees. It can then localize all of these to within three feet, via carefully placed receivers scattered throughout the building. Several companies (Versus, CenTrak) have sprung up to commercialize this technology.
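How does a scatter of receivers produce a three-foot fix? The vendors' methods are proprietary, but the generic idea is trilateration: range the tag from several receivers at known positions and solve for the point most consistent with the ranges. A minimal sketch with made-up coordinates, assuming simple distance measurements rather than whatever Versus or CenTrak actually do:

```python
# Toy trilateration: estimate a tag's (x, y) from distances to known receivers.
import numpy as np

def locate_tag(receivers, distances):
    """Least-squares position from receiver coordinates and ranged distances (same units)."""
    (x1, y1), d1 = receivers[0], distances[0]
    A, b = [], []
    for (xi, yi), di in zip(receivers[1:], distances[1:]):
        # Subtracting the first range equation from the others linearizes the problem.
        A.append([2 * (xi - x1), 2 * (yi - y1)])
        b.append(d1**2 - di**2 + xi**2 - x1**2 + yi**2 - y1**2)
    pos, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return pos

# Three ceiling receivers (feet) and noisy distances to a tagged infusion pump:
receivers = [(0.0, 0.0), (30.0, 0.0), (0.0, 20.0)]
distances = [13.0, 21.4, 14.9]                  # invented readings
print(locate_tag(receivers, distances))         # ~ (10.2, 8.7): within a few feet
```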

 

These devices have real uses. They keep large, expensive pieces of equipment from walking out the door, or at least allow you to track which door they walked through. Recently a hospital in Munster, Indiana cared for the first U.S. case of MERS. Because hospital employees were tracked through Versus RTLS technology, the hospital was able to hand CDC investigators a list of everyone who had come in contact with the patient. Sounds great, right? An epidemiologist’s dream.

 

Perhaps. The cynic in me says that sometime in the near future some hospital administrator will use the technology to identify, and chastise, employees who spend too much time in the bathroom. Administrators just can’t help themselves.

 

RTLS is part of the larger “Internet of Things”, or IoT as it has come to be called. A recent article in The Economist suggests that by 2020 some 26 billion devices will be connected to the Cloud. Consider IoT’s potential. Will I end up with a brilliant toaster, one that knows exactly how long to brown my bread? A mattress that diagnoses sleep apnea? A lawn mower whose handles send my pulse and blood pressure to my internist? A microwave oven that refuses to zap hot dogs when I go over my fat gram budget for the day? Maybe I’ll be healthier. I’ll certainly be crankier: the soft tyranny of the IoT will drive me off the deep end in a hurry.

 

The bottom line, though, is this: we are entering an era of ubiquitous monitoring, and it is here to stay. I wish I were (just) being paranoid. Here are the words of a recently released government report entitled Big Data: Seizing Opportunities, Preserving Values: “Signals from home WiFi networks reveal how many people are in a room and where they are seated. Power consumption data collected from demand-response systems show when you move about your house. Facial recognition technologies can identify you in pictures online and as soon as you step outside. Always-on wearable technologies with voice and video interfaces and the arrival of whole classes of networked devices will only expand information collection still further. This sea of ubiquitous sensors, each of which has legitimate uses, make the notion of limiting information collection challenging, if not impossible.”

 

Discussions on privacy are frequently couched in terms of crime or terrorism: wouldn't you prefer to give up another bitty little piece of your soul, a loss you would hardly notice, to avoid another 9/11 or even an armed robbery? The security state, adept at guarding or criminalizing the release of information harmful to its reputation, cannot imagine its citizens deserving or even wanting the same protections. One is reminded of Ben Franklin’s prescient statement: “They who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety.”

 

But government intrusions are becoming almost irrelevant. I doubt the TSA or NSA or FBI or CIA or any other of the alphabet soup of government agencies really cares very much about my Internet history. What worries me, rather, is the Pacific Ocean of data being collected, and the certainty that it will be shared with, well, everyone. Remember the words of that government report? “This sea of ubiquitous sensors, each of which has legitimate uses, make the notion of limiting information collection challenging, if not impossible.” Challenging, if not impossible: I’ll opt for impossible.

 

And will we care? I doubt it, so long as the data theft remains quietly in the background, being used for purposes no more nefarious than designing advertising strategy or predicting sales.


Monday, May 26, 2014

Over time I have accumulated numerous defects. When I was four years old I hypothesized that I could balance on top of a large beach ball. I was wrong, and in proof of the null hypothesis received several stitches over my left eyebrow. When I was in fifth grade I broke a finger playing football. Six weeks later the splint came off. I went outside that same afternoon, played football, and broke another finger on the same hand. My GP looked at me with mild disgust and said, "You clown," having used up all his compassion with the first fracture. The following year I assaulted a plate glass window with my left knee, leaving another scar.

 

In high school I spent a day helping my dad put shrubs and trees--a veritable forest of shrubs and trees--into our back yard, after which my back went into spasm. I have never had it imaged, but I am pretty sure I herniated a disc. Every two or three years it reminds me of that day in my teens. Joyce Kilmer obviously never spent an afternoon planting trees, or he would have found them less charming. 

 

I was blessedly free of further physical insults for nearly a decade. In my fellowship I had a varicocele ligation: more scars. Early on as an Assistant Professor I found that I had lattice degeneration, a precursor lesion for retinal detachment. I was told not to worry about it unless I found myself going blind suddenly, and not to play football without a helmet. I haven't gone blind yet, but my football days are over. On a recent exam I was told that I had early cataracts. And, by the way, I do not appreciate being labeled a degenerate by an ophthalmologist.

 

And don't even get me started on my teeth. A bunch of my adult teeth never came in, a condition known as hypodontia. I kept several of my baby teeth well into my 40s before they all eventually fell out, to be replaced over time with dental implants. According to Wikipedia, hypodontia is statistically linked to epithelial ovarian cancer, which I guess is interesting. As to etiology, Wikipedia says “the condition is believed to be associated with genetic or environmental factors during dental development.” Duh. Anyway, some defects you accumulate, some you are born with.

 

I've accumulated a number of benign neoplasms and the like: nevi, warts, a few small colon polyps (resected times 2 per colonoscopy). I think I have an early actinic keratosis, some lichen planus, and a few other nonthreatening-looking skin nodules that I am sure are named after someone or have some Latin name known only to dermatologists, their verbal equivalent of a secret handshake.

 

A couple of winters ago, on Christmas Eve, I ran into a pillar in the foyer of my house: more stitches, this time above my right eyebrow, more scar tissue. Shortly after that I caught my heel on a splinter while climbing up the basement stairs. I hobbled around for a week or two, self-medicating, until I had sufficient scar tissue to regain unhampered mobility. 

 

Of course it's the defects you accumulate internally that you worry about. Once you get past the skin or gastrointestinal mucosa it's hard to detect them with ease. My cholesterol is up there without a statin, and my glucose level has suggested from time to time that I may be prediabetic. Coronary arteries, carotid arteries, pancreas? Which one is lying in wait to do me in? The problem with getting older is that you take increasingly frequent journeys to Comorbidity Island, accumulating defects like some crazy recluse collects cats.

 

Below the tissue level, the medical oncologist in me always worries about the accumulation of genetic defects. Our DNA is under constant assault: one paper I read, looking at a mouse model, estimated somewhere in the range of 1500 to 7000 DNA lesions per hour per cell. Most of them don’t take: we’re fitted out by nature with excellent DNA damage repair mechanisms.

 

Still, it gives you pause. Some of those mutations slip through. Over time they add up. Some combination of mutations in some cells eventually results in cancer, as we all know. But there is also a theory of aging (there are lots of theories of aging) implicating the progressive accumulation of genetic defects for the loss of organ function as we grow older. Take your muscles, for instance. The older you get, the more mitochondrial DNA mutations you accumulate. At some point, muscle fibers lose cytochrome C oxidase activity due to these mutational defects. Your muscles rot.

 

This accumulation of mutational defects is quite organ-specific, at least in mice: Dolle and colleagues, writing in the PNAS over a decade ago, showed that the small intestine accumulated only point mutations, whereas the heart appears to specialize in large genome rearrangements. They spoke of “organ-specific genome deterioration,” a term that does not give me warm and fuzzy feelings.

 

And these mutations start accumulating at a very early point in life. Li and colleagues at McGill studied identical twins in a recent Journal of Medical Genetics paper. Identical twins are that great natural experiment that allows us to ask “just how identical is identical?” Looking at genome-wide SNP variants in 33 pairs of identical twins, they estimated a mutation frequency of 1.2 × 10−7 mutations per nucleotide. This may not seem like much, but given three billion base pairs in the human genome, they concluded “it is likely that each individual carries approximately over 300 postzygotic mutations in the nuclear genome of white blood cells that occurred in the early development.”
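The arithmetic behind that conclusion is easy to check. A back-of-the-envelope version, with rounded figures of my own:

```python
# Rough check of the twin-study estimate (my rounding; the paper's accounting is more careful).
mutation_rate = 1.2e-7   # postzygotic mutations per nucleotide (Li et al.)
genome_size = 3.0e9      # approximate base pairs in the human nuclear genome

expected = mutation_rate * genome_size
print(f"~{expected:.0f} expected postzygotic mutations")  # ~360, consistent with "over 300"
```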

 

Our cells are all potentially mutinous traitors, held in check only by the determined efforts of the molecular police force. We carry the seeds of our ultimate demise from birth.

 

Well, that’s all very cheery, you say. We all know that we grow old, and then we die, and the growing old part is only if we are lucky. This isn’t exactly late-breaking news. Applying a molecular mechanism to mortality is interesting, but my hair is still getting grayer by the day. Do something!

 

And something cool they are doing. Villeda and colleagues just published a paper in Nature Medicine looking at cognitive function in old mice. Mice, like humans, get duller as they get older. But give the old mice a transfusion of plasma from young mice, and you reverse age-related cognitive decline. This has a somewhat vampirish sound to it: the old guy rejuvenating after sucking the life force out of some sweet young thing. But apparently Hollywood got it right.

 

Before I start lining up medical student “volunteers” for blood transfusions, can we drill down a bit on what is actually happening? Two new papers in Science suggest that the “young blood” factor responsible for the cognitive improvement is GDF11, a circulating member of the TGF-β family. GDF11 increases neurogenesis and vascular remodeling in the brain. It also reverses cardiac hypertrophy in old mice. Unsurprisingly, the lead investigators are already speaking to venture capitalists about commercializing the findings.

 

It’s always a leap from mouse to man, and certainly a danger to assume that a single molecule will be a sovereign cure for all our ills. But I’ll sign up for the Phase II trial. I could use some neurogenesis right now.

 

Nothing in life is free, of course. The dentate gyrus, part of the hippocampus, generates neurons throughout adult life, and these adult neurons are incorporated into pre-existing neural networks. That’s a good thing. But it comes at a price: a paper just published in Science also shows that adult hippocampal neurogenesis promotes forgetting.

 

So, the dilemma we may face a few years from now: you are starting to lose your memory. You take GDF11 or whatever the very expensive synthetic equivalent is called in 2024. It promotes neurogenesis and you get a new lease on life. But you forget your spouse’s birthday in the process, and maybe some of what makes you “you.” Is the new “you” really you any more?

 

One is reminded of the Ise Grand Shrine in Japan. This beautiful Shinto wooden temple is rebuilt to the same design every 20 years, using trees from the same forest. Is the 2014 temple the same temple as the 1014 temple? Last year represented the 62nd time the temple was rebuilt. Ise is forever new, forever ancient. Maybe we will be too.

 

But I doubt it. We’ll still keep accumulating defects. And, being a drug, GDF11 will do something bad and unexpected. But it should be fun to find out.


Monday, May 12, 2014

In my last post (4/10/14 issue) I reviewed recent advances in non-cancer genomics, particularly the fun stuff involving Neanderthals and human migration. Genomic studies are rapidly filling in important blanks in human macrohistory, blanks for which there are no written records and only the scattered bony remains of long-dead ancestors.
 
In this post I'd like to discuss some microhistory, specifically the history of human cancers. We are writing this history as we speak, with almost every issue of Nature and Science and Cell and their lesser relatives. This re-write of microhistory, like that of macrohistory, is the glorious outcome of a decade of falling prices for genomic analysis. While we don't know everything yet--when do we ever?--we are accumulating so much new data at such a rapid pace that every current cancer textbook is seriously out of date the moment it is published.

 

The last year has given us both 20,000-foot views of the genome as well as up-close-and-personal disease-specific stuff.

 

Two excellent papers published last year looked at the “20,000-foot view” of cancer. Writing in Nature, Cyriac Kandoth and colleagues in The Cancer Genome Atlas (TCGA) project looked at point mutations and small insertions/deletions across 12 tumor types (2013;502:333-339), finding 127 significantly mutated (i.e., driver) genes overall; most cancers have two to six drivers on average. Some of these (particularly transcriptional factors/regulators) tend to be tissue-specific, and some (such as histone modifiers) are relatively more ubiquitous.

 

Some mutations (for instance, NRAS and KRAS) seem mutually exclusive: overall there are 14 mutually exclusive pairs. Mutational pairings are common (148 co-occurring pairs), but some pairings are disease-specific (IDH1 and ATRX in glioblastoma, TBX3 and MLL4 in lung adenocarcinoma). There’s probably some interesting biology going on related to these pairings and non-pairings--matches made in hell, perhaps. Some mutations are tumor-specific (NPM1 and FLT3 in AML), but most are not.

 

Lawrence and colleagues at the Broad Institute performed a similar analysis across 27 tumor types in an attempt to examine genomic heterogeneity. Their analysis, published in Nature last year (Nature 2013;499:214-218), focused not on driver mutations per se, but on overall somatic mutation frequencies. There are huge differences in mutational frequency, both within and across cancer types: more than three orders of magnitude from the least to the most mutated.

 

There are also significant differences in the types of mutations seen, differences that appear to reflect (in part) tumor etiology. Cervical, bladder, and head and neck cancers, for instance, share Tp*C mutations that may reflect their viral etiology. GI tumors tend to share frequent transitions at CpG dinucleotides. Etiology, in turn, also affects frequency, with highly mutagenized tumors occurring as a consequence of tobacco, hamburgers, and tanning salons.

 

Analyses such as those of Kandoth and Lawrence have focused (quite naturally) on genes coding for proteins (the exome). But most DNA is non-coding, what we used to call “Junk DNA.” Khurana and colleagues at the Sanger Institute identified some 100 potential non-coding cancer-driving genetic variants in breast, prostate, and brain tumors.

 

Khurana’s paper suggests we’ve barely scratched the surface of genomic analysis. I am reminded of Isaac Newton’s fetching phrase (I’ve quoted it in a previous blog, but it appears appropriate to the non-coding DNA story): “I was like a boy playing on the sea-shore, and diverting myself in now and then finding a smoother pebble or a prettier shell than ordinary, whilst the great ocean of truth lay all undiscovered before me.” But the important thing is that we have, in recent years, dipped our toes into that great ocean of truth.

 

A whole series of individual cancer genomes are now being rolled out for public display, and the results continue to disentangle cancer-specific biology and (perhaps) suggest new lines of attack. A recent publication by Ojesina and colleagues in Nature (2014;506:371-375) identified (as one example among several) HER2 somatic mutations (not the standard breast cancer-type amplifications) in six percent of cervical cancers.

 

Will these offer new treatment options for these patients? We will see soon, I suspect. We are seeing a profusion of low-frequency mutations in relatively rare diseases. Running the numbers, it’s hard to imagine performing Phase III trials to establish benefit for all of these. What’s a clinical trialist to do? How should the regulators deal with the tiny Phase II trials that will result? How are patients (and payers) to understand cost and benefit equations arising from those trials?

 

Some of the more fascinating studies reported in the past year explored the clonal evolution of human cancers. Sohrab Shah and colleagues in Vancouver analyzed 104 early triple-negative breast cancers. To quote their Nature article (2012;486:395-399): “TNBCs vary widely and continuously in their clonal frequencies at the time of diagnosis… At the time of diagnosis these cancers already display a widely varying clonal evolution that mirrors the variation in mutational evolution.”

 

There is a point in Jurassic Park where a seasoned big game hunter, busy pursuing a nasty-looking velociraptor, suddenly finds himself outmaneuvered by the predatory dinosaur’s concealed partner. His last words before being chewed up are “Clever girl.” That’s the way I felt after reading Shah’s paper. Clever girl: TNBC will be a long, hard slog of a disease if we remain stuck in a morass of kinases.

 

Similar to the TNBC paper, one by Landau and colleagues investigated the evolution and impact of subclonal mutations in chronic lymphocytic leukemia. This work, published last year in Cell (2013;152:714-726), was both fascinating and deeply disturbing. Measuring mutations at multiple time-points, they showed that chemotherapy-treated tumors (though not untreated CLL) underwent clonal evolution, picking up new driver mutations (and increased measurable genetic diversity) that expanded over time, and that these drivers were powerful predictors of outcome. “Never bet against Charles Darwin” is a good rule of thumb in biology, just like “Never bet against Albert Einstein” is a good physics rule.

 

First Steps Toward Clinical Utility

While most of the excitement in cancer genomics has revolved around the large-scale projects such as The Cancer Genome Atlas Project and its kindred studies, projects that in essence serve to describe the mutational landscape of human cancer(s), we are now beginning the first steps towards clinical utility in cancer genomics.

 

My institution, like many around the country, has wrestled with how to “bring genomics to the masses." The actual performance of tumor deep sequencing has gotten relatively inexpensive (relatively, that is, to those versed in the cost of an hour in an emergency room), and almost technically trivial. But running an assay is only the first step on the journey.

 

How does one analyze the data from three billion base pairs? How does one do it in a timely fashion (patients appropriately want treatment today, not eight weeks from now)? Fortunately these technical challenges are becoming easier to handle: a recent report from the University of Chicago (published in Bioinformatics) demonstrated the use of supercomputers to process 240 genomes in two days.

 

But analytic problems are more than just technical challenges. They involve real judgment calls. How does one decide what is an important driver mutation, as opposed to a multitude of “passenger” mutations? When is a mutation actionable?
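For the curious, the usual statistical idea behind the driver-versus-passenger call is to ask whether a gene is mutated more often than the background mutation rate predicts. The toy below is my own illustration of that idea, not any published tool's algorithm; real methods (MutSig and its kin) adjust for gene length, expression, replication timing, and much more:

```python
# Toy driver test: is this gene mutated more often than background chance predicts?
from scipy.stats import binomtest

def driver_p_value(n_mutations, n_patients, gene_length_bp, background_rate=3e-6):
    """One-sided binomial test against an (assumed) uniform background mutation rate."""
    opportunities = n_patients * gene_length_bp  # mutable positions across the cohort
    return binomtest(n_mutations, opportunities, background_rate,
                     alternative="greater").pvalue

# 40 mutations in a 2 kb gene across 500 patients, where background predicts ~3:
print(driver_p_value(n_mutations=40, n_patients=500, gene_length_bp=2000))  # vanishingly small
```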

 

Indeed, what does “actionable” even mean? In current genomics-speak, “actionable” appears to mean something along the lines of “We found a mutation. That mutation is associated with badness in some disease. There is a drug that does something to interfere with that badness in some way in a preclinical model or some other cancer somewhere in the scientific literature.”

 

“Actionable” is easily misunderstood by patients and referring physicians as “we have a drug that works for your tumor.” What it really means is “maybe using this drug is not a total shot in the dark.”

 

Commercial enterprises are now leaping blindly into this evidence chasm, and asking us to jump in along with them, paying their development costs. Like all lovers’ leaps, I guess you just have to take some things on faith. But some of us would prefer, before ordering a test, to experience the cool soothing balm of carefully curated data demonstrating that obtaining the test actually affects outcome. Old-fashioned, I know.

 

How to obtain that evidence is the subject of a great deal of noodling by clinical/translational researchers. The upcoming NCI-MATCH trial, a multi-thousand-patient attempt to match mutations with drugs in an integrated, cross-disease clinical trials platform, will launch later this year, to much genuine interest and excitement among both patients and physicians.

 

Most cancer genomics to date has been performed on well-curated, carefully prepared, highly cellular tumors. Can we turn genomic analysis into a blood test? We are now beginning to see publications in the “liquid biopsy” field, such as last year’s New England Journal of Medicine publication by Sarah-Jane Dawson and her Cambridge colleagues (2013;368:1199-1209).

 

To cut to the chase, we can now measure specific genomic alterations (Dawson measured PIK3CA and TP53) in human plasma, and those alterations can be used to track the course of metastatic disease with sensitivity higher than that of circulating tumor cells or protein tumor markers such as CA 15-3. This is a good beginning, though only a beginning: measuring badness is necessary but not sufficient for ameliorating badness, as the recent S0500 breast cancer circulating tumor cell trial demonstrated. This technology will really take off when we can use it to identify actionable (there I go again, but you know what I mean) mutations.

 

All in all, we’re constantly being reminded that we live in the Golden Age of cancer biology, a breathtaking rollercoaster ride filled with the excitement of discovery.  Have you ever read a book you loved so much that you hated for it to end? One where you wanted to keep on reading after the last page to find out what happened to all those fascinating characters? We’re reading that book, the book of Nature, right now. But don’t worry: we’re still only somewhere in the middle. Maybe even the first few chapters. There’s plenty left to learn.


Monday, March 03, 2014

I thought that there could not have been any better year for genomics than 2012, but the last year or so has matched it. The falling price of genomics, the development of new analytic techniques, and the inventiveness with which genomic technologies are applied have colluded in the creation of wonderful new insights. There is hardly an area of biology or medicine that hasn't benefited in some way.

Let's look at the study of human origins. We learned, just a few years ago, that most non-sub-Saharan moderns carry a touch of Neanderthal in their genome. This discovery is largely the work of a group led by Svante Pääbo at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany. Pääbo has also found evidence of other admixtures (the mysterious Denisovans who make up part of the modern genomic complement of Pacific islanders). 

 

It would have appalled our Victorian ancestors, as well as the many 20th century racial ideologues wedded to a belief in the purity of their particular ethnic group, to learn of this unwelcome ancestry. But I find it a charming thought.

 

If 1.5% or so of our genome comes from Neanderthal ancestors, the interesting question is "which 1.5%?” Is it a random set of genes, or, as Darwinian theory might predict, a set that increased our biologic fitness? A first-pass look at our Neanderthal inheritance suggests the latter: definitely non-random, definitely selected for (or against). Two recent publications (in Science and Nature) suggest that there was positive selection for genes involved in skin and hair -- particularly the type II cluster of keratin genes on 12q13 -- suggesting that our “Out of Africa” Homo sapiens ancestors benefitted from Neanderthal cold climate genes. It turns out that when we say that someone is “thick-skinned” we are implying Neanderthal ancestry.

 

At the same time, there was significant negative selection for X-chromosome genes (a five-fold reduction) and testes-specific genes, as well as reduction of Neanderthal DNA in a region of the genome containing a gene thought to play an important role in human speech and language. The authors suggest that human-Neanderthal hybrids had reduced fertility, resulting in elimination of the X chromosome genes causing the infertility.

 

Well, OK, but so what? Why should anyone really care, other than in the “gee whiz” way in which scientific curiosities of no particular current importance appeal to intelligent onlookers? Because, once again, William Faulkner was right: the past isn’t dead. It isn’t even past. The researchers identified nine variant genes associated with specific disease-linked traits, including (to quote the Nature paper) “alleles of Neanderthal origin that affect lupus, biliary cirrhosis, Crohn’s disease, optic-disk size, smoking behaviour, IL-18 levels and type 2 Diabetes.”

 

Many of my breast cancer patients dread the idea that they have passed on a mutation to their children that will doom them at some future point. The sense of guilt can be overwhelming: the fear of a dark cloud hanging over those they hold most dear, aligned with the certainty that they were the cause of their children’s future grief. I tell them that it is certainly not their fault, any more than it was their parents’ or their parents’ parents’. And the good things they have given their children -- life, love, home, and family -- certainly far outweigh the bad. But this view of the things they pass on rarely seems to assuage the guilt, or the sadness that attends it.

 

BRCA mutations may date back a millennium or two. But to extend the disease-causing mutation story back several tens of thousands of years (the Neanderthals became extinct some 30,000-plus years ago, and the interbreeding is thought to have occurred 40,000-80,000 years ago) is just astonishing.

 

Try explaining to a lung cancer patient that he couldn’t quit smoking because of a gene he inherited from an extinct species: doomed by your inner cave man, if you will. Or, perhaps, some weird cosmic revenge for wiping out the Neanderthals?

 

Human history is rife with examples of one population violating the space of another, but we rarely think of these interactions in genomic terms. A new analytic technique called Globetrotter changes that. Produced by researchers at Oxford University, and published recently in Science, Globetrotter uses modern human genomes to identify ancient population migration patterns. The authors call these “admixture events,” a polite term for what was often a very messy, very ugly history. You can explore these at the Globetrotter website: http://admixturemap.paintmychromosomes.com

 

The authors reasoned that if someone from population A hooked up with someone from population B, then their offspring would share the genetic history of both parents. And then, over time, genetic recombination would occur, allowing one to clock how long ago the genomic mash-up happened. A fairly simple idea, but an enormous amount of thought and work went into making Globetrotter a reality.
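A toy version of that clock, using the textbook approximation that, g generations after admixture, donor-ancestry segments are roughly exponential in length with mean 1/g Morgans. Globetrotter's actual model is far more sophisticated, and the segment data below are invented:

```python
# Toy recombination clock: date an admixture event from ancestry-segment lengths.
def generations_since_admixture(segment_lengths_cm):
    """Mean segment length of ~1/g Morgans implies ~g generations since admixture."""
    mean_morgans = sum(segment_lengths_cm) / len(segment_lengths_cm) / 100.0
    return 1.0 / mean_morgans

segments = [2.1, 4.8, 3.3, 5.0, 2.9, 3.1]   # invented segments, in centiMorgans
g = generations_since_admixture(segments)
print(f"~{g:.0f} generations, roughly {g * 28:.0f} years ago")  # ~28 generations, ~790 years
```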

 

A decade ago studies of the Y chromosome suggested that a relatively large percentage of selected Asian populations descend from Genghis Khan. Globetrotter confirmed this analysis, looking, not at the Y chromosome, but at approximately a half-million SNPs throughout the genome in some 1500 individuals from around the world. The Mongol Horde was not some group of peace-loving picnickers camping out on the Silk Road: Genghis and his kids were very, very bad people, and we have the genetic evidence to prove it.

 

Globetrotter also defined several other admixture events. To quote the paper: “We identified events whose dates and participants suggest they describe genetic impacts of the Mongol empire, Arab slave trade, Bantu expansion, first millennium CE migrations in Eastern Europe, and European colonialism, as well as unrecorded events, revealing admixture to be an almost universal force shaping human populations.” Maybe those unrecorded events were peaceful ones, but the ones we know about were not. Universal force, indeed.

 

These are the first revelations from Globetrotter, and no doubt others will emerge. It seems pretty clear that much of prehistory (and a fair amount of history) may need to be re-written in the next few years. Could anyone have predicted this synchronicity of history and biology a couple of decades ago? Will any legitimate historian, going forward, be able to ignore science? And, as the Neanderthal data suggests, will we MD’s be able to ignore the lessons of anthropology?

 

There were a few other recent genome-based history lessons worth mentioning. In 2012 one of the coolest stories involved the discovery of Richard III's bones under a parking lot. I wrote a blog post about it at the time, and the Shakespeare lover in me still finds this ineffably awesome. Now comes the suggestion that the bones of King Alfred the Great, or perhaps his son, have also been found. The evidence here is less compelling (unlike the Richard III story, there are no known modern descendants to compare genomes with) but still intriguing.

 

I note in passing that the Sledge family’s ancestral bones remain to be discovered. I’m not holding my breath. But the news from a 12,600-year-old burial site in Wilsall, Montana gives me some hope. Looking at the genome derived from the only human burial associated with the Clovis culture (makers of the beautiful Clovis point arrowheads), the investigators (a University of Copenhagen group publishing in a recent issue of Nature) demonstrated that some 80% of all present-day Native Americans are direct descendants of the individual’s closest relatives. Though not, alas, of the boy buried in Wilsall: he died shortly after his first birthday.

 

There’s always something wonderful about these genome-based historical studies. You feel connected, knowing that men and women walking by you on the street today carry genes from the family of a child buried over 12,000 years ago in Montana, or from ancestors who met each other when Homo sapiens met Homo neanderthalensis somewhere in the Levant. And that someday we too will be links in the great chain, passing on (if we are lucky in the genetic game of chance) our own genes to some unimaginable future.

 

In my next post I’ll write about what we’ve learned about cancer genomics in the last year. It may not be quite as much fun as Neanderthals, Globetrotter, and Clovis culture genomics, but it is consequential and interesting. Our own cells pass on things as well: individual microevolution as opposed to global macroevolution.


Tuesday, February 18, 2014

My favorite movie of 2013 will not win an Oscar. Entitled “A Boy and His Atom,” it is directed and produced by IBM scientists, and is a totally charming short celebrating nanotechnology. You actually view individual atoms being moved around. Not big on plot, but delightful nevertheless, it can be seen on YouTube at: http://www.youtube.com/watch?v=oSCX78-8-q0

 

Nanotechnology is one of those unfulfilled promises, made years ago, that is just so compelling that we keep giving it second (and third, and fourth, and fifth) chances. It has to come true at some point. Just what that point is, and just how it will manifest itself, is uncertain, but virtually all agree that something will come of it.

 

What do we even mean by nanotechnology? A few years ago albumin-bound paclitaxel was introduced to the world as the first nanotechnology drug. If slapping some egg white on Taxol is nanotechnology, then I’m the King of Siam: really quite ridiculous, but “Abraxane is nano” has had an astonishingly long run. “Doxil is nano” as well: fat droplets = nanotechnology. I’m not convinced that putting a drug in a different wrapper really qualifies, but what do I know?

 

Part of the problem is definitional. The Google definition (“the branch of technology that deals with dimensions and tolerances of less than 100 nanometers, esp. the manipulation of individual atoms and molecules.”) is fine, but basically only tells you that really small stuff is nanotech. What does “manipulation” mean? My fingers aren’t 100 nanometers.

 

I think of nanotech in the way that Nobel laureate Richard Feynman did. His foundational talk, titled “There’s Plenty of Room at the Bottom,” was presented in 1959, and still delights. In it he speaks of “a very wild idea, it would be interesting in surgery if you could swallow the surgeon. You put the mechanical surgeon inside the blood vessel and it goes into the heart and ‘looks’ around. (Of course the information has to be fed out.) It finds out which valve is the faulty one and takes a little knife and slices it out. Other small machines might be permanently incorporated in the body to assist some inadequately-functioning organ.”

 

Feynman went even further: “But I am not afraid to consider the final question as to whether, ultimately – in the great future – we can arrange the atoms the way we want; the very atoms, all the way down!” The great future arrived well within the lifetime of some in his audience, witness the escalating speed of technology. Witness “A Boy and His Atom.”

 

That’s what I think of when I think of nanotechnology: really little machines. “Swallowing the surgeon” eventually became the concept of nanobots injected to fix molecular defects. The futurist Ray Kurzweil speaks of an era (he predicts 2030) when we are loaded with billions of these tiny machines, which communicate with the Cloud, and are endowed with what he calls “a measure of intelligence.” They will put medical oncologists (and many other specialists) out of business through preventative maintenance. In theory, I suppose, I may live long enough to benefit from these nanobots while still drawing on my pre-nanobot medical pension. Sweet!

 

Kurzweil takes something like 150 nutritional supplements per day in hopes of delaying his death long enough for the nanos to render him immortal, which suggests he either knows a great deal more or a great deal less than I do about nutritional supplements. He recently went to work at Google, which is beginning to get into the immortality biz. You can see Kurzweil interviewed by the Wall Street Journal here: http://blogs.wsj.com/cio/2014/02/04/googles-ray-kurzweil-envisions-new-era-of-search/

 

The best way to know that something is real in medicine, rather than some pipe dream, is when there’s a medical journal devoted to it. And, of course there is: it’s called Cancer Nanotechnology. I went and looked at its table of contents. The first article I espied was “Pharmacokinetics and biodistribution of negatively charged pectin nanoparticles encapsulating paclitaxel.” You know pectin: the agent you add to jam or jelly to help them thicken. Pectin. Really? Nanotechnology? Maybe I am the King of Siam.

After medical journals, your next best bet in seeing whether something is real is to measure government response. The NCI’s Nanotechnology Characterization Laboratory performs nanomaterial safety and toxicity testing in vitro and in vivo. To date it has evaluated over 250 potential nanomedicines, several of which are making their way to the clinic.

 

One likes to think of nanotech as essentially benign. But the long-term safety of nanomedicines is one of the great imponderables: there just haven’t been any studies. When I was an intern, a cardiology fellow shared some folk wisdom with my team: “If you haven’t had any complications, you haven’t done enough procedures.” True then, true now. My bet is that we’ll discover some new side effects.

 

And it’s not just nanotech side effects at the individual level that technologists worry about. Consider the “Gray Goo” scenario much beloved by nanotech Cassandras and science fiction authors. In the Gray Goo scenario, originally described by the nanotechnologist Eric Drexler, self-replicating von Neumann machines consume all of the matter on Earth while dividing uncontrollably. All that’s left of us and everything else is a large ball of Gray Goo. Yuck.

 

Realistic? Maybe yes, maybe no. If it is, get me out of here: some idiot dictator will release it by intent, or some lab tech by mistake. The Feds certainly take it seriously: the Department of Defense had a $426.1 million nanotechnology budget in 2012. The nanotech budget was less in 2013, but that may reflect sequestration.

 

The National Nanotechnology Initiative website says the “DOD considers nanotechnology to have potential to contribute to the warfighting capabilities of the nation… The DOD also invests in nanotechnology for advanced energetic materials, photocatalytic coatings, active microelectronic devices, and a wide array of other promising technologies.” I don’t worry about photocatalytic coatings (pectin, anyone? Egg white on your drone?). But “active microelectronic devices” and “other promising technologies” are probably what we need to look out for.

 

Back to medicine. One of the cooler new technologies, just firing up but already attracting attention, is ADARs (adenosine deaminases acting on RNA). ADARs allow one to edit RNA transcripts, correcting mutations at the transcript level. A recent paper (Montiel-Gonzalez et al. PNAS 2013;110:18285-90) showed the potential of this technology to correct a mutation causing cystic fibrosis. It’s early days yet (it works in cell lines, not yet at the organism level), but a star-struck molecular biologist writing in the New England Journal of Medicine referred to the approach as “an RNA-repairing nanomachine within the limits of current technology.”
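Conceptually, the trick is simple: ADAR deaminates a targeted adenosine to inosine, which the ribosome reads as guanosine, so a disease-causing premature stop codon such as UAG can be restored to tryptophan's UGG. A toy sketch of that logic, not the paper's actual guide-RNA machinery:

```python
# Toy ADAR edit: A-to-I(G) conversion turning a premature stop codon back into tryptophan.
CODONS = {"UGG": "Trp", "UAG": "STOP"}  # just the two codons this sketch needs

def adar_edit(transcript, position):
    """Deaminate the adenosine at `position` (0-based); inosine is read as G."""
    assert transcript[position] == "A", "ADAR edits adenosines only"
    return transcript[:position] + "G" + transcript[position + 1:]

mutant = "UAG"                    # nonsense mutation of a tryptophan codon
repaired = adar_edit(mutant, 1)   # edit the offending A
print(CODONS[mutant], "->", CODONS[repaired])  # STOP -> Trp
```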

 

So maybe we’re not all that far off from the day when nanotech will correct all our inherited defects. I certainly have my share, and I’ve accumulated a few others along the way. Ultimately, of course, what constitutes an inherited defect will represent a novel problem of definition. Is your nose too small? There’s an ADAR for that!  (Note to my readers: nose = a family-friendly euphemism). I see a thriving industry targeting professional athletes and Hollywood actors, and soon all the rest of us. We’ll all look like Brad Pitt and Angelina Jolie, all run like Jesse Owens, and all live forever.

 

Yeah, right. And egg white and pectin are nanotech.

About the Author

George W. Sledge, Jr., MD
GEORGE W. SLEDGE, JR., MD, is Chief of Oncology at Stanford University. His OT writing was recognized with an APEX Award for Publication Excellence in the category of “Regular Departments & Columns.”