Musings of a Cancer Doctor
Wide-ranging views and perspective from George W. Sledge, Jr., MD
Friday, September 19, 2014
Recently I was dining with friends, enjoying a pleasant Palo Alto evening on their porch. We noticed a family of quail walking on the ledge of the wooden fence that enclosed the yard, a mother and her brood of children. They hopped down into the yard. A peregrine falcon was sitting atop a tall tree, off in the distance. To me it was a thin smudge, featureless and motionless, perhaps a football field away. I suspect the falcon saw every minor blemish on my face. A falcon's vision is considerably better than that of an aging medical oncologist: they can see prey three kilometers away.
At some intellectual level I was aware that falcons were predators, capable of swift, violent action. My hostess even mentioned that she hoped the falcon had not seen the covey of quail. But we gave the falcon little thought until it fell from the sky, talons out, into the gathering of quail: a large, close blur, shocking in its suddenness, striking just feet from where we sat. I could have sworn I had seen it on that distant tree just seconds before, and that was in fact probably the case: falcons have been clocked at speeds as high as 242 miles per hour during their hunting stoop. They are, as I now know, the fastest animals on the planet.
One of the memes running through oncology right now is that of "cancer as a chronic disease." If you enter it as a search term in Google you get 54,100 hits. The idea taking hold is that we are entering a time when the average patient's cancer will be held in check through the judicious use of systemic targeted agents. Years will pass, the patient in generally good health.
I must admit that the phrase "chronic disease" always gives me pause when I hear it in relation to cancer. Historically, physicians separated acute illnesses such as pneumonia from chronic illnesses such as rheumatoid arthritis or diabetes. Prior to the advent of antibiotic therapies in the 1940s it was the acute illnesses that tended to kill; afterwards there was a progressive shift to chronic disease as a cause of death.
The standard definition of a "chronic disease" is “a long-lasting condition that can be controlled but not cured.” Most human solid tumors meet this definition, and are recognized as such by the CDC.
But I think that when we talk about "cancer as a chronic disease" we are usually thinking of something else: basically, the cancer goes into the stasis field of modern medicine and does not emerge to kill you. My patients certainly think in those terms.
There are, of course, some real candidates for what I might consider true chronic disease status, the oncoequivalent of "take your insulin and it won't kill you": chronic myelogenous leukemia, controlled with imatinib, or an ER-positive breast cancer held in check with an aromatase inhibitor for prolonged periods. Or, to a somewhat lesser degree, metastatic HER2-positive breast cancer, or metastatic colorectal cancer, or renal carcinoma, all diseases where survival has significantly improved through the application of targeted therapies.
Scientifically this rapidly brings us to the question of treatment duration. Here we are trapped in empiricism. Mother Nature selected a year of HER2-targeted adjuvant therapy with trastuzumab as optimum. Not six months, not two years, but the time it takes the earth to circle the sun. Or maybe not: adding a year of adjuvant neratinib to your year of trastuzumab may improve disease-free survival. Three years of adjuvant imatinib is better than one year for GIST tumors post-surgery. How about five?
For early stage ER-positive breast cancer, ten years of tamoxifen is better than five years is better than three years is better than one year: every time we have studied duration, longer is better. Though, of interest, the disease-free survival curves never quite plateau: are we just delaying the inevitable?
And if so, is it really a chronic disease? If you stop treating rheumatoid arthritis and your joints get hot again, you may be miserable but you probably will not die, and have a good chance of going back into remission with the same old drug: joints don't mutate. If you have metastatic breast cancer you are walking around under the cloud of a death sentence, any temporary stay of execution provided by fulvestrant or T-DM1 notwithstanding. Part of my problem with "cancer as a chronic disease" is exactly this: I know few individuals who would trade RA for metastatic cancer. Equating them seems somehow disingenuous.
I keep thinking of that peregrine falcon and the quail family. I have had a number of patients whom I have shepherded along for years, occasionally decades, with either estrogen receptor-targeted or HER2-targeted treatments. It is easy to contract and propagate the "cancer as a chronic disease" meme with such patients. The clinic visits become celebrations of the doctor's therapeutic virtuosity and the patient's family anniversaries, with drugs being switched in and out every now and then when the CAT scan demonstrates modest progression.
You can almost fool yourself into believing that the patient doesn't have a lethal disease. And then the falcon swoops in. “Chronic disease” can turn into “acute and lethal” in a shockingly brief period. One clinic visit you are discussing an upcoming high school graduation, and then the next you are having an end of life discussion.
I do not know why it is--perhaps because the patients have transitioned from being patients to old friends--but the death of a patient whom I have treated for a decade usually hits me harder than one who dies quickly. There is no intellectual reason for this: the death of a long-term survivor suggests I was actually doing my job, and I should be delighted that I have added many years of high quality life rather than just a few. But it always leaves a bitter taste in my mouth. I doubt it is an experience a rheumatologist deals with very often.
Like some old soldier looking over the parapets at a distant enemy, I have learned to respect my foe's endless ingenuity, its treachery, its patient evil, and its almost maniacal drive to escape the barriers I try and throw up around it. We have, again and again, seen the ultimate failure of the kinase-based approaches that have dominated the last decade, the consequence of the smart tumor’s ability to mutate and evade.
The real "chronic disease" possibilities may lie within the realm of immune checkpoint inhibitors, where some (though certainly not all) melanomas appear to go into stasis for years and years, the body's immune system unleashed to keep the wolves at bay. The idea of metastatic melanoma as a chronic disease still boggles the imagination, but it certainly looks to be a real possibility for many patients. Whether I want my T regulatory system ramped up for the next decade or two remains to be seen. I suspect we will be keeping the immunologists and rheumatologists very busy: more chronic disease.
Are there any other approaches that might turn human cancer into a true chronic disease?
I present for your edification the naked mole rat. These are probably the ugliest mammals on the planet, almost disgustingly so. They are also the longest-lived rodents we have, clocking in at 30 years, a good life if living in total darkness underground in what one leading student of the species has described as a “dictatorship” is your cup of tea.
But here is the really interesting thing: they never die of cancer. I don't mean rarely, I mean never.
It doesn't matter how hard you try, either. Pump them full of carcinogens, and they just go on their merry, repulsive way, burrowing through the earth as if nothing had ever happened. And why is this? It turns out that the naked mole rat produces generous amounts of a high molecular weight form of hyaluronic acid, an evolutionary adaptation that changes their skin to something quite stretchy and allows them to easily traverse underground passages. This form of hyaluronic acid has the pleasant side effect of walling off individual cancer cells before they can gain mass or metastasize. The naked mole rat buries the cancer cell in concrete. This is a spectacular example of a microenvironmental approach to cancer.
Cancer cells live in neighborhoods, and for the past decade or so we have been enlisting the neighbors (blood vessels, T cells) in the neighborhood watch. These microenvironmental approaches have started to take off, and I suspect (based partly, but not entirely, on the naked mole rat story) that we are just at the beginning of such interventions. Hopefully we will not end up looking like naked mole rats.
Naked mole rats also have exceptional transcriptional fidelity. This error-free life also protects them from cancer. So taken are laboratory researchers with the naked mole rat that Science magazine declared it the “Vertebrate of the Year” in 2013, and there has been an explosion of scientific articles plumbing the rodent’s exceptional longevity and freedom from cancer. And, of course, a naked mole rat genome project.
But back to the falcon and the quail. For a brief second after the falcon struck, we were unsure of the outcome. We couldn't see whether the falcon had succeeded in snatching one of the covey's young. Then there was an explosion of quail, noisy, vectored in multiple contradictory directions, their panic reigning supreme. And then, much more quietly than the quail and more slowly than he had arrived, the falcon took off, talons empty: not today. Not today. Not today!
Sunday, August 24, 2014
The crickets in my back yard have been particularly noisy these last few days, louder than they have been all summer, so loud they have kept me awake with their chirping. But I enjoy listening to them. Sometimes I just hear one, sometimes a veritable orchestra, or at least a robust string section. It is, for reasons I cannot explain, a deeply comforting sound.
I have had warm feelings towards them since my youth. I grew up as part of a generation raised on Walt Disney's Pinocchio, where a charming Jiminy Cricket served as (somewhat ineffective) conscience for the long-nosed wooden boy. Hollywood anthropomorphism favored crickets, and why not? Unlike mosquitoes and lice, these insects never mean us any harm.
As a child I was fascinated with their amazing ability to tell me the temperature. Maybe you learned this as I did. Count the chirps in 15 seconds, add 40, and you have the temperature in degrees Fahrenheit. This observation was formulated as Dolbear's Law in 1897, and it still works, more or less. More or less, because there are over 900 cricket species worldwide, and they do not all chirp, and do not chirp at the same rate for a given temperature. So Dolbear’s law is not exactly a law of nature.
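The childhood rule and Dolbear's original 1897 formula turn out to be the same arithmetic in disguise, as a quick sketch shows (function names are mine, for illustration only):

```python
# Dolbear's law (1897), stated for the snowy tree cricket in chirps
# per minute: T(degrees F) = 50 + (N - 40) / 4.
def dolbear_f(chirps_per_minute: float) -> float:
    return 50 + (chirps_per_minute - 40) / 4

# The childhood rule described above: chirps counted over 15 seconds,
# plus 40. (Approximate at best: species and temperature range vary.)
def rule_of_thumb_f(chirps_in_15s: float) -> float:
    return chirps_in_15s + 40

# 120 chirps per minute is 30 chirps per 15 seconds; both give 70 F.
# Algebraically, 50 + (N - 40)/4 = 40 + N/4, which is exactly the
# 15-second count plus 40.
print(dolbear_f(120), rule_of_thumb_f(30))
```

So the six-year-old's version was not a simplification of Dolbear's law at all; it was Dolbear's law, rearranged.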
I was taught that these living thermometers rubbed their hind legs together to create the chirps, and believed it for decades, but it just isn't so: the crickets have something called a stridulatory organ, a large vein running along the bottom of each wing. In fact, the scientific name for chirping is stridulation; I’ll stick with chirping. Crickets run the top of one wing along the bottom of the other to create the sound. And, in case you are wondering, crickets have a tympanic membrane to hear the chirps--though oddly enough it is located just below the knee.
Why crickets chirp is another matter: largely this is mating behavior, the male of the species announcing himself to potential mates. Entomologists distinguish four separate chirping behaviors, including a calling song that attracts females and repels other males, a quiet courting song used when a female is near, an aggressive song triggered by the near presence of other males, and a brief copulatory song produced after a successful mating. You can’t make this stuff up.
The timekeeping aspect of the chirping has nothing to do with its underlying reproductive purposes. Crickets are, like all insects, cold-blooded, and their chirpings heat up along with their bodies. The thermometer is coincidence, an artifact of physicochemical design, albeit a happy one for a six-year-old boy on a warm summer’s night in a field in Wisconsin.
Nature is full of living thermometers. Measuring temperature must be something basic to all living organisms, for even lowly bacteria are capable of it: they contain temperature-sensing RNA sequences, known as RNA Thermometers, or RNATs, in their mRNAs. RNATs control virulence, heat shock, and cold shock genes in E. coli.
RNATs come in two simple but clever designs: a zipper-like structure that gradually melts as temperature increases, and a switch mode, in which two mutually exclusive structures depend on ambient temperature. RNATs are such a clever natural design that they are now being co-opted by biotechnologists.
Sometimes these internal thermometers can have seemingly bizarre purposes. Red-eared slider turtles (and many other reptiles) lack sex chromosomes. Turtle gender is determined by the temperature at which their eggs are incubated, with colder temperatures producing males and warmer temperatures females. The cut point is right around 29°C.
How this occurs has been partially elucidated in recent years. Aromatase (which, as all good medical oncologists know, converts androgens to estrogens) is under epigenetic control, so that, in the words of a recent PLOS paper, “female-producing temperature allows demethylation at the specific CpG sites of the promoter region which leads the temperature-specific expression of aromatase during gonad development.” And, in case the breast cancer docs are wondering, you can change the turtle’s gender by exposing its egg to letrozole. And you thought it was just a breast cancer drug.
But mammals are the ultimate living thermometers. If you are a warm-blooded animal whose edge over crocodiles and snakes involves continuous thermoregulation (and we live within a very narrow temperature range), then you need to have some means of measuring your temperature with precision.
And we are quite good at it, we humans. The skin at the base of our thumbs can perceive temperature differences of 0.02–0.07°C. I find this little short of amazing: our fingers are thermometers. The explanation for this impressive ability is an evolutionary masterpiece. Temperature-sensitive transient receptor potential channels (or thermoTRPs) are a family of ion channels, activated at different temperature thresholds, each exquisitely sensitive to a particular temperature range. One of them, TRPV1, is also activated by capsaicin, which is why those red-hot chili peppers make your throat feel like it is burning up.
We rarely think of this internal thermometer, perhaps because it is hidden in the background, unlike the more showy, in-your-face senses of sight and hearing and taste and smell. It gets lumped in with "sense of touch" and promptly forgotten. And, since the invention of thermometers in the 17th century, we have rarely felt the need to rely on it, or even recognize its existence.
We are the only species on the planet to have an external thermometer--two if you count our cricket biothermometers. But the old thermometer is still there, sitting quietly in the back.
This temperature sense is hard-wired into our psyche, as our language (indeed, every human language) commemorates. We speak of "an icy stare" or say that a relationship is "heating up" or that someone has a "burning desire" or of a lawman being "in hot pursuit" of a criminal, only to discover that "the trail has gone cold." We live by metaphors, and temperature metaphors are exceptionally common.
It goes even deeper than language. Psychological studies have shown that the simple act of handing someone a warm cup increases interpersonal warmth, and that someone excluded from a conversation will judge a room to be cooler than one who is included. Our subconscious is deeply invested in temperature, and it is wired into our internal thermometer.
The oncologist in me always wonders whether such things affect cancers. Temperature dysregulation is, of course, common in cancers such as Hodgkin’s disease, and a rare complication of treatment for Hodgkin’s is prolonged hypothermia (lasting up to 10 days) following chemotherapy administration. I’m sure there is some interesting biology there, perhaps involving thermoTRPs, but at the end of the day it probably doesn’t matter all that much.
Thermoregulation has not been a huge therapeutic player in the cancer field, despite half-hearted attempts at cryosurgery and hyperthermia. Are cancers essentially cold-hearted, or do they burn with the desire to harm? Or both at the same time? Metaphor only carries you so far, and attributing emotions to lethal Darwinian machines is pointless. But when I lay my hand on an inflammatory breast cancer I am always impressed, sometimes even shocked, by its malignant heat, its angry redness. Perhaps ion channels are irrelevant to their darker purpose, or perhaps like so many other things we haven't looked closely enough.
As it turns out, there is a growing literature on capsaicin as an anti-cancer agent: the active agent in chili peppers induces necrotic cell death in cancer cells, and does so in direct relation to TRPV1 expression levels. Is that cool, or what? Or a hot research area? Whichever.
So I sit on my back porch pondering all this as the evening proceeds, the temperature gradually falling, the symphony of chirps slowing bit by bit, and eventually dying out. Perhaps the crickets have found their special ones tonight. Is that a celebratory chirp I hear?
Thursday, July 31, 2014
Recently, in preparation for a lecture I gave on writing scientific papers, I had the opportunity to re-read Watson and Crick’s original 1953 Nature paper, “Molecular Structure of Nucleic Acids: A Structure for Deoxyribose Nucleic Acid.” It is not just the most important scientific paper in biology written in the last century, it is also one of the most beautifully written: concise, clear, jargon-free, and with the greatest punch line for any paper in the scientific literature: “It has not escaped our notice that the specific pairing we have postulated immediately suggests a possible copying mechanism for the genetic material.” Try and top that for a discussion section.
I downloaded the paper from the Nature website, which required me to go to the actual April 25, 1953 issue. My eyes wandered for a few seconds over the table of contents for the issue, and suddenly I had a thought: I wonder what it was like to publish a paper in that issue of Nature if your name wasn’t Watson or Crick? To be remembered, forever in the annals of science, as one of the also-rans.
There are many equivalents, both inside and outside science: giving a talk at the Linnaean Society in 1858 in the same session where Darwin and Wallace’s first papers on evolution were presented. Or, perhaps, being the back-up act on the 1964 Ed Sullivan Show for the Beatles. A wonderful trivia question, by the way: Las Vegas entertainers Allen & Rossi, and the impressionist Frank Gorshin, in case you want to wow someone at a dinner party.
6 Articles, 23 Letters
So who were the also-rans? There were six articles and 23 letters in the issue. Of the six articles, three were about the structure of DNA. In addition to the Watson and Crick article, there was Maurice Wilkins’ “Molecular Structure of Nucleic Acids: Molecular Structure of Deoxypentose Nucleic Acids” and Rosalind Franklin’s “Molecular Configuration in Sodium Thymonucleate.”
Franklin’s paper includes the beautiful x-ray crystallography that has been a commonplace of biology texts ever since, a ghostly X of DNA.
The Wilkins and Franklin papers represented important support to the Watson-Crick modeling paper, which is why the two Cambridge researchers had encouraged their King’s College colleagues to submit papers at the same time.
The interaction of Watson and Crick with Wilkins and Franklin (both of King’s College, London), and for that matter the fraught relationship of Wilkins and Franklin, are part of the lore of molecular biology. Watson and Crick ran no experiments, relying instead on the work of others, most prominently Rosalind Franklin’s X-ray diffraction studies of DNA, which they correctly interpreted as evidence in favor of a double helix. It was a crucial piece of experimental data, and one that both Wilkins and Franklin had misinterpreted. Watson’s classic memoir, The Double Helix, describes the interactions in picturesque detail, and his description of Franklin in particular struck many then and later as both petty and sexist.
The Nobel Prize can be given out to a maximum of three investigators, and in 1962 Wilkins was investigator number three. Franklin had, in the intervening years, developed ovarian cancer, and died at the age of 37, some four years before the prize was awarded. She spent some of her final months with Crick and his wife, receiving chemotherapy. The oncologist in me has always wondered whether the Jewish Franklin carried an Ashkenazi BRCA mutation.
Would she have been awardee number three had she lived? This is an unanswerable question, but her premature death doomed her to permanent also-ran status, making her a feminist icon in the process. Could she have come up with the double helix structure on her own, given sufficient time? Perhaps. But her Nature article concludes with a sad, might-have-been-but-wasn’t, admission: “Thus our general ideas are not inconsistent with the model proposed by Watson and Crick in the preceding communication.” Not inconsistent with: a grudging admission that she had failed to see what her own data supported.
Editors of scientific journals, and contributors to those journals, scrutinize the impact factor. The impact factor is quite simple: add up the total number of citations for a journal's articles in the first two years after publication, and divide by the number of citable items (usually articles and reviews). The best journals (Science, Nature, Lancet, New England Journal of Medicine) usually have the highest impact factors. The vast remainder are decidedly mediocre: a sea of journals with an impact factor of 2 or thereabouts. Most published science is unmemorable, rapidly forgotten even by the authors. Citations do not have a Gaussian distribution; instead they follow a power law distribution, with a few 900-pound gorillas followed by a long tail. And seldom has the power law been as severe as in that issue of Nature.
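The arithmetic really is that simple. A minimal sketch, using entirely invented numbers (no real journal is being scored here):

```python
def impact_factor(citations_to_prior_two_years: int,
                  citable_items_in_prior_two_years: int) -> float:
    """Impact factor as described above: citations received by a
    journal's articles from the previous two years, divided by the
    number of citable items it published in those two years."""
    return citations_to_prior_two_years / citable_items_in_prior_two_years

# A hypothetical elite journal: 1,200 citable items over two years
# drawing 60,000 citations gives an impact factor of 50.
print(impact_factor(60_000, 1_200))   # 50.0

# A hypothetical journal from the mediocre sea: 400 items, 800 citations.
print(impact_factor(800, 400))        # 2.0
```

The division hides the power law, of course: in both hypothetical journals, a handful of gorilla papers likely account for most of the numerator.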
Impact factor is a way of keeping score, though not without its own issues. One problem with it is that it cannot measure the long-term impact of a paper. With 60-plus years of follow-up, however, we can look at the long-term impact of the work published in that April 25, 1953 issue of Nature. Watson and Crick are the huge winners, of course: Google Scholar says their paper has been cited 9866 times, and we hardly need the citation numbers to realize the revolutionary importance of that paper. The Wilkins and Franklin papers clock in at 618 and 833 citations, marking them as important contributions. But what of the others?
Let’s begin with the other three articles. They are, in order of presentation, “Refractometry of Living Cells”; “Microsomal Particles of Normal Cow's Milk”; and “Measurement of Wind Currents in the Sea by the Method of Towed Electrodes.” None is particularly remembered today, I think it is safe to say. Google Scholar reports the three articles as having had, respectively, 172, 41, and 15 citations. OK citation numbers, even today, but definitely in the also-ran category. Interestingly, the refractometry paper was cited by a 1991 Science paper on optical coherence tomography that itself has been cited 8400 times.
The letters range from 0 to 235 citations, according to Google Scholar. The most recognized is W.R. Davis’s “An Unusual Form of Carbon”, which has been cited 235 times, most recently in 2010. The story here is interesting. Davis and his colleagues worked at the British Ceramic Research Association in Stoke-on-Trent in England, where they studied the carbon deposited on the brickwork of blast furnaces. They identified what they described as tiny “carbon vermicules,” some as small as 100 angstroms. In retrospect (hence the 235 citations) they were early discoverers of carbon nanotubes, members of the fullerene structural family, now actively studied for their fascinating physicochemical properties. Three researchers received Nobel prizes in 1996 for fullerene studies, so one can feel for Davis and his fellow Stoke-on-Trent ceramicists. They were dual also-rans, publishing in the same issue as Watson and Crick, and coming along a few decades too early in the as-yet-unnamed fullerene field.
Of the other papers published as letters, what is there to say? I love the title of K.A. Alim’s “Longevity and Production in the Buffalo,” though R. Hadek’s “Mucin Secretion in the Ewe's Oviduct” runs a close second for my affections.
But it is easy to understand why such articles are poorly cited and long forgotten, given the relative obscurity of the topics. One of the keys to success in science is choosing the right problem to work on, and mucin secretion in the ewe’s oviduct probably cannot compete with decoding the key to life on earth.
Least Cited, But...
The least cited paper (I could not find any citations using Google Scholar) is my favorite: Monica Taylor’s “Murex From the Red Sea”. Taylor wrote from Notre Dame College in Glasgow, where she curated a natural history collection. Sir John Graham Kerr had collected some 300 Murex tribulus shells from the Great Bitter Lake in Egypt. Murex was what would now be called an alien invasive species, introduced following the completion of the Suez Canal, and Sir John (would a 2014 letter to Nature ever refer to a Sir John?) and Monica wondered whether the species had altered its presentation, “relieved of the shackles of environment.” The letter was a plea for specimens from the Red Sea so that the comparison might be made. Wikipedia informs me that Murex tribulus is a large predatory sea snail, with a distribution from the Central Indian Ocean to the Western Pacific Ocean.
Did Monica Taylor ever get her Red Sea specimens? Life is full of small mysteries. Taylor herself is a fascinating soul. Born in 1877, the daughter of a science teacher and the cousin of Sir Hugh Taylor, one-time Dean of the Princeton Graduate School, she trained as a teacher prior to becoming a nun. So Monica Taylor was actually Sister Monica, and Sister Monica dearly wanted to become a scientist. But the road was not smooth for a woman, let alone a nun, wishing to be a scientist in the early twentieth century. She was refused permission to attend the University of Glasgow, and was unable to complete an external degree from the University of London due to Notre Dame’s inadequate laboratory facilities.
After several thwarted attempts, according to the University of Glasgow website, “She was eventually granted permission to do laboratory work in the Zoology Department of the University of Glasgow, provided she did not attend lectures and was chaperoned by another Sister at all times.” There she impressed Professor Graham Kerr, who encouraged her to pursue an advanced degree, and obtained permission for her to attend classes. After receiving a DSc from the University of Glasgow in 1917, she headed the Science Department at Notre Dame College until her retirement in 1946, all the while conducting significant research in amoebic zoology.
In 1953, the year of her Murex letter, she was awarded an honorary doctorate from the University of Glasgow for being "a protozoologist of international distinction." She died in 1968, six years after Watson and Crick got their Nobel prizes. No citations, and no Nobel, but perhaps you will remember this also-ran, a woman of courage and fortitude.
Most of us are also-rans, if judged against those carved into science’s Mount Rushmore. Glory would not be glorious if it was common. But maybe we have it wrong if we think the also-rans felt demeaned by their also-ranness. Maybe Dr. Alim or Dr. Hadek or Sister Dr. Taylor enjoyed their brush with greatness. And maybe, just maybe, they were satisfied with lives well lived in service to science and mankind.
Monday, June 23, 2014
I recently read Dave Eggers’ brilliant and frightening novel The Circle, his modern take on Orwell. In 1984 the totalitarian state was the enemy of freedom, a hard tyranny motivated by the state’s desire to crush all opposition. Eggers thinks we are headed for a much softer tyranny, one dominated by do-gooder Silicon Valley business types determined to eliminate—indeed, undermine the justification for—any attempt at privacy. 1984 has come and gone; the world of The Circle careens towards us at high speed.
The sphere of privacy continues to shrink. If it even is a sphere anymore; sometimes it seems shriveled to a point that is no longer three dimensional, almost a bodiless mathematical concept. Hardly a day goes by without some new assault, or potential assault, finding its way into the news. Some examples are from the scientific literature, and some from current politics, but the vector is always the same: towards an era where privacy ceases to exist.
Much of this is technology-driven. That technology affects privacy rights is nothing new. Previous generations had to deal with wiretapping and listening devices, after all. But this seems both quantitatively and qualitatively different.
Here are a few I’ve collected in the last year. What impresses is the sheer variety of attacks on privacy, coming from every degree of the compass:
· An article in PNAS demonstrates that your Facebook "likes" could do a pretty good job of defining your sexual orientation (85% predictive ability for male homosexuality), race, and political identity.
· The FBI recently announced that it had deployed drones on American soil four times in recent years. Police departments are considering doing the same in American cities, supplementing the now ubiquitous CCTV cameras.
· A group of Harvard engineers recently created a quarter-sized “fly”, a robotic insect capable of controlled flight (for a first look at the future look at http://www.youtube.com/watch?feature=player_embedded&v=cyjKOJhIiuU). Give it some “eyes”, deploy a few hundred, and then imagine some hovering outside your bedroom window. Flutter, flutter, buzz, buzz. Is that a mosquito I just slapped, or some investigator's eyeballs?
And while I could not exist anymore without Google, the company has been the great destroyer of privacy. If you want to know someone's life history, it is probably out there somewhere on a server farm. Try a little experiment: Starting with a colleague's name, see how much personal information you can find in an hour. Next, imagine how much of that information you could have found in an hour a decade or two ago. You would be amazed.
Maybe that's a good thing: it depends on who's searching, and why. If you don’t want to date a convicted felon, going online is a good way to avoid that particular mishap. But it is, of course, much more than just that. Every little thing about you can end up out there. When I was a teenager we used to laugh at school administrators who claimed that some minor infraction would “become part of your permanent record.” Not anymore: the record is not only permanent, it is global. In some important way there are no second chances anymore.
Google (don't be evil) is responsible for other transgressions. We now know that Google StreetView collected more than pictures of your front door: they went behind the door to collect passwords, email, and medical and financial records, as the company admitted in court. The court fined the company a half-day of its profits, which I suppose tells you how much the courts value your privacy. Google also promised to go and sin no more, which I know will reassure us all. And they will now require employees to take the computer industry equivalent of a HIPAA course. All right, I admit, cruel and unusual punishment.
These Internet assaults are layered on top of other privacy intrusions: ubiquitous CCTVs following you down city streets, your cell phone movements describing where you have travelled with great precision, your credit card history demonstrating a financial phenotype, your checkout at the grocery store generating targeted advertising on the back of the receipt, and your web history now the technological equivalent of a trash can, to be sorted through by private investigators digging for dirt. Sometime in the last decade privacy became a fossil word. The expectation of privacy is disappearing from public discourse.
About the only data that doesn't want to be free is personal health information, and that only because accessing it is a felony. How long before technology effectively eliminates medical privacy? For the moment, at least, medicine remains Privacy Island. Sort of, anyway: all around us the sharks circle, waiting to strike. Don’t step too far off the shore.
· An article in Science showed that one could, by merging publicly available “de-identified” data from the 1000 Genomes project with popular online genealogy data, re-identify the source individuals.
· If a patient accesses a free online health website for information on “cancer” or “herpes” or “depression”, according to Marco Huesch of USC, that information will, on average, be shared with six or seven third-party elements.
I used to think that conspiracy theorists were crackpots, with their ideas of the government tracking us via RFID-encoded greenbacks through airport security checkpoints. It turns out that the crackpots were right, though wrong about the purveyor (the TSA? Seriously? The dysfunctional guys who still can't stop people from carrying box cutters through security checkpoints?).
It’s much closer to home than that, as I’ve recently discovered. Hospitals are taking a deep dive into something called RTLS, or Real Time Location Services. I found out about this technology (as usual, I am the last to know) through service on a committee overseeing the building of a new hospital. RTLS places RFIDs on every large object (in RTLS-speak, “assets”) as well as on patients and hospital employees. It can then localize all of these to within three feet, via carefully placed receivers scattered throughout the building. Several companies (Versus, CenTrak) have sprung up to commercialize this technology.
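The localization trick at the heart of RTLS is, at bottom, old-fashioned geometry: if three fixed receivers each know their distance to a tag, the tag's position falls out of a small system of equations. The sketch below is purely illustrative; the actual products mentioned here (Versus, CenTrak) use their own proprietary RF ranging and infrared methods, and the receiver coordinates and distances are made-up numbers.

```python
# Toy 2-D trilateration: estimate a tag's position from its distances
# to three fixed receivers. Illustrative only -- commercial RTLS
# products use their own proprietary ranging methods; this just shows
# the geometry involved.
def trilaterate(p1, p2, p3, d1, d2, d3):
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting the circle equations pairwise cancels the quadratic
    # terms, leaving a 2x2 linear system in (x, y).
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - a12 * b2) / det, (a11 * b2 - b1 * a21) / det)

# A tag at (3, 4) in a room with receivers at three corners:
print(trilaterate((0, 0), (10, 0), (0, 10), 5.0, 65**0.5, 45**0.5))
# -> (3.0, 4.0)
```

Three-foot accuracy, in other words, is just a matter of ranging precision and how densely the receivers are scattered through the building.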
These devices have real uses. They keep large, expensive pieces of equipment from walking out the door, or at least allow you to track which door they walked through. Recently a hospital in Munster, Indiana cared for the first U.S. patient with MERS. Because hospital employees were tracked through Versus RTLS technology, the hospital was able to hand CDC investigators a list of everyone who had come in contact with the patient. Sounds great, right? An epidemiologist’s dream.
Perhaps. The cynic in me says that sometime in the near future some hospital administrator will use the technology to identify, and chastise, employees who spend too much time in the bathroom. Administrators just can’t help themselves.
RTLS is part of the larger “Internet of Things”, or IoT as it has come to be called. A recent article in The Economist suggests that by 2020 some 26 billion devices will be connected to the Cloud. Consider IoT’s potential. Will I end up with a brilliant toaster, one that knows exactly how long to brown my bread? A mattress that diagnoses sleep apnea? A lawn mower whose handle sends my pulse and blood pressure to my internist? A microwave oven that refuses to zap hot dogs when I go over my fat gram budget for the day? Maybe I’ll be healthier. I’ll certainly be crankier: the soft tyranny of the IoT will drive me off the deep end in a hurry.
The bottom line, though, is this: we are entering an era of ubiquitous monitoring, and it is here to stay. I wish I was (just) being paranoid. Here are the words of a recently released government report entitled Big Data: Seizing Opportunities, Preserving Values: “Signals from home WiFi networks reveal how many people are in a room and where they are seated. Power consumption data collected from demand-response systems show when you move about your house. Facial recognition technologies can identify you in pictures online and as soon as you step outside. Always-on wearable technologies with voice and video interfaces and the arrival of whole classes of networked devices will only expand information collection still further. This sea of ubiquitous sensors, each of which has legitimate uses, make the notion of limiting information collection challenging, if not impossible.”
Discussions on privacy are frequently couched in terms of crime or terrorism: wouldn't you prefer to give up another bitty little piece of your soul, a loss you would hardly notice, to avoid another 9/11 or even an armed robbery? The security state, adept at guarding or criminalizing the release of information harmful to its reputation, cannot imagine its citizens deserving or even wanting the same protections. One is reminded of Ben Franklin’s prescient statement: “They who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety.”
But government intrusions are becoming almost irrelevant. I doubt the TSA or NSA or FBI or CIA or any other of the alphabet soup of government agencies really cares very much about my Internet history. It is, rather, the Pacific Ocean of data being collected, and the certainty that it will be shared with, well, everyone. Remember the words of that government report? “This sea of ubiquitous sensors, each of which has legitimate uses, make the notion of limiting information collection challenging, if not impossible.” Challenging, if not impossible: I’ll opt for impossible.
And will we care? I doubt it, so long as the data theft remains quietly in the background, being used for purposes no more nefarious than designing advertising strategy or predicting sales.
Monday, May 26, 2014
Over time I have accumulated numerous defects. When I was four years old I hypothesized that I could balance on top of a large beach ball. I was wrong, and in proof of the null hypothesis received several stitches over my left eyebrow. When I was in fifth grade I broke a finger playing football. Six weeks later the splint came off. I went outside that same afternoon, played football, and broke another finger on the same hand. My GP looked at me with mild disgust and said, "You clown," having used up all his compassion with the first fracture. The following year I assaulted a plate glass window with my left knee, leaving another scar.
In high school I spent a day helping my dad put shrubs and trees--a veritable forest of shrubs and trees--into our back yard, after which my back went into spasm. I have never had it imaged, but I am pretty sure I herniated a disc. Every two or three years it reminds me of that day in my teens. Joyce Kilmer obviously never spent an afternoon planting trees, or he would have found them less charming.
I was blessedly free of further physical insults for nearly a decade. In my fellowship I had a varicocele ligation: more scars. Early on as an Assistant Professor I found that I had lattice degeneration, a precursor lesion for retinal detachment. I was told not to worry about it unless I found myself going blind suddenly, and not to play football without a helmet. I haven't gone blind yet, but my football days are over. On a recent exam I was told that I had early cataracts. And, by the way, I do not appreciate being labeled a degenerate by an ophthalmologist.
And don't even get me started on my teeth. A bunch of my adult teeth never came in, a condition known as hypodontia. I kept several of my baby teeth well into my 40s before they all eventually fell out, to be replaced over time with dental implants. According to Wikipedia, hypodontia is statistically linked to epithelial ovarian cancer, which I guess is interesting. As to etiology, Wikipedia says “the condition is believed to be associated with genetic or environmental factors during dental development.” Duh. Anyways, some defects you accumulate, some you are born with.
I've accumulated a number of benign neoplasms and the like: nevi, warts, a few small colon polyps (resected times 2 per colonoscopy). I think I have an early actinic keratosis, some lichen planus, and a few other nonthreatening-looking skin nodules that I am sure are named after someone or have some Latin name known only to dermatologists, their verbal equivalent of a secret handshake.
A couple of winters ago, on Christmas Eve, I ran into a pillar in the foyer of my house: more stitches, this time above my right eyebrow, more scar tissue. Shortly after that I caught my heel on a splinter while climbing up the basement stairs. I hobbled around for a week or two, self-medicating, until I had sufficient scar tissue to regain unhampered mobility.
Of course it's the defects you accumulate internally that you worry about. Once you get past the skin or gastrointestinal mucosa it's hard to detect them with ease. My cholesterol is up there without a statin, and my glucose level has suggested from time to time that I may be prediabetic. Coronary arteries, carotid arteries, pancreas? Which one is lying in wait to do me in? The problem with getting older is that you take increasingly frequent journeys to Comorbidity Island, accumulating defects like some crazy recluse collects cats.
Below the tissue level, the medical oncologist in me always worries about the accumulation of genetic defects. Our DNA is under constant assault: one paper I read, looking at a mouse model, estimated somewhere in the range of 1500 to 7000 DNA lesions per hour per cell. Most of them don’t take: we’re fitted out by nature with excellent DNA damage repair mechanisms.
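Scale that hourly estimate up and the numbers get alarming in a hurry; a back-of-envelope conversion (using the mouse-model range quoted above) makes the point:

```python
# Back-of-envelope scaling of the mouse-model estimate quoted above:
# 1,500 to 7,000 DNA lesions per cell per hour, converted to a
# per-cell, per-day range.
low_per_hour, high_per_hour = 1500, 7000
print(low_per_hour * 24, high_per_hour * 24)  # -> 36000 168000
```

Tens of thousands of lesions per cell, every day. Small wonder we come equipped with a repair crew.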
Still, it gives you pause. Some of those mutations slip through. Over time they add up. Some combination of mutations in some cells eventually results in cancer, as we all know. But there is also a theory of aging (there are lots of theories of aging) implicating the progressive accumulation of genetic defects in the loss of organ function as we grow older. Take your muscles, for instance. The older you get, the more mitochondrial DNA mutations you accumulate. At some point, muscle fibers lose cytochrome C oxidase activity due to these mutational defects. Your muscles rot.
This accumulation of mutational defects is quite organ-specific, at least in mice: Dolle and colleagues, writing in the PNAS over a decade ago, showed that the small intestine accumulated only point mutations, whereas the heart appears to specialize in large genome rearrangements. They spoke of “organ-specific genome deterioration,” a term that does not give me warm and fuzzy feelings.
And these mutations start accumulating at a very early point in life. Li and colleagues at McGill studied identical twins in a recent Journal of Medical Genetics paper. Identical twins are that great natural experiment that allows us to ask “just how identical is identical?” Looking at genome-wide SNP variants in 33 pairs of identical twins, they estimated a mutation frequency of 1.2 × 10⁻⁷ mutations per nucleotide. This may not seem like much, but given three billion base pairs in the human genome, they concluded “it is likely that each individual carries approximately over 300 postzygotic mutations in the nuclear genome of white blood cells that occurred in the early development.”
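The arithmetic behind that conclusion is simple enough to check on a napkin, or in two lines:

```python
# Sanity check of the twin-study arithmetic quoted above:
# the estimated postzygotic mutation frequency per nucleotide,
# multiplied by roughly three billion nucleotides in the genome.
mutation_freq = 1.2e-7   # mutations per nucleotide (Li et al. estimate)
genome_size = 3e9        # base pairs, approximate
print(round(mutation_freq * genome_size))  # -> 360
```

Call it 360 mutations, which squares with the paper's “approximately over 300.”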
Our cells are all potentially mutinous traitors, held in check only by the determined efforts of the molecular police force. We carry the seeds of our ultimate demise from birth.
Well, that’s all very cheery, you say. We all know that we grow old, and then we die, and the growing old part is only if we are lucky. This isn’t exactly late-breaking news. Applying a molecular mechanism to mortality is interesting, but my hair is still getting grayer by the day. Do something!
And something cool they are doing. Villeda and colleagues just published a paper in Nature Medicine looking at cognitive function in old mice. Mice, like humans, get duller as they get older. But give the old mice a transfusion of plasma from young mice, and you reverse the age-related cognitive decline. This has a somewhat vampirish sound to it: the old guy rejuvenating after sucking the life force out of some sweet young thing. But apparently Hollywood got it right.
Before I start lining up medical student “volunteers” for blood transfusions, can we drill down a bit on what is actually happening? Two new papers in Science suggest that the “young blood” factor responsible for the cognitive improvement is GDF11, a circulating member of the TGF-β family. GDF11 increases neurogenesis and vascular remodeling in the brain. It also reverses cardiac hypertrophy in old mice. Unsurprisingly, the lead investigators are already speaking to venture capitalists to commercialize the findings.
It’s always a leap from mouse to man, and certainly a danger to assume that a single molecule will be a sovereign cure for all our ills. But I’ll sign up for the Phase II trial. I could use some neurogenesis right now.
Nothing in life is free, of course. The dentate gyrus, part of the hippocampus, generates neurons throughout adult life, and these adult neurons are incorporated into pre-existing neural networks. That’s a good thing. But it comes at a price: a paper just published in Science also shows that adult hippocampal neurogenesis promotes forgetting.
So, the dilemma we may face a few years from now: you are starting to lose your memory. You take GDF11 or whatever the very expensive synthetic equivalent is called in 2024. It promotes neurogenesis and you get a new lease on life. But you forget your spouse’s birthday in the process, and maybe some of what makes you “you.” Is the new “you” really you any more?
One is reminded of the Ise Grand Shrine in Japan. This beautiful Shinto wooden temple is rebuilt to the same design every 20 years, using trees from the same forest. Is the 2014 temple the same temple as the 1014 temple? Last year represented the 62nd time the temple was rebuilt. Ise is forever new, forever ancient. Maybe we will be too.
But I doubt it. We’ll still keep accumulating defects. And, being a drug, GDF11 will do something bad and unexpected. But it should be fun to find out.