The other day in clinic I had a patient with no fingerprints, and it was my fault. I had prescribed capecitabine for her metastatic breast cancer, and she was responding to therapy, but she had developed ‘hand-foot syndrome’ as a side-effect, and as a consequence the pulps of her fingers had become pink, puffy, shiny, and lacked their normal ridges: she had no fingerprints.
I joked with her that now was the time to take up bank robbing, had she a yen to drop her day job. ‘No, really’, she said, ‘it's hard to open bottles and jars. You don't know how important they are until you lose them.’ She's willing to put up with the loss of her fingerprints, because the drug is keeping the wolves at bay, but that doesn't mean she likes it.
She had, in a sense, lost part of her self, part of her identity, part of what makes us unique as human beings. Not special, perhaps: there are more special things about us than our fingertips; but unique. And, as she had discovered, that loss of uniqueness was accompanied by a loss of function, one of the many small indignities inflicted on cancer patients.
I've always wondered why we need those ridges on our fingertips (the proper term is ‘friction ridges’ or alternatively ‘epidermal ridges’). Did the extra traction they provide have some evolutionary advantage to our spear-chucking ancestors? I suspect my patient would support this thought, and nature searches for even the smallest of survival advantages.
In addition to improving our ability to grip things, those finger ridges also amplify vibrations, improving transmission to sensory nerves. We are the most tactile species on the planet, bar none, and our epidermal ridges play their small part in this successful adaptation.
I have had a genuine interest in, or at least a curiosity about, fingerprints dating back several decades. For most of the 20th century, fingerprints were the principal means by which we identified evildoers, and fingerprinting entered the popular consciousness. We say, in discussing the results of someone's actions, ‘it has his fingerprints all over it’. Even now, when molecular profiling has become such an important part of crime scene investigation (and of medicine) we speak of ‘DNA fingerprinting’. Fingerprinting is a common metaphor, immediately understood.
Fingerprints were originally imported from the East, where they were used as an identifier on business documents four millennia ago in Babylon. Fingerprinting became a criminal investigative technique in the late 19th century.
The world's first fingerprint bureau was established in Argentina in 1892. In its day it was the equivalent of today's molecular profiling: a wondrous new scientific technique that revolutionised identification.
One of its first effects, similar to current molecular profiling, was to reverse travesties of justice. Prior to fingerprinting, innumerable innocents were sent to jail based on faulty identification. A bloody fingerprint found on a weapon could not only point to a criminal: it could exonerate suspects.
Current molecular profiling has had a similar effect, releasing hundreds of innocents from jail. In most of the cases where molecular profiling has exonerated the falsely accused, individuals rotted in a cell for years based on faulty witness identification. Our memories of traumatic events are notoriously faulty: true a hundred years ago, true today.
But so, it turns out, is fingerprint identification. Standard fingerprint identification has sent people to jail, only to have molecular fingerprinting overturn the verdict. Our fingerprints may all be different, but sometimes the differences are subtle, and open to interpretation and error. The assumption that fingerprint identification is foolproof is not a valid one.
What makes fingerprints so valuable is their (relative, imperfect, incomplete) uniqueness, their ability to identify an individual. Fingerprints are, in essence, the phenotypic expression of a complex genetic identity.
Or epigenetic identity. Here is a fascinating question: do monozygotic (identical) twins have identical fingerprints? The answer is no, not even at birth. Epigenetic events, produced by the interaction of the foetus with amniotic fluid, superimposed on a genetic background, make each of us unique. Monozygotic twins have fingerprints that are more alike than the fingerprints of strangers, but they are never quite identical.
And, of course, this is only the beginning: we all acquire epigenetic alterations over time. Twins are far more different when they die than when they are born, or that is the lesson taught by their epigenome: the world remakes us, time changes us all. Our uniqueness, and our specialness, comes from our life experience as much as from our genetic programming.
I was briefly interested in fingerprints many years ago from a scientific standpoint. The scientific study of fingerprints is called dermatoglyphics (‘derma’=skin, and ‘glyph’=carving), and there is an extensive literature devoted to it. Altered fingerprint patterns abound in genetic disorders such as Trisomy 21, Turner's Syndrome, and Klinefelter's Syndrome, and fingerprint changes have been associated with a substantial number of medical conditions.
Publications from the 1980s suggested that breast cancer patients had a higher prevalence of specific fingerprint patterns, though the data were all over the map: more whorls in two papers, more ulnar loops in the left hand (I kid you not) in another.
I wrestled briefly with the idea of doing something in the area, but could never figure out what it should be, and of course BRCA testing is a more reliable indicator of genetic risk than fingerprinting. Perhaps, though, with the identification of the epigenetic underpinnings of dermal ridge formation, and the increasing interest in epigenetic changes as early carcinogenesis events, it is time to revisit the question. Maybe (and I offer this as a hypothesis on which some young investigator can waste a few years) we can figure out carcinogenesis by understanding a patient's epidermal ridges. Or not.
Back to my clinic. It's funny, but I prescribed capecitabine for years before I noticed its effect on fingerprints (as distinct from hand-foot syndrome generally, of which fingerprint loss is a part). Then someone pointed it out to me, and now I see it all the time. Goethe wrote that ‘one sees what one knows’, and that certainly was the case with me.
The patient who lost her fingerprints is a schoolteacher, and loves teaching her kids about science. She had us cut slides off her tumour blocks so that she could show the children in her class what a cancer looks like under the microscope, a degree of personal involvement in education that I find admirable, as I do her willingness to show up for class every day while battling metastatic breast cancer.
Teachers are under-appreciated by society. We hear totally moronic expressions like ‘those who can, do; those who can't, teach’, as if teaching were easy. These expressions are now accompanied by a significant degree of negative political discourse, blaming teachers for all the ills of society, and accusing them of personal greed, as if they were investment bankers at Goldman Sachs.
Few things in life are more important, and less well rewarded, than the education of the young. My experience in the clinic, for what it's worth, is that teachers are a highly practical and routinely courageous group when it comes to dealing with cancer.
At some abstract level I can tell that my patient finds all this fingerprint stuff amusing. At the same time, she wants nothing so much as a life where she can look down at her hands and not be reminded, every single day, that she has a fatal disease requiring continuous administration of toxic chemicals to maintain her besieged individuality.
My patient wants her everyday uniqueness back, wants the comforting normality of those epidermal ridges. Some day I hope we can offer her that.