Current technology in the form of wearable health-monitoring devices, such as the Apple Watch, can collect a vast volume of health data, such as heart rate and blood pressure. The most recent version of the Apple Watch features an application that can generate an electrocardiogram to be shared with a doctor at any time.16 Although these advances can be used for personal purposes, they will also contribute an immense amount of data to researchers. For instance, Stanford Medicine has partnered with Apple to use the data collected via the Apple Heart Study application to identify and notify patients of irregular heart rhythms.17 Not only will this study alert participants to potentially serious heart conditions, but it will also contribute a large amount of data to be used as “big data.”17 Another large source of big data is the electronic health record (EHR). The intended purpose of EHRs is to assist in the rapid retrieval of information and to augment skills such as patient–physician communication.18–20 Although EHRs can theoretically provide instant and potentially organized access to patient information, the vast amount of information embedded in these digital records is often difficult to sort and apply expediently. Thus, the integration of big data analytics, such as artificial neural networks (ANNs), could provide rapid analysis of data and an output that is coherent and meaningful.7 ANNs are computing systems that mimic the biological neural networks of the human brain, processing signals and making connections within data.7 ANNs use an algorithm to quantify and organize a set of data to recognize patterns that would otherwise be missed by a human.7 Figure 3 demonstrates the structure of an ANN, showing the input, hidden, and output layers. The input layer represents the data that are inserted into the model, whereas the output layer represents the predictions that are made by the model.
Most importantly, the hidden layer represents the algorithmic layer that recognizes patterns within the data.6 Researchers in the field of biomedicine have already integrated big data analytics to extract, summarize, and interpret knowledge from rapidly generated data, improving the accuracy of predictive modeling.21 Introducing this digital technology to surgery would not only give surgeons quick access to the information stored in EHRs but would also give them insight into any patterns recognized by ANNs to aid their decision-making.
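As a purely illustrative sketch (no clinical system or real data are implied, and the feature values below are invented), the layered structure described above can be expressed in a few lines of Python with NumPy: data enter at the input layer, are transformed by weighted connections in the hidden layer, and emerge as a prediction at the output layer.

```python
import numpy as np

def sigmoid(x):
    # Squash values into (0, 1); a common ANN activation function
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# Input layer: 3 hypothetical, standardized patient features
# (e.g., age, body mass index, resting heart rate)
x = np.array([0.5, -1.2, 0.3])

# Hidden layer: 4 units whose weights, once trained, encode
# patterns in the data; here they are randomly initialized
W_hidden = rng.normal(size=(4, 3))
b_hidden = np.zeros(4)
hidden = sigmoid(W_hidden @ x + b_hidden)

# Output layer: a single prediction (e.g., probability of an event)
W_out = rng.normal(size=(1, 4))
b_out = np.zeros(1)
prediction = sigmoid(W_out @ hidden + b_out)

print(prediction)  # a value between 0 and 1
```

In a real application, the weights would be learned from labeled examples rather than sampled at random; the sketch shows only how a signal flows through the three layers.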
Within the past few years, AI has established a niche in several areas of surgery.7 For example, researchers have investigated the use of robotics and AI to assist surgeons in keyhole neurosurgery.22 Furthermore, a robotic system equipped with AI algorithms was designed to perform ex vivo and in vivo bowel anastomoses.23 This study showed that surgical tasks that require human skills, such as dexterity and cognition, can indeed be programmed and executed by robotic systems.23 Additionally, surgeons have used AI-assisted surgery to suture small blood vessels in a patient with lymphedema.24 The robot, manually controlled by a surgeon, demonstrated its ability to make precise movements and to stabilize tremors in the surgeon’s movements.24 Although these techniques have not yet been applied to plastic surgery, we can readily imagine AI-assisted surgery in the procedures that we perform. As plastic surgeons, it is time for us to leverage our creativity and embrace these technological developments to advance our field and provide better care for our patients.
POTENTIAL CLINICAL APPLICATIONS IN PLASTIC SURGERY
For plastic surgery procedures such as breast augmentation, breast reduction, and breast cancer reconstruction, precision medicine can provide patient-specific risks for breast-related diseases. AI can assist in a decision-making process that incorporates breast cancer prevention, surgical options, and postoperative monitoring. In fact, researchers have already investigated the integration of AI in breast cancer prevention through image detection.25 For example, intelligent imaging has been used to differentiate high-risk from low-risk breast lesions, considering lesion characteristics in addition to patient-specific information, such as family history and environmental factors.25 Furthermore, intelligent imaging has shown a remarkable ability to process vast numbers of images with a reduced rate of error.26,27 For example, in a task detecting metastatic breast cancer in whole-slide images of lymph node biopsies, researchers showed that the integration of machine learning reduced the error rate from 3.4% (predictions of a pathologist alone) to 0.52% (predictions of a machine learning algorithm combined with a pathologist).26 The observed reduction in error rate demonstrated the power of integrating AI to improve diagnostic accuracy.26 However, this technology is not yet routinely used in the assessment of risks in augmentation or reduction mammoplasty patients.
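As a schematic illustration of how two assessments can be combined (this is not the cited study’s actual method, and the scores below are invented), one simple approach is to average a pathologist’s probability estimate with a model’s estimate before applying a decision threshold:

```python
# Hypothetical probabilities that a slide contains metastasis,
# one set from a pathologist and one from a machine learning model
pathologist_scores = [0.9, 0.2, 0.4, 0.8]
model_scores = [0.7, 0.1, 0.6, 0.9]

# Average the two assessments for each slide
combined = [(p + m) / 2 for p, m in zip(pathologist_scores, model_scores)]

# A threshold converts each combined score into a call
calls = ["metastasis" if c >= 0.5 else "benign" for c in combined]
print(calls)
```

The intuition is that the two assessors tend to make different kinds of errors, so their combined score can be more reliable than either alone.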
The recent emergence of breast implant–associated anaplastic large cell lymphoma (BIA-ALCL) as a concern is an excellent example of risk mitigation through this technology.28 In 2008, researchers reported an increased risk for BIA-ALCL in patients with breast implants.28 Previous studies have examined the absolute risk of the disease, ranging from 1 in 3,000,000 to 1 in 500,000 patients with breast implants per year, and its association with textured breast implants.29–31 However, little is known about the disease because of its rarity. AI technology in the form of big data analytics, such as ANNs, could provide a gateway to collect, organize, and distribute data on BIA-ALCL. From 2008 to 2015, the number of unique cases of BIA-ALCL increased from 5 to 173.28 As more information is collected, machines could be used to store these data efficiently in one place, making the information accessible to thousands of plastic surgeons worldwide and providing a better understanding of the disease, including its etiology and risk. Furthermore, ANNs could analyze these data to recognize patterns in the genetic predispositions or environmental exposures of patients with BIA-ALCL to assist in risk stratification. In a task that would take humans months or even years to accomplish, ANNs could quickly identify patients at higher risk of developing BIA-ALCL and guide surgeons early toward the right steps for treatment.
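To make the idea of risk stratification concrete, the following sketch trains the simplest possible single-layer model (logistic regression) on entirely synthetic, invented data and then splits patients into low- and high-risk groups. The feature names in the comments are hypothetical; no real BIA-ALCL dataset is implied.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic, hypothetical patient features: each row might represent
# [implant texture grade, years since implantation, genetic marker score]
X = rng.normal(size=(200, 3))
# Simulated labels: risk rises mainly with the first two features
true_w = np.array([1.5, 1.0, 0.2])
y = (X @ true_w + rng.normal(scale=0.5, size=200) > 0).astype(float)

# Fit the model by gradient descent on the logistic loss
w = np.zeros(3)
b = 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # predicted risk per patient
    grad_w = X.T @ (p - y) / len(y)
    grad_b = np.mean(p - y)
    w -= 0.5 * grad_w
    b -= 0.5 * grad_b

# Stratify patients by predicted probability
risk = 1.0 / (1.0 + np.exp(-(X @ w + b)))
high_risk = risk > 0.5
print(f"{high_risk.sum()} of {len(risk)} simulated patients flagged high risk")
```

An ANN with hidden layers generalizes this by learning nonlinear combinations of the features, but the workflow (fit on labeled cases, then score new patients) is the same.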
Overdiagnosis of breast cancer on screening mammography may also potentially be minimized with machine intelligence.25 Recent studies have used convolutional neural networks (CNNs) as a machine learning model in the context of breast screening.32,33 Although CNNs are more commonly applied to natural images, such as in facial recognition tasks, their ability to learn automatically and distinguish differences among images has recently been demonstrated.34,35 For instance, instead of relying on a predefined definition of a “benign” structure in an image, CNNs learn the differences between benign and malignant cells to distinguish between the two.35 However, with the overall success of CNNs in detecting cancerous cells, critics are concerned that AI technology will replace certain healthcare jobs in the near future.36 Although this is a possibility, AI technology is meant not to replace but simply to augment the skills of a physician.5 Not only does this technology improve overall efficiency and productivity, but its automatic learning ability can also mitigate human error in its assessments.35
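The core operation inside a CNN is the convolution: a small filter slides over the image and responds wherever a particular local pattern appears. The sketch below (a toy example with an invented 6×6 “image,” not a screening mammogram) applies one hand-written edge-detecting filter; in a trained CNN, the filter values themselves are learned from labeled examples, which is why no structure needs to be predefined.

```python
import numpy as np

def conv2d(image, kernel):
    # Slide the filter over the image: the core CNN operation
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A toy 6x6 "image" containing a bright square
# (a stand-in for a cluster of cells)
image = np.zeros((6, 6))
image[2:4, 2:4] = 1.0

# A vertical-edge filter; a trained CNN learns such filters
# from data rather than having them specified by hand
kernel = np.array([[1.0, -1.0],
                   [1.0, -1.0]])

feature_map = conv2d(image, kernel)
print(feature_map.shape)  # (5, 5)
```

The resulting feature map is large (positive or negative) exactly at the edges of the square, illustrating how stacked convolutional layers can build up from edges to textures to whole lesions.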
Treatment decisions regarding wound care require an assessment of wound characteristics, such as size and site, along with patient-specific factors, including skin type, genetic information, and lifestyle. The ability to create an effective and precise surgical plan in a reasonable amount of time is integral in achieving optimal outcomes for patients. Although the severity of certain injuries is obvious to the human eye, the use of machine intelligence could make this decision in a quicker, more-efficient manner. By pairing wound images to precise measurements of the patient’s body, a thinking machine could predict the percentage of affected tissue.6 In fact, researchers in Italy have already created a medical device to acquire and process wound images through AI algorithms with 94% accuracy.37 This technology could be further applied to predict wound healing time.38 The advancement of this application will enable surgeons to formulate individualized treatment decisions, avoid wound infection, and improve patient care.
Furthermore, AI-assisted evaluation of computed tomography (CT) angiograms, along with other smart imaging techniques, could aid surgeons in the design of surgical flaps. Current technology requires surgeons to analyze 3-dimensional (3D) images slice by slice, sacrificing time and accuracy in visualizing the internal structure of the human body.39 Whereas radiologists must divide CT angiogram images into 64 different slices for accurate interpretation, a computer could analyze all slices of a 3D imaging study simultaneously.39 Additionally, plastic surgeons generally use the same flap design whenever a specific flap procedure is performed. However, the quick, integrative thinking abilities of AI could aid surgeons in a design approach that is customizable to each individual patient.
Wound infections at the cutaneous level can spread into bone tissue and cause progressive inflammation of the bone, leading to osteomyelitis.40 Diagnosing osteomyelitis is challenging because of the time that it takes for osseous lesions to become visible on plain radiographs.41 For example, a 30%–50% reduction in bone density must occur before radiographic images can detect any change.41 Therefore, more expensive imaging techniques, such as CT, magnetic resonance imaging, and scintigraphy, have been used for an earlier, more-accurate diagnosis.42 AI-assisted evaluation of radiographic images could shorten the time before osteomyelitis becomes evident on a radiograph. For instance, radiologists have applied temporal subtraction, an element of computer-aided diagnosis, to enhance interval changes between two radiologic images.12 Instead of requiring up to a 50% reduction in bone density, the integration of this technology could increase the sensitivity of radiography to allow an earlier, potentially life-saving diagnosis of osteomyelitis.
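The arithmetic behind temporal subtraction is straightforward: given two registered radiographs of the same region, subtracting the baseline from the follow-up yields a difference image in which stable anatomy cancels out and interval change stands out. The sketch below uses tiny invented pixel arrays, not real radiographs, and a hypothetical 5% change threshold.

```python
import numpy as np

# Two hypothetical radiographs of the same region, taken weeks apart
# and assumed to be already registered (spatially aligned);
# values are pixel intensities in arbitrary units
baseline = np.full((4, 4), 100.0)
followup = baseline.copy()
followup[1:3, 1:3] -= 8.0  # a subtle 8% interval loss of density

# Temporal subtraction: stable anatomy cancels out, so the
# difference image highlights interval change that is hard to see
# when reading each radiograph separately
difference = followup - baseline

# Flag pixels whose density dropped by more than 5% (a made-up threshold)
changed = (difference / baseline) < -0.05
print(int(changed.sum()), "pixels flagged")  # 4 pixels flagged
```

An 8% local change is well below the 30%–50% loss needed for unaided detection, which is why subtraction of a prior study can, in principle, move the diagnosis earlier.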
AI technology could also be applied outside the hospital setting, even in the patient’s own home. For example, diabetic patients now have the option of managing their glucose levels and insulin intake with one simple device.43 From this device, the information enters a digital cloud of data, which can be accessed by their primary care physician without the need for a clinic visit.43 This use of telecommunication technology could be applied to the management of burn wounds. The patient could upload a daily photograph of the wound into the system. These images could then be processed by the computer for changes in size and color, and potentially even analyzed spectrophotometrically to assess tissue oximetry.6,44 Furthermore, precise monitoring through photographs could be applied to other areas of plastic surgery, especially esthetic surgery. Plastic surgeons could use patient selfies to monitor a patient’s face after a procedure, providing the surgeon with real-time updates while saving the patient a visit to the clinic.
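As a minimal sketch of what "processing images for changes in size and color" might look like, the code below compares two invented wound photographs by measuring wound area (pixels inside a segmentation mask) and mean redness. The images, masks, and function name are all hypothetical; in practice the mask would come from a trained segmentation model, not be drawn by hand.

```python
import numpy as np

def wound_metrics(mask, rgb):
    # Summarize one photo: wound area in pixels and mean redness
    # (red-channel intensity) within the wound region
    area = int(mask.sum())
    mean_red = float(rgb[mask, 0].mean()) if area else 0.0
    return area, mean_red

rng = np.random.default_rng(1)

# Hypothetical day-1 photo: a 10x10 image with a 4x4 wound region
day1 = rng.uniform(0.2, 0.4, size=(10, 10, 3))
mask1 = np.zeros((10, 10), dtype=bool)
mask1[3:7, 3:7] = True
day1[mask1, 0] = 0.9  # wound region is red

# Hypothetical day-7 photo: the wound has shrunk to 2x2 and is less red
day7 = rng.uniform(0.2, 0.4, size=(10, 10, 3))
mask7 = np.zeros((10, 10), dtype=bool)
mask7[4:6, 4:6] = True
day7[mask7, 0] = 0.6

a1, r1 = wound_metrics(mask1, day1)
a7, r7 = wound_metrics(mask7, day7)
print(f"area: {a1} -> {a7} px, redness: {r1:.2f} -> {r7:.2f}")
```

Trending these simple metrics over daily uploads is one way a system could flag stalled healing or increasing erythema for the surgeon's attention.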
Craniofacial Surgery: Craniosynostosis
Craniosynostosis is one of the most common craniofacial malformations encountered by plastic surgeons.45 Of the 90 known syndromes with craniosynostosis, approximately 50 have a known genetic basis.45 However, the practice of precision medicine goes beyond the genetic variability among patients. Both genetic and environmental factors are believed to play a role in the pathogenesis of craniosynostosis.46 A major limitation lies in the scarcity of data collected on the outcomes of cases involving craniosynostosis. Through more precise data collection, researchers and surgeons will be able to gain a better understanding of the genetic and environmental influences on this malformation, ultimately improving the surgical care that patients receive.
Historically, image detection methods, such as radiography and ultrasonography, have been applied to confirm the diagnosis of craniosynostosis.47,48 Now, AI techniques could be implemented to integrate these various images to improve and assist in surgical planning. This is particularly true for syndromic conditions, in which recurrent deformities may be more frequent because of the inherent pathophysiology of bone development. Although it has not yet been applied clinically, this technology has already contributed to a novel model of “precision liver surgery.”49 Liver imaging can determine disease etiology and guide surgical interventions through 3D visualization, providing surgeons with the anatomy of the liver, including the location of various vessels and lesions.49 By applying a similar technology to generate a 3D representation of a patient’s skull, AI could allow craniosynostosis surgery to be customized to each individual patient.
Furthermore, plastic surgeons could use ANNs to predict postoperative complications after craniofacial surgery, similar to how ANNs have been used to predict recurrent cardiovascular disease.50 Syndromic craniosynostosis may lead to recurrent cranial abnormalities because bone may continue to grow abnormally despite corrective surgery. In such cases, both AI and precision medicine may improve medical augmentation of surgery to optimize postoperative outcomes. Additionally, image analysis from big data may help to customize cranial remodeling to best accommodate individual children.
Other Applications in Plastic Surgery
The efficient use of anesthesia and postoperative pain management are fundamental factors in optimizing outcomes for surgical patients.51 Although general anesthesia is commonly used in many surgeries, the risks that accompany its use must be considered on a patient-by-patient basis.51 In addition to intraoperative pain management, the relief of postoperative pain has become a vital factor in surgical outcomes.51 Inadequate management of care transitions, including postoperative pain, has been recognized as a common source of wasteful healthcare spending, estimated at $25–$45 billion in 2011.52 Identifying patients who may be at higher risk of visiting the emergency department because of inadequately managed postoperative pain could reduce this wasteful spending through better preoperative education. The same technology could be used to identify patients who are at higher risk for opioid addiction, which is now considered an epidemic in the United States.53,54 AI-assisted technology could use precise patient data, such as prior opioid use, family history of addiction, and tolerance, to stratify the risk involved in opioid use for surgical patients.
AI could also be used to aid in postoperative monitoring. The use of telemedicine to communicate with surgeons digitally can save patients both time and money.55 Home measurements of the patient’s vital signs, in addition to photograph or video documentation of the surgical site, are variables that could be charted readily at home by the patient. The thinking machine could then report these results while highlighting any areas of concern. For instance, the computer could detect changes in the status of closed incisions to monitor for infection.44 Although in-person clinic visits are sometimes required, a daily update from the patient’s own home could provide surgeons with frequent, real-time insight into a patient’s postoperative course.
From the use of intelligent imaging to the integration of big data techniques, computers have begun to reproduce the cognitive skills of physicians with greater speed, efficiency, and precision. However, all technological advancements, including AI, have critics and limitations, especially in the field of surgery. Although precision medicine and AI are emerging topics in medicine, the clinical examples presented in this article are intended simply as a primer for their application. Furthermore, for AI algorithms to make accurate predictions, the data that are input must also be accurate and diverse. The collaboration of researchers and surgeons is imperative to the advancement of this technology in medicine. Thus, surgeons must understand the importance of accuracy and diversity within their own data.7 Furthermore, the uncertainty behind the hidden layer of ANNs is a concern that has arisen among many researchers. Many refer to the hidden layer of an ANN as the “black box.”7 It is hoped that, as we expand our knowledge of AI, we will gain a better understanding of the algorithmic process behind these neural networks.
AI in the form of big data analytics will augment the cognitive decisions of plastic surgeons, whereas AI-assisted surgery will refine their surgical skills, mitigating human error and promoting precision. By advocating for the use of AI in our field, we will truly uphold our reputation as innovators shaping the field of medicine. With more big data, and with policy focused on implementing applications of machine intelligence, the future of AI in plastic surgery is within reach.
1. Marr B. Artificial intelligence (AI) in China: the amazing ways Tencent is driving its adoption. Forbes. 2018.
3. Mesko B. The role of artificial intelligence in precision medicine. Expert Rev Precis Med Drug Dev. 2017;2:239–241.
6. Kanevsky J, Corban J, Gaster R, et al. Big data and machine learning in plastic surgery: a new frontier in surgical innovation. Plast Reconstr Surg. 2016;137:890e–897e.
7. Hashimoto DA, Rosman G, Rus D, et al. Artificial intelligence in surgery: promises and perils. Ann Surg. 2018;268:70–76.
8. Ameer F, Singh AK, Kumar S. Evolution of instruments for harvest of the skin grafts. Indian J Plast Surg. 2013;46:28–35.
9. Siemionow M, Agaoglu G. Tissue transplantation in plastic surgery. Clin Plast Surg. 2007;34:251–269, ix.
10. Hosny A, Parmar C, Quackenbush J, et al. Artificial intelligence in radiology. Nat Rev Cancer. 2018;18:500–510.
11. Jha S, Topol EJ. Adapting to artificial intelligence: radiologists and pathologists as information specialists. JAMA. 2016;316:2353–2354.
12. Shiraishi J, Li Q, Appelbaum D, et al. Computer-aided diagnosis and artificial intelligence in clinical imaging. Semin Nucl Med. 2011;41:449–462.
13. Rajanala S, Maymone MBC, Vashi NA. Selfies-living in the era of filtered photographs. JAMA Facial Plast Surg. 2018;20:443–444.
14. Hopp WJ, Li J, Wang G. Big data and the precision medicine revolution. Prod Oper Manag. 2018;27:1647–1664.
15. Mirnezami R, Nicholson J, Darzi A. Preparing for precision medicine. N Engl J Med. 2012;366:489–491.
18. Chute CG, Ullman-Cullere M, Wood GM, et al. Some experiences and opportunities for big data in translational research. Genet Med. 2013;15:802–809.
19. Adler-Milstein J, Holmgren AJ, Kralovec P, et al. Electronic health record adoption in US hospitals: the emergence of a digital “advanced use” divide. J Am Med Inform Assoc. 2017;24:1142–1148.
20. Taft T, Lenert L, Sakaguchi F, et al. Effects of electronic health record use on the exam room communication skills of resident physicians: a randomized within-subjects study. J Am Med Inform Assoc. 2015;22:192–198.
21. Wu PY, Cheng CW, Kaddi CD, et al. Omic and electronic health record big data analytics for precision medicine. IEEE Trans Biomed Eng. 2017;64:263–273.
22. De Momi E, Ferrigno G. Robotic and artificial intelligence for keyhole neurosurgery: the ROBOCAST project, a multi-modal autonomous path planner. Proc Inst Mech Eng H. 2010;224:715–727.
23. Shademan A, Decker RS, Opfermann JD, et al. Supervised autonomous robotic soft tissue surgery. Sci Transl Med. 2016;8:337ra64.
26. Wang D, Khosla A, Gargeya R, et al. Deep learning for identifying metastatic breast cancer. 2016. Available at https://arxiv.org/abs/1606.05718. Accessed July 17, 2018.
27. Ehteshami Bejnordi B, Veta M, Johannes van Diest P, et al.; The CAMELYON16 Consortium. Diagnostic assessment of deep learning algorithms for detection of lymph node metastases in women with breast cancer. JAMA. 2017;318:2199–2210.
28. de Boer M, van Leeuwen FE, Hauptmann M, et al. Breast implants and the risk of anaplastic large-cell lymphoma in the breast. JAMA Oncol. 2018;4:335–341.
29. Ye X, Shokrollahi K, Rozen WM, et al. Anaplastic large cell lymphoma (ALCL) and breast implants: breaking down the evidence. Mutat Res Rev Mutat Res. 2014;762:123–132.
30. Doren EL, Miranda RN, Selber JC, et al. U.S. epidemiology of breast implant-associated anaplastic large cell lymphoma. Plast Reconstr Surg. 2017;139:1042–1050.
31. Loch-Wilkinson A, Beath KJ, Knight RJW, et al. Breast implant-associated anaplastic large cell lymphoma in Australia and New Zealand: high-surface-area textured implants are associated with increased risk. Plast Reconstr Surg. 2017;140:645–654.
32. Geras KJ, Wolfson S, Shen Y, et al. High-resolution breast cancer screening with multi-view deep convolutional neural networks. 2017. Available at https://arxiv.org/abs/1703.07047. Accessed July 17, 2018.
33. Arevalo J, González FA, Ramos-Pollán R, et al. Representation learning for mammography mass lesion classification with convolutional neural networks. Comput Methods Programs Biomed. 2016;127:248–257.
34. Parkhi OM, Vedaldi A, Zisserman A. Deep face recognition. Paper presented at: British Machine Vision Conference (BMVC); 2015.
37. Farina M, Secco J. Live demonstration: 3D wound detection & tracking system based on artificial intelligence algorithm. Paper presented at: IEEE Biomedical Circuits and Systems Conference (BioCAS); 2017.
38. Robnik-Sikonja M, Cukjati D, Kononenko I. Comprehensible evaluation of prognostic factors and prediction of wound healing. Artif Intell Med. 2003;29:25–38.
39. Hoe JW, Toh KH. A practical guide to reading CT coronary angiograms–how to avoid mistakes when assessing for coronary stenoses. Int J Cardiovasc Imaging. 2007;23:617–633.
40. Groll ME, Woods T, Salcido R. Osteomyelitis: a context for wound management. Adv Skin Wound Care. 2018;31:253–262.
41. Pineda C, Vargas A, Rodríguez AV. Imaging of osteomyelitis: current concepts. Infect Dis Clin North Am. 2006;20:789–825.
42. Pineda C, Espinosa R, Pena A. Radiographic imaging in osteomyelitis: the role of plain radiography, computed tomography, ultrasonography, magnetic resonance imaging, and scintigraphy. Semin Plast Surg. 2009;23:80–89.
43. Contreras I, Vehi J. Artificial intelligence for diabetes management and decision support: literature review. J Med Internet Res. 2018;20:e10775.
44. Belle A, Thiagarajan R, Soroushmehr SM, et al. Big data analytics in healthcare. Biomed Res Int. 2015;2015:370194.
45. Jabs EW, Müller U, Li X, et al. A mutation in the homeodomain of the human MSX2 gene in a family affected with autosomal dominant craniosynostosis. Cell. 1993;75:443–450.
46. Barik M, Bajpai M, Das RR, et al. Study of environmental and genetic factors in children with craniosynostosis: a case-control study. J Pediatr Neurosci. 2013;8:89–92.
47. Rozovsky K, Udjus K, Wilson N, et al. Cranial ultrasound as a first-line imaging examination for craniosynostosis. Pediatrics. 2016;137:e20152230.
48. Regelsberger J, Delling G, Helmke K, et al. Ultrasound in the diagnosis of craniosynostosis. J Craniofac Surg. 2006;17:623–625; discussion 626.
49. Dong J, Qi X. Liver imaging in precision medicine. EBioMedicine. 2018;32:1–2.
50. De Beule M, Maes E, De Winter O, et al. Artificial neural networks and risk stratification: a promising combination. Math Comput Model. 2007;46:88–94.
51. Mustoe TA, Buck DW, Lalonde DH. The safe management of anesthesia, sedation, and pain in plastic surgery. Plast Reconstr Surg. 2010;126:165e–176e.
52. Kocher KE, Nallamothu BK, Birkmeyer JD, et al. Emergency department visits after surgery are common for Medicare patients, suggesting opportunities to improve care. Health Aff (Millwood). 2013;32:1600–1607.
53. Stanley SS, Hoppe IC, Ciminello FS. Pain control following breast augmentation: a qualitative systematic review. Aesthet Surg J. 2012;32:964–972.
Copyright © 2019 The Authors. Published by Wolters Kluwer Health, Inc. on behalf of The American Society of Plastic Surgeons.
55. Perednia DA, Allen A. Telemedicine technology and clinical applications. JAMA. 1995;273:483–488.