Benefits of Projecting Body Surface with AR Technology
In this study, we examined whether AR technology is useful for improving evaluation of the body surface. Improvement of the body surface is judged both by the surgeon's visual observation and by quantitative evaluation based on examination data. The purpose of this study was to provide referential information for the former technique.
In all cases in this study, the postoperative body surface contour fortunately almost coincided with the ideal postoperative image, so no additional procedures were performed after projection. However, if obvious differences are found between them, that information is valuable for deciding whether to perform additional procedures.
With the development and decreasing cost of 3D printing technology in recent years, many reports have described the use of 3D models for preoperative planning and intraoperative reference.8 With AR technology, however, 3D data can be visualized directly in the surgical field without physically creating a model. Furthermore, the images can be compared with the corresponding surgical field in a completely overlapping state. The output of current 3D printers is practically limited to one or a few colors. Because AR technology presents the referential data directly in the surgical field, however, the operator can select the necessary data display methods even during an operation.
Effective Display Method for Comparison of Body Surface
At the beginning of this study, the body surface image was displayed in a single color, and the visual information was limited to the form alone. In practice, we felt less discomfort when using the 3D image with a fine photograph-based texture than when using the single-color 3D image, and we could concentrate on evaluating the form. Many landmarks on the body surface, such as the eyebrows, closed eyelids, and red lips, are delineated by color differences even though their shapes are somewhat continuous with the surroundings. With a single-color display, these landmarks are lost.
When a large difference was observed between the actual facial surface and the superimposed simulated image, the difference could be visually recognized through the AR device. When the difference was small, however, it was difficult to recognize. In the present study, it was easier to visually confirm the difference in small deformations by shifting the image laterally rather than superimposing it onto the surgical field.
By devising a more advanced display method, such as extraction of the outline of the 3D image, it may become easier to compare the differences in small deformations even if the model is superimposed on the surgical field. This will be examined in the future (Fig. 9).
Previous reports have described the necessity of devising an image display method in AR systems that allows for navigation of deep organs.9,10 However, even for body surface evaluation, it is desirable to flexibly change the display method for each case in addition to simply superimposing the images.
Will Quantitative Evaluation of the Body Surface Using the AR System Become Possible?
It is currently possible to quantitatively evaluate improvements in the body surface, even during surgery, by photographing the surgical field with a 3D imaging system.11 However, frequent evaluation is difficult because a person other than the surgeon must perform this evaluation, and about 10–20 minutes is required. If the workflow can be semi-automated and linked with the AR system in the future, the system will be able to quantitatively evaluate changes in the body surface and project them onto the surgical field with visualization.
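If such a workflow were semi-automated, its core computation would be a distance comparison between two point clouds (e.g., an intraoperative 3D scan versus the simulated target surface). The following is a minimal sketch of that comparison only, not the cited imaging system; it assumes both surfaces are available as numpy point arrays already expressed in a common coordinate frame, and the function name is illustrative:

```python
import numpy as np
from scipy.spatial import cKDTree

def surface_deviation(measured, reference):
    """Distance from each measured point to its nearest reference point.

    measured: (N, 3) array of 3-D points from the intraoperative scan.
    reference: (M, 3) array of 3-D points from the simulated surface.
    Both are assumed to be in the same coordinate frame. Returns the
    mean and maximum nearest-neighbor distance, a crude proxy for how
    closely the two surfaces agree.
    """
    tree = cKDTree(reference)        # spatial index over the reference surface
    dists, _ = tree.query(measured)  # nearest-neighbor distance per point
    return dists.mean(), dists.max()
```

For projection back into the surgical field, a color map of the per-point distances would likely be more informative than these summary numbers.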
In this study, we used fiducial markers for registration of the projected images. Although the recognition speed of the markers was good, the accuracy was low, and an average deviation of 3–4 cm occurred between the surgical field and the corresponding image. Although it was possible to manually correct the deviation, such correction should ideally be unnecessary. Possible causes of deviation include the performance of the original registration program, the resolution of the camera built into the device, the processing speed of the device, and similar factors.
Registration can also be performed using an infrared stereo camera and several optical markers, which are often used in conventional navigation systems.5,6,12 Although the accuracy is high, time is required to perform the initial calibration or resetting when the marker is displaced.
Suenaga et al.13 described a markerless registration method that involved recognizing the contour of the teeth. This would serve as an excellent method if accurate stereoscopic information of the teeth can be obtained before the operation. In other reports, the stereoscopic information of the surrounding environment was recognized and aligned using simultaneous localization and mapping technology.14 Still other reports have described acquiring stereoscopic data of the body surface in real time, matching it with data obtained from computed tomography, and aligning it.15 This technique may be more useful than the marker method when targeting the body surface.
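Whatever the source of correspondences (fiducial markers, tooth contours, or surface matching), the final step in these approaches is typically a rigid alignment, i.e., a rotation and translation, between paired 3-D points from the two modalities. As an illustration of that step only (not the method of any cited system), the standard SVD-based least-squares fit, assuming point correspondences are already known:

```python
import numpy as np

def kabsch_align(src, dst):
    """Estimate the rigid transform (R, t) mapping src onto dst.

    src, dst: (N, 3) arrays of corresponding 3-D points.
    Returns rotation matrix R and translation vector t such that
    dst ~= src @ R.T + t in the least-squares sense.
    """
    src_c = src.mean(axis=0)
    dst_c = dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)     # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t
```

When correspondences are not known in advance, as in the surface-matching reports, this fit is usually run inside an iterative closest point loop that alternates nearest-neighbor matching with re-estimation of (R, t).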
A System Amenable to Trial and Error, and Future Prospects
Initially, the use of AR technology required independent development of a complete system, including both the AR device and the registration program, which was time-consuming and costly.16 In recent years, program libraries that enable the realization of AR systems have gradually been opened to the public, and devices that support or are specialized for AR technology have become readily available to individuals and developers.
In this study, we constructed an AR system from existing devices and free software and modified the system based on the findings gained in each case. None of the changes were technically sophisticated (e.g., modification of the model display method); however, each change greatly affected the clinical usefulness of the system. Importantly, these changes can be performed by clinicians and do not require experts.
The present study showed that by simultaneously displaying the body surface image and the bone/tumor image from different examination sources, their positional relationships could be understood intraoperatively without strict alignment with the surgical field. Because our system can be flexibly changed for each case, we obtained useful findings even beyond body surface evaluation. In the future, the simultaneous display of images from different examination sources may enable simple navigation of deep organs with this system.
As a future prospect, if several operators can simultaneously reference and manipulate the AR system through cooperation among AR devices, communication and unity among operators will be promoted. The AR system can express stereoscopic data that are difficult to convey through language and 2D images, and various data, such as anatomical charts and examination data, can be displayed simultaneously. This will help in teaching surgical skills to residents and educating medical students.
We devised this AR system for evaluation of improvements of the body surface, which is important in plastic surgery, and constructed it to be easy to modify by combining existing devices, free software, and libraries. The present study confirmed through several clinical applications that AR technology is helpful for evaluation of the body surface. Further clinical trials are needed to identify problems and continue making improvements. Our findings are useful not only for body surface evaluation but also for the effective utilization of AR technology in the field of plastic surgery.
Patients provided written consent for the use of their images.
1. Azuma RT. A survey of augmented reality. Presence Teleop Virt. 1997;6:355–385.
2. Bajura M, Fuchs H, Ohbuchi R. Merging virtual objects with the real world: seeing ultrasound imagery within the patient. Comput Graph. 1992;26:203–210.
3. Sutherland IE. A head-mounted three dimensional display. In: Proceedings of the Fall Joint Computer Conference, 1968; San Francisco, CA.
4. Watanabe E, Watanabe T, Manaka S, et al. Three-dimensional digitizer (neuronavigator): new equipment for computed tomography-guided stereotaxic surgery. Surg Neurol. 1987;27:543–547.
5. Mischkowski RA, Zinser MJ, Kübler AC, et al. Application of an augmented reality tool for maxillary positioning in orthognathic surgery—a feasibility study. J Craniomaxillofac Surg. 2006;34:478–483.
6. Okamoto T, Onda S, Yanaga K, et al. Clinical application of navigation surgery using augmented reality in the abdominal field. Surg Today. 2015;45:397–406.
7. Zhu M, Chai G, Li L, et al. Effectiveness of a novel augmented reality-based navigation system in treatment of orbital hypertelorism. Ann Plast Surg. 2015;77:662–668.
8. Gerstle TL, Ibrahim AM, Kim PS, et al. A plastic surgery application in evolution: three-dimensional printing. Plast Reconstr Surg. 2014;133:446–451.
9. Bichlmeier C, Navab N. Virtual window for improved depth perception in medical AR. In: Proceedings of the International Workshop on Augmented Reality Environments for Medical Imaging and Computer-Aided Surgery (AMI-ARCS), 2006; München, Germany.
10. Choi H, Cho B, Masamune K, et al. An effective visualization technique for depth perception in augmented reality-based surgical navigation. Int J Med Robot. 2016;12:62–72.
11. Koban KC, Schenck TL, Giunta RE, et al. Using mobile 3D scanning systems for objective evaluation of form, volume, and symmetry in plastic surgery: intraoperative scanning and lymphedema assessment. In: Proceedings of the 7th International Conference on 3D Body Scanning Technologies, 2016; Lugano, Switzerland.
12. Li L, Yang J, Chu Y, et al. A novel augmented reality navigation system for endoscopic sinus and skull base surgery: a feasibility study. PLoS One. 2016;11:e0146996.
13. Suenaga H, Tran HH, Liao H, et al. Vision-based markerless registration using stereo vision and an augmented reality surgical navigation system: a pilot study. BMC Med Imaging. 2015;15:51.
14. Mahmoud N, Grasa ÓG, Nicolau SA, et al. On-patient see-through augmented reality based on visual SLAM. Int J Comput Assist Radiol Surg. 2017;12:1–11.
15. Kilgus T, Heim E, Haase S, et al. Mobile markerless augmented reality and its application in forensic medicine. Int J Comput Assist Radiol Surg. 2015;10:573–586.
16. Sielhorst T, Feuerstein M, Navab N. Advanced medical displays: a literature review of augmented reality. J Display Technology. 2008;4:451–467.
Copyright © 2017 The Authors. Published by Wolters Kluwer Health, Inc. on behalf of the American Society of Plastic Surgeons. All rights reserved.