Effective Application of Mixed Reality Device HoloLens

Simple Manual Alignment of Surgical Field and Holograms

Mitsuno, Daisuke, M.D.; Ueda, Koichi, M.D.; Hirota, Yuka, M.D.; Ogino, Mariko, M.D.

Plastic and Reconstructive Surgery: February 2019 - Volume 143 - Issue 2 - p 647–651
doi: 10.1097/PRS.0000000000005215
Plastic Surgery Focus: Ideas and Innovations

Summary: The technology used to add information to a real visual field is defined as augmented reality technology. Augmented reality technology that can interactively manipulate the displayed information is called mixed reality technology. The HoloLens from Microsoft, a head-mounted mixed reality device released in 2016, can stably display a precise three-dimensional model on the real visual field as a hologram. If the position and direction of the hologram could be accurately superimposed on the surgical field, surgical navigation-like use could be expected; however, the HoloLens has no such function. The authors devised a method that aligns the surgical field and holograms precisely within a short time using a simple manual operation. The mechanism matches three points on the hologram to the corresponding marked points on the body surface. Because any of the three points can be arbitrarily selected as the pivot or axis of the hologram's rotational movement, alignment by manual operation becomes very easy. The alignment between the surgical field and the hologram was good and thus contributed to intraoperative objective judgment. By using the method of this study, the clinical usefulness of the mixed reality device HoloLens will be expanded.

Osaka, Japan

From the Department of Plastic and Reconstructive Surgery, Osaka Medical College.

Received for publication March 8, 2018; accepted August 2, 2018.

Disclosure: The authors have no financial interest in any of the products or devices mentioned in this article. There was no internal or external financial support for this study.

Supplemental digital content is available for this article. Direct URL citations appear in the text; simply type the URL address into any Web browser to access this content. Clickable links to the material are provided in the HTML text of this article on the Journal’s website (www.PRSJournal.com).

A “Hot Topic Video” by Editor-in-Chief Rod J. Rohrich, M.D., accompanies this article. Go to PRSJournal.com and click on “Plastic Surgery Hot Topics” in the “Digital Media” tab to watch.

Daisuke Mitsuno, M.D., Department of Plastic and Reconstructive Surgery, Osaka Medical College, 2-7 Daigaku-cho, Takatsuki City, Osaka 569-8686, Japan, pla083@osaka-med.ac.jp

The technology used to add information, including three-dimensional models, to a real visual field is defined as augmented reality technology.1,2 Applications of this technology have flourished in recent years in the medical field, including plastic surgery.3–7 Augmented reality technology that can interactively manipulate the displayed information (by adding virtual reality-like8 elements) is called mixed reality technology.2,9–11

Various augmented reality/virtual reality/mixed reality devices are already available on the market because of technological innovation and low pricing.2,12,13 Among them, the HoloLens (Microsoft Corp., Redmond, Wash.), a head-mounted mixed reality device released in 2016, can stably display a precise three-dimensional model on the real visual field. Although it is a general-purpose (nonmedical) device, there are already many reports of its use in the medical field.14–16 In plastic surgery, Tepper et al. reported on its use for improving intraoperative judgment and workflow and also mentioned its potential for surgical education and remote communication.17

We earlier reported on the clinical use of an augmented reality device (Moverio BT-200; Epson, Tokyo, Japan) in plastic surgery.18 We also introduced the HoloLens immediately after its release in Japan and now actively use it clinically. Displaying the corresponding hologram close to the surgical field makes comparison and intraoperative judgment easier. Furthermore, if the position and direction of the hologram could be accurately superimposed on the surgical field, surgical navigation-like use (e.g., grasping the positional relationship between the body surface and a deep organ) or application as intraoperative simulation (e.g., more accurate comparison of body contour) could be expected.18

With existing software, however, it is impractical to align a hologram precisely to the surgical field on the HoloLens for two reasons: reference points for alignment do not exist, and the pivot of the hologram's rotational movement cannot be specified. We therefore created software that makes precise alignment possible within a short time using a simple manual operation. In addition, we evaluated its accuracy and tried it clinically.


MATERIALS AND METHODS

Before surgery, three points on the body surface near the surgical field are marked as references for alignment. Body surface data, including the marked points, are acquired with a handheld three-dimensional imaging system such as the VECTRA H1 (Canfield Scientific, Parsippany, N.J.). These data are superimposed onto the body surface data derived from computed tomographic data obtained previously for diagnosis. Holograms such as those of the body surface, bone, and blood vessels are then created while the relative positional relations among the datasets are maintained.
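
The article does not state how the handheld scan is superimposed onto the CT-derived body surface; one common choice for such a step is landmark-based rigid registration (the Kabsch algorithm). The sketch below, in Python with NumPy, is our own illustration under that assumption, not the authors' software; the function name rigid_register and all variable names are hypothetical.

```python
import numpy as np

def rigid_register(src_pts, dst_pts):
    """Least-squares rigid transform (rotation R, translation t) that maps
    src_pts onto dst_pts; both are (N, 3) arrays of corresponding landmarks."""
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)        # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Hypothetical usage: map the three marked landmarks of the handheld scan onto
# the same landmarks identified on the CT-derived surface, then apply the same
# rigid transform to every scan vertex so that all datasets share CT space.
# scan_landmarks, ct_landmarks: (3, 3) arrays; scan_vertices: (M, 3) array.
# R, t = rigid_register(scan_landmarks, ct_landmarks)
# scan_in_ct = scan_vertices @ R.T + t
```

Because the estimated transform is rigid, applying it to the handheld scan (rather than to the CT-space data) leaves the CT-derived holograms of the body surface, bone, and blood vessels in their original relative positions, consistent with the workflow described above.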


Mechanism of Alignment

In the given space, the triangle abc (corresponding to the three marked points on the hologram) is matched in position and orientation to the triangle ABC (corresponding to the three points marked on the patient's body surface) in three steps (Figs. 1 and 2; a geometric sketch of these steps follows the list below). The application was created by adding this algorithm to the basic hologram display programs contained in the official library for HoloLens application development.

[Fig. 1]

[Fig. 2]

  1. In the given space, move the whole triangle abc and match point a on the hologram to point A on the patient.
  2. With point a (= A) as the pivot, rotate abc and match point b to point B.
  3. With the edge a-b as the axis, rotate abc and match point c to point C.
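
These three steps amount to one translation followed by two rotations, each about a pivot or axis fixed by the points already matched. The following Python/NumPy sketch is our own geometric illustration of the steps, not the authors' HoloLens application (which was built with the official HoloLens development library); the names align_three_points and rot_about_axis are hypothetical, and congruent, nondegenerate triangles are assumed.

```python
import numpy as np

def rot_about_axis(axis, angle):
    """Rotation matrix for a rotation by `angle` radians about `axis`
    (Rodrigues' formula)."""
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle) * K + (1.0 - np.cos(angle)) * (K @ K)

def align_three_points(abc, ABC):
    """Return a function that maps hologram coordinates so that the hologram
    points a, b, c land on the patient points A, B, C, mirroring the three
    manual steps. Assumes the triangles are congruent and nondegenerate."""
    a, b, c = (np.asarray(p, dtype=float) for p in abc)
    A, B, C = (np.asarray(p, dtype=float) for p in ABC)

    # Step 1: translate the whole hologram so that a coincides with A.
    t1 = A - a
    b1, c1 = b + t1, c + t1

    # Step 2: rotate about the pivot A so that the direction A->b matches A->B.
    u, v = b1 - A, B - A
    axis = np.cross(u, v)
    angle = np.arctan2(np.linalg.norm(axis), np.dot(u, v))
    R2 = rot_about_axis(axis, angle) if np.linalg.norm(axis) > 1e-12 else np.eye(3)
    c2 = A + R2 @ (c1 - A)

    # Step 3: rotate about the axis A-B so that c falls onto C.
    n = (B - A) / np.linalg.norm(B - A)
    p = (c2 - A) - np.dot(c2 - A, n) * n   # A->c component perpendicular to the axis
    q = (C - A) - np.dot(C - A, n) * n     # A->C component perpendicular to the axis
    angle3 = np.arctan2(np.dot(np.cross(p, q), n), np.dot(p, q))
    R3 = rot_about_axis(n, angle3)

    R = R3 @ R2                            # both rotations share the pivot A
    def transform(points):
        pts = np.asarray(points, dtype=float) + t1     # step 1
        return (R @ (pts - A).T).T + A                 # steps 2 and 3 about A
    return transform

# Hypothetical usage: a, b, c are the marked hologram points and A, B, C the
# marked body-surface points (same units, e.g., millimetres); mesh is (M, 3).
# T = align_three_points([a, b, c], [A, B, C])
# aligned_mesh = T(mesh)
```

Applied to the hologram mesh, the returned transform reproduces the manual sequence: point a is fixed by step 1 and serves as the pivot for step 2, and the edge a-b fixed by step 2 serves as the axis for step 3.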

After alignment is completed, "switching/viewing" of holograms is performed as needed for the operator's purpose (Fig. 3). To evaluate the time required for alignment and the margin of error, we used a simple facial phantom and two three-layer models that we developed.19

[Fig. 3]


RESULTS

The results are shown in Table 1. The mean time required for alignment was 45.89 seconds, and the mean error was 2.98 mm. The alignment between the surgical field and the hologram was anatomically accurate and thus contributed to intraoperative objective judgment. (See Video, Supplemental Digital Content 1, which shows the mechanism of alignment, alignment of the hologram with the phantom, and then alignment with the actual surgical field, available in the “Related Videos” section of the full-text article on PRSJournal.com or, for Ovid users, at http://links.lww.com/PRS/D277.)

[Table 1]



DISCUSSION

Spatial recognition in the HoloLens is based on simultaneous localization and mapping,20 which gives holograms excellent display stability with respect to real space. However, there is no function to precisely detect the shape of a specific part of real space and accurately align a hologram with it.

Libraries that perform automatic alignment using algorithms for fiducial marker recognition or contour extraction have also been released, but they place a load on the HoloLens central processing unit and slightly impair its display stability. Manual alignment, by contrast, has the advantage that the simultaneous localization and mapping of the HoloLens is not impaired. Even when automatic positioning is possible, cases in which alignment errors remain must be assumed; for such cases, a mechanism that allows adjustment by a simple manual operation should be available.

The difference between this method and existing software is that any of the three marked points can be arbitrarily selected as the pivot or axis of the hologram's rotational movement. This difference makes manual alignment easy.

Although the HoloLens is far more advanced than conventional devices, it is still in development, and higher-performance devices are likely to become available in the future.9 It is already at a stage, however, where it can be used in the clinical setting with only a little ingenuity, making it possible to obtain otherwise difficult-to-determine information. The alignment error in this study was similar to that reported for other augmented reality systems in recent years.21–25 Although the system cannot yet be used for strict navigation, in which the positional relationship between the tip of an instrument moving in real time and the target organ is tracked, it is sufficiently accurate for navigation-like use or intraoperative simulation to assist objective judgment.


CONCLUSIONS

We devised a method to align the hologram to the surgical field precisely within a short time using a simple manual operation with the mixed reality device HoloLens. By using this method, the clinical usefulness of the HoloLens will be expanded.


REFERENCES

1. Azuma RT. A survey of augmented reality. Presence 1997;6:355–385.
2. van Krevelen DWF, Poelman R. A survey of augmented reality technologies, applications and limitations. Int J Virt Reality 2010;9:1–20.
3. Nicolau S, Soler L, Mutter D, et al. Augmented reality in laparoscopic surgical oncology. Surg Oncol. 2011;20:189–201.
4. Meola A, Cutolo F, Carbone M, et al. Augmented reality in neurosurgery: A systematic review. Neurosurg Rev. 2017;40:537–548.
5. Yoon JW, Chen RE, Kim EJ, et al. Augmented reality for the surgeon: Systematic review. Int J Med Robot. 2018;14:e1914.
6. Kim Y, Kim H, Kim YO. Virtual reality and augmented reality in plastic surgery: A review. Arch Plast Surg. 2017;44:179–187.
7. Khor WS, Baker B, Amin K, Chan A, Patel K, Wong J. Augmented and virtual reality in surgery: The digital surgical environment. Applications, limitations and legal pitfalls. Ann Transl Med. 2016;4:454.
8. Burdea GC, Coiffet P. Virtual Reality Technology. London: Wiley-Interscience; 1994.
9. Milgram P, Kishino F. A taxonomy of mixed reality visual displays. IEICE Trans Inf Syst. 1994;77:1321–1329.
10. Billinghurst M, Kato H. Collaborative mixed reality. In: Proceedings of the First International Symposium on Mixed Reality. Berlin: Springer-Verlag; 1999:261–284.
11. Tamura H, Yamamoto H, Katayama A. Mixed reality: Future dreams seen at the border between real and virtual worlds. IEEE Comput Graph Appl. 2001;21:64–70.
12. Azuma R, Baillot Y, Behringer R, Feiner S, Julier S, MacIntyre B. Recent advances in augmented reality. IEEE Comput Graph Appl. 2001;21:34–47.
13. Goldman Sachs Group. Virtual & augmented reality: The next big computing platform? Available at: http://www.goldmansachs.com/our-thinking/pages/virtual-and-augmented-reality-report.html. Accessed January 30, 2018.
14. Lia H, Paulin G, Yeo CT, et al. HoloLens in suturing training. In: Medical Imaging 2018: Image-Guided Procedures, Robotic Interventions, and Modeling. Bellingham, Wash.: International Society for Optics and Photonics; 2018:1057628.
15. Hanna MG, Ahmed I, Nine J, Prajapati S, Pantanowitz L. Augmented reality technology using Microsoft HoloLens in anatomic pathology. Arch Pathol Lab Med. 2018;142:638–644.
16. Kuhlemann I, Kleemann M, Jauer P, Schweikard A, Ernst F. Towards X-ray free endovascular interventions: Using HoloLens for on-line holographic visualisation. Healthc Technol Lett. 2017;4:184–187.
17. Tepper OM, Rudy HL, Lefkowitz A, et al. Mixed reality with HoloLens: Where virtual reality meets augmented reality in the operating room. Plast Reconstr Surg. 2017;140:1066–1070.
18. Mitsuno D, Ueda K, Itamiya T, Nuri T, Otsuki Y. Intraoperative evaluation of body surface improvement by an augmented reality system that a clinician can modify. Plast Reconstr Surg Glob Open 2017;5:e1432.
19. Ueda K, Hirota Y, Mitsuno D, et al. Three-dimensional, computer-assisted, three-layer models of the face. Plast Reconstr Surg. 2018;141:199e–200e.
20. Durrant-Whyte H, Bailey T. Simultaneous localization and mapping: Part I. IEEE Robot Autom Mag. 2006;13:99–110.
21. Badiali G, Ferrari V, Cutolo F, et al. Augmented reality as an aid in maxillofacial surgery: Validation of a wearable system allowing maxillary repositioning. J Craniomaxillofac Surg. 2014;42:1970–1976.
22. Qu M, Hou Y, Xu Y, et al. Precise positioning of an intraoral distractor using augmented reality in patients with hemifacial microsomia. J Craniomaxillofac Surg. 2015;43:106–112.
23. Wang H, Wang F, Leong AP, Xu L, Chen X, Wang Q. Precision insertion of percutaneous sacroiliac screws using a novel augmented reality-based navigation system: A pilot study. Int Orthop. 2016;40:1941–1947.
24. Mahmoud N, Grasa ÓG, Nicolau SA, et al. On-patient see-through augmented reality based on visual SLAM. Int J Comput Assist Radiol Surg. 2017;12:1–11.
25. Kilgus T, Heim E, Haase S, et al. Mobile markerless augmented reality and its application in forensic medicine. Int J Comput Assist Radiol Surg. 2015;10:573–586.


©2019 American Society of Plastic Surgeons