Neurosurgery: January 2013 - Volume 72
doi: 10.1227/NEU.0b013e31827235f8
Supervisory-Controlled Systems

Evaluation of a Completely Robotized Neurosurgical Operating Microscope

Kantelhardt, Sven R. MD*; Finke, Markus PhD‡; Schweikard, Achim PhD‡; Giese, Alf MD*


Author Information

*Department of Neurosurgery, Johannes Gutenberg-University Mainz, Mainz, Germany

‡Institute for Robotics and Cognitive Systems, University of Lübeck, Lübeck, Germany

Correspondence: Sven R. Kantelhardt, MD, Department of Neurosurgery, Johannes Gutenberg-University Mainz, Langenbeckstraße 1, 55101 Mainz, Germany. E-mail: sven.kantelhardt@unimedizin-mainz.de

Received June 11, 2012

Accepted August 23, 2012


Abstract

BACKGROUND: Operating microscopes are essential for most neurosurgical procedures. Modern robot-assisted controls offer new possibilities, combining the advantages of conventional and automated systems.

OBJECTIVE: We evaluated the prototype of a completely robotized operating microscope with an integrated optical coherence tomography module.

METHODS: A standard operating microscope was fitted with motors and control instruments, with the manual control mode and balance preserved. In the robot mode, the microscope was steered by a remote control that could be fixed to a surgical instrument. External encoders and accelerometers tracked microscope movements. The microscope was additionally fitted with an optical coherence tomography-scanning module.

RESULTS: The robotized microscope was tested on model systems. It could be freely positioned, without forcing the surgeon to take the hands from the instruments or avert the eyes from the oculars. Positioning error was about 1 mm, and vibration faded in 1 second. Tracking of microscope movements, combined with an autofocus function, allowed determination of the focus position within the 3-dimensional space. This constituted a second loop of navigation independent from conventional infrared reflector-based techniques. In the robot mode, automated optical coherence tomography scanning of large surface areas was feasible.

CONCLUSION: The prototype of a robotized optical coherence tomography-integrated operating microscope combines the advantages of a conventional manually controlled operating microscope with a remote-controlled positioning aid and a self-navigating microscope system that performs automated positioning tasks such as surface scans. This demonstrates that, in the future, operating microscopes may be used to acquire intraoperative spatial data, volume changes, and structural data of brain or brain tumor tissue.

ABBREVIATIONS: OCT, optical coherence tomography; PC, personal computer; PLC, programmable logic controller

Since the pioneering works of Kurze, Yaşargil, Perneczky, and others,1,2 operating microscopes have become an indispensable tool for neurosurgical procedures. Technically, operating microscopes have undergone a striking evolution since their introduction into the neurosurgical operating theater in 1957.1 The first microscopes were fixed to a simple stand that did not allow complex movements of the microscope head as required during neurosurgical procedures. Therefore, between 1967 and 1972, Yaşargil, in cooperation with the Zeiss company, developed a prototype that allowed free movement of the microscope head along 6 axes. The instrument was fitted with magnetic brakes and could be operated by a mouth switch so that the hands could remain in the surgical field. After incorporation of a beam splitter with a photo/video camera port, Zeiss introduced the microscope to the market under the name NC 1. Further developments included the introduction of an electronic balance control and optimized light sources. Features added more recently include the integration of operating microscopes into navigation systems, which now allow tracking of the position of the microscope and image guidance of the trajectory and field of view of the microscope by 3-dimensional (3D) computed tomography (CT) or magnetic resonance imaging data sets.3,4 Furthermore, adaptation of the optical path of the microscope to accommodate ultraviolet illumination for excitation of protoporphyrin IX fluorescence resulting from the conversion of 5-aminolevulinic acid by glioma cells5 and fluorescence video indocyanine green angiography6 has extended the use of the microscope to first applications in the analysis of tissue properties and function. Experimentally, integration of optical coherence tomography (OCT) into operating microscopes provides a real-time tomographic 3-D image of tissues, which allows analysis of microstructure, light attenuation properties, temperature, and blood flow velocities.7 Several companies commercially offer partially motorized operating microscopes (eg, Zeiss OPMI Neuro/NC4, Leica M520 MC1, or Möller-Wedel HI-R1000) to facilitate easy handling or semiautomated tracking of target positions of navigation-registered instruments. These microscopes allow restricted automated movements along 2 axes by motors integrated into the head of the microscope. In 1993, Zeiss experimentally introduced the MKM system, a robotized arm system that guided several instruments, including a microscope head.

In this article, we describe a prototype of a fully robotized operating microscope. It was important to us to maintain the conventional (automated) balance of the microscope and the manual mode of conventional positioning. This would not only allow continuation of surgery in case of a technical problem but also enable the surgeon to take manual control during surgery whenever he/she preferred. Furthermore, this prototype was fitted with an integrated OCT camera for experimental real-time imaging of the 3-D tissue structure. Video image analysis and robot-controlled positioning of the microscope were used for automated OCT scanning of contiguous surface areas such as resection cavities. Here, we demonstrate that robotized operating microscopes may be used to acquire spatial data of surface positions and tissue microstructure.


MATERIALS AND METHODS

Robotized Operating Microscope

The robotized microscope was based on a modification of a commercially available Hi-R 1000 operating microscope (Möller-Wedel GmbH, Wedel, Germany). The microscope moves around 7 axes: 6 that are motorized and 1 that moves passively. Two axes of the microscope head are motorized in the commercially available microscope (DC servomotors 3056 and 2444 fitted with an HFUC14 harmonic drive gear; both Faulhaber Group, Schönaich, Germany), and the other 4 axes are motorized with 3 ST5709 stepper motors and 1 ST5918 stepper motor (both Nanotec Electronic GmbH & Co KG, Pliening, Germany; Figure 1) fitted with harmonic drive gearboxes (CPU-M and CPU-H series, respectively), reducing the possible movement steps to 0.018° per step or less. This allows free and nearly continuous automated movement along the x, y, and z axes. The conventional balance and manual mode of movement are maintained because the clutches in the motor gearboxes are released simultaneously when the surgeon releases the magnetic brakes of the microscope by pushing a button on the microscope handles. When operated in the robot mode, the hardware components are driven by a programmable logic controller (PLC; Beckhoff CX1100, Beckhoff Automation GmbH, Verl, Germany) embedded in the steering personal computer (PC) of the commercially available microscope model. The PLC not only drives the stepper motor terminals and allows an external steering PC (Intel Core i5) access to information processed by the internal steering PC, such as illumination and lens aperture, but also ensures smooth accelerations and decelerations of the microscope during automated movements.8
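As an illustration of the step resolution quoted above, the output step angle follows directly from the motor's full-step angle divided by the gearbox reduction. The short sketch below assumes a typical 1.8° full-step stepper and a 100:1 harmonic drive ratio, values chosen only because they reproduce the 0.018° figure; the actual motor and gear specifications are not restated here.

# Illustrative only: the 1.8-degree full step and the 100:1 reduction are assumed
# typical values, not specifications taken from the prototype.
full_step_deg = 1.8      # common full-step angle of a 2-phase stepper motor
gear_reduction = 100     # assumed harmonic drive reduction ratio
microsteps = 1           # full stepping; microstepping would reduce this further

step_at_output = full_step_deg / (gear_reduction * microsteps)
print(f"angular resolution per step: {step_at_output:.3f} degrees")  # 0.018 degrees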

Tracking of Microscope Position and Navigation

Position control of each stepper motor was achieved by a type AD36 external encoder (Hengstler GmbH, Aldingen, Germany), which feeds back the current position of each motorized axis to an external control PC (Intel Core i5) at an accuracy of 0.0007°. Additionally, a linear accelerometer (LIS3LV02DQ; STMicroelectronics) integrated into the microscope head close to the front lens acted as an external encoder for the servomotors integrated into the standard microscope head. The encoders were not connected to the manual brakes, which allowed the control PC to receive information continuously on the current microscope position even in the manual positioning mode, which overrides the robot control. For the experiments presented here, the microscope was also tracked by a conventional infrared reflector-based navigation system (accuTrack; Atracsys, Renens, Switzerland) to compare the 2 modes of determining the microscope position.8
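How encoder readings translate into a microscope (and focus) position in 3-D space can be illustrated with a small forward-kinematics sketch. The joint axes, link offsets, and working distance below are placeholders chosen for illustration only; the actual kinematic model of the arm is described in reference 9.

import numpy as np

# Minimal forward-kinematics sketch: compose one rotation and one link offset per
# motorized axis, then project the focus point along the optical axis of the last
# frame. All geometric parameters here are placeholders, not the real arm model.

def rot_z(theta_deg):
    t = np.radians(theta_deg)
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0, 0],
                     [s,  c, 0, 0],
                     [0,  0, 1, 0],
                     [0,  0, 0, 1]])

def translate(x, y, z):
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

def focus_position(joint_angles_deg, link_offsets_mm, working_distance_mm):
    T = np.eye(4)
    for angle, offset in zip(joint_angles_deg, link_offsets_mm):
        T = T @ rot_z(angle) @ translate(*offset)
    focus_local = np.array([0.0, 0.0, working_distance_mm, 1.0])
    return (T @ focus_local)[:3]

# Example: 6 encoder readings (degrees) and placeholder link offsets (mm).
angles = [12.5, -30.0, 45.0, 0.0, 10.0, 5.0]
links = [(0, 0, 300), (0, 0, 250), (200, 0, 0), (0, 0, 150), (50, 0, 0), (0, 0, 80)]
print(focus_position(angles, links, working_distance_mm=260.0))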

Remote Control

All motors of the microscope may be activated by a remote control that communicates by a Bluetooth module (F2M03ALA, Free2moveM) with an external PC (Intel Core i5) running Windows XP. This PC integrated all information provided by the PLC and accelerators embedded in the microscope using software specially designed for this task by the Institute for Robotics and Cognitive Systems at the University of Lübeck. Additionally, it processed the steering information provided via the remote control and enacted the microscope movements via the PLC. The calculating process (kinematic analysis, forward and inverse kinematics) on which this program was based was described previously.9 The remote control itself was equipped with a joystick for 2-dimensional translation and pivot movements and a button that activated a menu to switch between control of movements, focus, and zoom (Figure 2A).
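The control logic of the remote, one joystick plus a single button cycling through movement, focus, and zoom modes, can be sketched as follows. The names, data structures, and command format are illustrative assumptions, not the actual firmware of the prototype.

from enum import Enum

# Hedged sketch of the remote-control logic: a joystick gives a 2-D deflection and
# one button cycles through the control modes described above.

class Mode(Enum):
    TRANSLATE = 0   # translational movements
    PIVOT = 1       # pivot movements (focus point and working distance maintained)
    ZOOM = 2        # zoom / focus adjustment

def next_mode(mode: Mode) -> Mode:
    """A single button press cycles to the next mode."""
    return Mode((mode.value + 1) % len(Mode))

def command(mode: Mode, joy_x: float, joy_y: float) -> dict:
    """Map a joystick deflection in [-1, 1] to an abstract motion command."""
    if mode is Mode.TRANSLATE:
        return {"type": "translate", "dx": joy_x, "dy": joy_y}
    if mode is Mode.PIVOT:
        return {"type": "pivot", "d_azimuth": joy_x, "d_elevation": joy_y}
    return {"type": "zoom", "dz": joy_y}   # zoom uses only the y axis in this sketch

mode = Mode.TRANSLATE
print(command(mode, 0.4, -0.2))
mode = next_mode(mode)
print(command(mode, 0.0, 1.0))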

Visualization of Electronically Generated Images

For superposition of optical and electronic images generated by the external control PC or the navigation system, an image-injection module was used (Light Router LR1000i; Möller-Wedel, Germany). This allowed an overlay of optical images provided by the optical system of the microscope with information processed or generated by the external PC (Figure 2B).

OCT Imaging

The robotized microscope was equipped with a prototype spectral-domain OCT-scanning module that was connected to the camera port of the operating microscope by a specially designed scanning optic. This way, the OCT scanner is focused via the microscope at a working distance of 232 to 290 mm. Axial resolution is 11 μm, and a second mirror provides a second scanning axis, so that a small field in the center of the field of vision of the microscope can be scanned by the OCT module. A LightRouter (Möller-Wedel, Wedel, Germany) integrated into the optical path of the microscope allows injection and overlay of electronic images (from OCT, image guidance, or other sources) in the surgeon’s field of view.8

Models for Experimental Imaging

We used a porcine cadaver model to simulate a spinal operation, testing the remote control and the visual augmentation of movement control of the robotized microscope by image injection into the light router of the microscope. To simulate applications in cranial surgery, a phantom fitted with a Styrofoam matrix with embedded glass markers simulating a surgical target was used. CT scanning of the phantom was performed with a conventional 16-slice CT scanner (Aquilion; Toshiba Medical Systems), and images at 1.0-mm slice thickness were reconstructed with the highest-resolution (bone window) algorithm. The CT data were used for neuronavigation. Images were exported as .jpeg files.

Automated Scanning of Surface Areas

OCT scanning of larger surface areas was performed by automated movement of the microscope following a grid placed over the area of interest. In each individual OCT scan, the surface was determined by its strong contrast based on the intensity gradient of each A scan.

Then the movement of the microscope to the adjacent scan position was determined. Automated tracking of the movements alone, however, could not be used for this purpose because the positioning error of about 1 mm was considered too large; recording of the translational and rotatory movements of the microscope allowed only rough prepositioning of the individual scans. A more accurate determination of the translatory motion was therefore based on the microscopic images themselves. Because the resolutions of the operating microscope and the OCT module differed, the OCT data first had to be preprocessed to correct position and scale. An iterative closest point algorithm was then used to merge the individual scans into a single 3-D model of the scanned volume, which was finally adjusted manually to reduce remaining noise in the merged scan if required.8
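As a rough illustration of the surface-detection step, each A scan can be reduced to the depth index at which the intensity gradient is largest, marking the air-tissue boundary. The array shapes, gradient threshold, and simulated data below are assumptions for illustration only.

import numpy as np

# Sketch: in each A scan, take the depth with the strongest intensity rise as the
# tissue surface. Shapes and the gradient threshold are illustrative assumptions.

def detect_surface(volume, min_gradient=0.5):
    """volume: OCT intensities shaped (nx, ny, depth); returns surface depth indices."""
    grad = np.diff(volume, axis=-1)           # intensity gradient along depth
    surface = grad.argmax(axis=-1)            # strongest rise marks the surface
    weak = grad.max(axis=-1) < min_gradient   # mask A scans without a clear boundary
    return np.where(weak, -1, surface)

# Toy example: a 4 x 4 grid of A scans with 128 depth samples each and a
# simulated bright tissue layer starting at depth index 60.
rng = np.random.default_rng(0)
vol = rng.random((4, 4, 128)) * 0.1
vol[..., 60:] += 5.0
print(detect_surface(vol))   # reports ~59 for every A scan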


RESULTS

Remote Control

The remote control of the completely robotized operating microscope could be attached to a surgical instrument (Figure 2A). It featured a joystick for 2-dimensional movement and a button for switching through a menu to select the active mode. The menu allowed control of pivot movements (focus and working distance maintained; Figure 3A), translational movements (Figure 3B), and zoom (Figure 3C), allowing free positioning of the microscope within a 3-D space.

Orientation During Automated Microscope Movements

Maximum technical velocity of the robotized microscope was 19.5° per second. However, for safety reasons, the velocity during surgical applications was restricted to a maximum of 4° per second. Furthermore, the maximal velocity was coupled to the zoom factor, ranging from 0.25° to 4° per second. Initiation of microscope movements followed a velocity ramp: depending on the duration of the remote activation, movements were accelerated. However, because the orientation of the joystick axes changes with the angle of the instrument in use, the movement resulting from joystick activation was difficult for the surgeon to anticipate before the actual movement. Therefore, the vector of a movement initiated by the joystick was injected into the microscope field of view as a red line visualizing the vector of the movement in real time (Figure 2B). Once initiated, the direction of microscope movement could be corrected by altering the direction of joystick activation without aborting the movement. This allowed the neurosurgeon to follow the movements of the field of view observed through the oculars and to correct the direction and extent of microscope movements as needed. To assist with menu selection and activation, the current mode (translation/pivot/zoom) was also displayed in the field of view.
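The speed limiting and ramping behavior described above can be summarized in a short sketch. The linear ramp, its duration, and the exact zoom-to-speed mapping are assumptions; only the 19.5° per second technical maximum and the 0.25° to 4° per second clinical range are taken from the text.

# Illustrative velocity profile: the speed limit scales with the zoom factor
# (0.25-4 deg/s as stated above) and ramps up over the duration of activation.
# The linear ramp, RAMP_TIME, and the 1/zoom mapping are assumptions.

V_TECH_MAX = 19.5      # technical maximum, deg/s (never used clinically)
V_CLINICAL_MAX = 4.0   # restricted maximum during surgical applications, deg/s
V_MIN = 0.25           # lower bound at high magnification, deg/s
RAMP_TIME = 1.0        # assumed time (s) to reach the current speed limit

def speed_limit(zoom_factor: float) -> float:
    """Higher magnification -> smaller visible field -> lower speed limit."""
    return max(V_MIN, min(V_CLINICAL_MAX, V_CLINICAL_MAX / zoom_factor))

def commanded_velocity(activation_time_s: float, zoom_factor: float) -> float:
    """Ramp linearly from 0 to the zoom-dependent limit while the joystick is held."""
    ramp = min(1.0, activation_time_s / RAMP_TIME)
    return ramp * speed_limit(zoom_factor)

print(commanded_velocity(0.5, zoom_factor=1.0))   # 2.0 deg/s, half-way up the ramp
print(commanded_velocity(2.0, zoom_factor=8.0))   # 0.5 deg/s at high zoom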

Accuracy of Automated Microscope Positioning

To test the accuracy and reproducibility of robot-assisted microscope positioning, 4 targets were programmed. The microscope was automatically directed to the consecutive targets, which had to be brought into the center of the microscopic field and focused. This test was repeated 20 times, and the maximal deviation from the predefined targets was recorded. The positioning error at the test targets was < 1 mm. Vibrations of the operating microscope caused by the microscope motors and brakes faded in < 1 second and did not interfere with the workflow.

Safety Issues

The robot control of the microscope motors was designed to allow free manual positioning of the microscope at any time. Activation of one of the brake release buttons on the microscope handles released the microscope brakes, immediately interrupted any automated movement, and suspended the authority of the remote control over the microscope. Furthermore, in case of loss of power or technical errors, the clutches in the gearboxes of the motors were immediately unlocked, and the brake system and conventional balance of the microscope remained active, which prevented uncontrolled microscope movements. As in standard operating microscopes, manual positioning remained possible by pressing the brake release buttons on the microscope handles.

Robot-Assisted Microscope Control in a Laminotomy Model

The robot-assisted microscope was tested by performing a laminotomy and diskectomy in a porcine cadaver model (Figure 4). After manual positioning of the microscope at the operating table, all positioning and movements of the microscope during the procedure were initiated and controlled only by the remote control attached to the surgical suction device. All focusing and adjustments of the zoom factor were performed with the remote control. At no point did the surgeon have to interrupt the workflow by removing instruments from the operating field or taking the eyes off the oculars of the microscope. Control over the robot-assisted microscope by the remote control was found to be intuitive. The need for sequential activation of translational and pivot movements, however, was identified as a potential drawback.

Tracking of Microscope Movements

A robotized operating microscope in concept not only facilitates movements of the microscope but also may use the position and velocity of the robot to acquire intraoperative position and volume data about a surgical target. To test this, we used a cranial phantom containing a Styrofoam matrix and glass ball markers simulating surgical targets. First, the position of the phantom was registered by conventional image guidance using surface registration with a pointer and then by surface registration using the autofocus position of the microscope. Then a craniotomy was performed, and the Styrofoam matrix was resected layer by layer to expose the surgical targets (glass markers). The progress of surface position changes was recorded by tracking of microscope movements and consecutive autofocus positions. By these means, the autofocus positions could be determined within the 3-D space at any time. These position data were then superimposed on the sagittal, coronal, and transversal reconstructions and a 3-D model of the initial CT scan (Figure 5). This technique constituted a second loop of navigation, which was completely independent of conventional infrared reflector-based techniques.
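Superimposing a tracked autofocus position on the CT reconstructions amounts to applying the registration transform and converting the result to voxel indices. The sketch below uses a rigid registration matrix, voxel spacing, and volume origin that are purely illustrative; in practice the transform comes from the phantom (patient) registration step.

import numpy as np

# Sketch: map an autofocus position (microscope/world coordinates, mm) into CT
# voxel indices for overlay on the reconstructions. The registration matrix,
# voxel spacing, and origin below are placeholders.

def focus_to_voxel(focus_mm, T_ct_from_scope, voxel_size_mm, ct_origin_mm):
    """Apply a rigid registration transform and convert millimeters to voxel indices."""
    p = T_ct_from_scope @ np.append(focus_mm, 1.0)    # homogeneous transform
    return np.round((p[:3] - ct_origin_mm) / voxel_size_mm).astype(int)

T = np.eye(4)                          # identity rotation, for illustration only
T[:3, 3] = [120.0, 80.0, 40.0]         # assumed translation between the two frames (mm)
focus = np.array([10.0, -5.0, 250.0])  # tracked autofocus position (mm)
print(focus_to_voxel(focus, T,
                     voxel_size_mm=np.array([0.5, 0.5, 1.0]),
                     ct_origin_mm=np.zeros(3)))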

Automated Scanning of Surface Areas by Microscope-Integrated OCT

Automated movement control in combination with position tracking may conceptually be used for automated scanning of tissue volumes and microstructural analysis of, for example, resection edges at different stages of surgery (Figure 6). Therefore, we used the robotized microscope to scan consecutive surface areas with a 3-D OCT module integrated into the light router of the microscope,7 which covers a square surface area of about 1 mm² per scan. From these data, based on the light attenuation of scattered light,10 a surface model of the scanned tissue volume was reconstructed. B-scan tomographic images (Figure 6) can be reconstructed to build a 3-D model of the tissue volume (Figure 7). To obtain these tomographic images, it is essential that the optical axes of both the microscope and the OCT scanner remain perpendicular to the surface of the scanned tissue. The control PC of the prototype microscope was used to suggest microscope positions that provide optimal scan angles to cover the regions of interest. Because the exact alignment of the tomographic images cannot be based on the position data of the robotized microscope alone, the images have to overlap to allow image fusion, which adds the spatial data.9 The virtual areas scanned were injected into the oculars of the microscope, and marker dots were used to propose focus positions for scans of adjacent areas (Figure 7A). Image injection of scan positions and recording of the exact location of the scan position could also be performed when the microscope was moved manually. These data were used to reconstruct a 3-D model of the upper layers of surface tissues (Figure 7B).
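Planning the consecutive, overlapping scan positions over a region of interest can be sketched as a simple grid computation. The roughly 1 mm² field per scan is taken from the text; the overlap fraction and region dimensions are assumptions for illustration.

import numpy as np

# Sketch: place overlapping ~1 x 1 mm scan fields over a rectangular region of
# interest so that neighboring scans can later be fused (see reference 9). The
# 30% overlap and the region size are illustrative assumptions.

def plan_scan_grid(roi_width_mm, roi_height_mm, scan_size_mm=1.0, overlap=0.3):
    step = scan_size_mm * (1.0 - overlap)          # centre-to-centre spacing
    xs = np.arange(scan_size_mm / 2, roi_width_mm, step)
    ys = np.arange(scan_size_mm / 2, roi_height_mm, step)
    return [(float(x), float(y)) for y in ys for x in xs]

centres = plan_scan_grid(5.0, 3.0)
print(len(centres), "scan positions; first three:", centres[:3])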


DISCUSSION

Robots today have found applications in neurosurgical practice for guiding instruments along predefined trajectories or providing physical guidance during stereotactic procedures in the brain11 and in spinal operations.12,13 Furthermore, robotized C-arms are somewhat established as tools in intraoperative fluoroscopy.14 Experimental applications include positioning of stimulators for transcranial magnetic stimulation15 and robotized brain retractors.11

The first robotized operating microscope for neurosurgical applications was the MKM system introduced by Zeiss in 1993. The system consisted of a robot arm holding different tools, including a microscope head. The working radius and the dynamics of the MKM were relatively restricted, so it was mainly applied for procedures of the frontal skull base. In 1995, Giorgi and colleagues16 attempted another robotized solution for an operating microscope. They attached a microscope head (Möller-Wedel VM 500) to an industrial robot arm. The microscope could be directed by a joystick placed where conventional microscope handles are typically located. In a second generation of the device, the same authors integrated 3 synchronized charge-coupled device cameras around the microscope's front lens. This allowed tracking of infrared markers in the surgical field. In that report, the authors proposed the use of accelerometers as a second loop for continuous tracking of the navigation accuracy. As a mode of robot control, the authors discussed the possibility of a mouth-operated joystick to allow an uninterrupted workflow for the surgeons, keeping the hands free for work in the operating field.17 This issue raised by Giorgi and colleagues addresses the important question of how to optimally control the movements of the operating microscope without interrupting the surgical workflow. Although the joystick control used by Giorgi et al or the handles of conventional operating microscopes allow good control and intuitive maneuvering, the surgeon is forced to use 1 or both hands to reposition the microscope, which also requires the surgeon to temporarily hand the surgical instruments to the nurse. Possible solutions were offered by mouth switch-operated controls for conventional balanced operating microscopes, already proposed by Yaşargil, by footpad switches, and by voice controls, which have been successfully implemented as a control system for a robot arm that guided an endoscope.18 Later, a mouth-operated joystick was also developed by Giorgi et al.17 These strategies, however, were associated with some disadvantages: the concepts, modes, and handling of such controls differ significantly from the modalities familiar to surgeons used to working with conventional operating microscopes. Voice recognition and voice control, moreover, have not yet been applied to commercial neurosurgical operating microscopes.

Exploring solutions for these problems, we developed a wireless remote control using a joystick and a button-controlled menu structure for control of microscope movements, focus, and zoom. The remote control is designed to attach to a surgical aspirator or other instruments, eliminating the need for the surgeon to remove the instrument from the operating field when moving the operating microscope. Furthermore, the control remains hand operated, a mode that is intuitively familiar to neurosurgeons. However, because the axis of the joystick relative to the surgical field changes with the orientation of the instrument to which the control is attached, it is difficult for the surgeon to anticipate the exact vector of a movement initiated by pushing the joystick. Two technical additions were required to allow exact control of the microscope. First, the direction of microscope movement resulting from joystick activation was visualized by image injection into the visual field of the microscope. Second, the direction of microscope movements could be changed in real time by modulating the direction of the joystick activation, which allowed corrections while the microscope was moving and made the control significantly more intuitive.

Robotizing a conventional balanced operating microscope, in contrast to using robot arm systems,16,17,19 offers the advantage that manual control of the instrument can be taken at any time, instantly converting the system back into a conventional microscope. This is not only more convenient for neurosurgeons who may want to use conventional manual control for gross repositioning of the microscope but also an important safety issue. In case of technical failure or loss of power, automated movements of the microscope can be interrupted at any time by simply taking the microscope handle and activating the brake release button. The microscope, in contrast to a robot arm system, remains in a stable balanced position. Robot control resumes only after specific activation. The restriction of the maximum velocity of the robotized microscope movements presents another important safety feature. It not only allows correction and adjustment of movements but also allows the surgeon to keep pace with the moving oculars of the microscope.

Because the position information from the accelerometers and the external encoders, combined with the zoom and focus position, allows determination of the focus point of the microscope within a 3-D space, robotized microscopes may be used to determine the dynamic changes of the surface of the operating field by continuously acquiring intraoperative spatial data using the autofocus. This may allow tracking of volume changes such as creation of the resection cavity, which again may be used to update external navigation data during resection. Eventually, this would allow visualization of the growing resection cavity on the navigation device. In addition, the position data from the microscope constitute a second loop of navigation and could be used to check the accuracy of the external navigation.

Furthermore, our robotized microscope was able to move to predefined or acquired targets, performing complex patterns of movements to reach the target. As demonstrated in our experiments, robotized microscopes may be able to reach targets and scan surface areas automatically. This can, for example, be used for OCT-based tissue analysis by combining the navigation data of the robot with spatial data derived from fusion of the acquired images. We previously integrated an OCT scanner into a prototype operating microscope7 and demonstrated its value in differentiating tumor from adjacent brain tissue.10 The applicability of the technique, however, was restricted by the relatively small area of each single OCT scan. Robotization of the microscope now provides the means for systematic and thorough scanning of a complex surface such as a resection cavity. This may lead to systems that perform independent volume and structural analyses of the operating field.


CONCLUSION

The prototype of a completely robotized OCT-integrated operating microscope presented in this study combines the advantages of a conventional manually controlled and autobalanced operating microscope with a remote-controlled positioning aid and a self-navigating microscope system that performs automated positioning tasks and scanning operations of surface areas. This demonstrates that, in the future, operating microscopes may be used to acquire intraoperative spatial data, volume changes of the operating sites, and structural and possibly functional data of brain and brain tumor tissue. However, the real benefits of the technique have yet to be quantified in the operating room and compared with conventional operating microscopes.

Disclosures

This work was supported by the “E-Region Schleswig-Holstein Plus” program of the Ministry of Science, Economy and Traffic; the “Innovation Funds Schleswig-Holstein”; and the European Union through the European Regional Development Fund. This project was carried out in cooperation with Möller-Wedel GmbH, Wedel, Germany, and IBG Technology. The authors have no personal financial or institutional interest in any of the drugs, materials, or devices described in this article.


REFERENCES

1. Liu CY, Spicer M, Apuzzo ML. The genesis of neurosurgery and the evolution of the neurosurgical operative environment, part II: concepts for future development, 2003 and beyond. Neurosurgery. 2003;52(1):20–33.

2. Grotenhuis JA, Cohen AR. Axel Perneczky: a remembrance. Neurosurgery. 2010;66(6):1036–1038.

3. Kelly PJ, Kall BA, Goerss S, Earnest F 4th. Computer-assisted stereotaxic laser resection of intra-axial brain neoplasms. J Neurosurg. 1986;64(3):427–439.

4. Roessler K, Ungersboeck K, Aichholzer M, et al. Image-guided neurosurgery comparing a pointer device system with a navigating microscope: a retrospective analysis of 208 cases. Minim Invasive Neurosurg. 1998;41(2):53–57.

5. Stummer W, Stepp H, Möller G, Ehrhardt A, Leonhard M, Reulen HJ. Technical principles for protoporphyrin-IX-fluorescence guided microsurgical resection of malignant glioma tissue. Acta Neurochir (Wien). 1998;140(10):995–1000.

6. Raabe A, Nakaji P, Beck J, et al. Prospective evaluation of surgical microscope-integrated intraoperative near-infrared indocyanine green videoangiography during aneurysm surgery. J Neurosurg. 2005;103(6):982–989.

7. Giese A, Böhringer HJ, Leppert J, Kantelhardt SR, et al. Non-invasive intraoperative optical coherence tomography of the resection cavity during surgery of intrinsic brain tumors. Prog Biomed Opt Imaging. 2006;7(1):60782Z.1–60782Z.8.

8. Finke M, Kantelhardt S, Schlaefer A, et al. Automatic scanning of large tissue areas in neurosurgery using optical coherence tomography. Int J Med Robot. 2012;8(3):327–336.

9. Finke M, Schweikard A. Motorization of a surgical microscope for intra-operative navigation and intuitive control. Int J Med Robot. 2010;6(3):269–280.

10. Böhringer HJ, Boller D, Leppert J, et al. Time-domain and spectral-domain optical coherence tomography in the analysis of brain tumor tissue. Lasers Surg Med. 2006;38(6):588–597.

11. Louw DF, Fielding T, McBeth PB, Gregoris D, Newhook P, Sutherland GR. Surgical robotics: a review and neurosurgical prototype development. Neurosurgery. 2004;54(3):525–536.

12. Kantelhardt SR, Martinez R, Baerwinkel S, Burger R, Giese A, Rohde V. Perioperative course and accuracy of screw positioning in conventional, open robotic-guided and percutaneous robotic-guided, pedicle screw placement. Eur Spine J. 2011;20(6):860–868.

13. Lieberman IH, Togawa D, Kayanja MM, et al. Bone-mounted miniature robotic guidance for pedicle screw and translaminar facet screw placement, part I: technical development and a test case result. Neurosurgery. 2006;59(3):641–650.

14. Binder N, Matthäus L, Burgkart R, Schweikard A. A robotic C-arm fluoroscope. Int J Med Robot. 2005;1(3):108–116.

15. Kantelhardt SR, Fadini T, Finke M, et al. Robot-assisted image-guided transcranial magnetic stimulation for somatotopic mapping of the motor cortex: a clinical pilot study. Acta Neurochir (Wien). 2010;152(2):333–343.

16. Giorgi C, Eisenberg H, Costi G, Gallo E, Garibotto G, Casolino DS. Robot-assisted microscope for neurosurgery. J Image Guid Surg. 1995;1(3):158–163.

17. Giorgi C, Sala R, Riva D, Cossu A, Eisenberg H. Robotics in child neurosurgery. Childs Nerv Syst. 2000;16(10-11):832–834.

18. Wapler M, Bräucker M, Dürr M, Hiller A, Stallkamp J, Urban V. A voice-controlled robotic assistant for neuroendoscopy. Stud Health Technol Inform. 1999;62:384–387.

19. Lauer W, Esser M, Radermacher K. Development of a compact, semi-robotic platform for an electronic surgical microscope [in German]. Biomed Tech (Berl). 2002;47(suppl 1) (pt 1):6–8.

Keywords:

Brain tumor; Neuronavigation; Operating microscope; Optical coherence tomography; Robot-assisted surgery

Copyright © by the Congress of Neurological Surgeons
