Open Access 11.09.2019 | Original Article

CIGuide: in situ augmented reality laser guidance

Authors: Zoltán Bárdosi, Christian Plattner, Yusuf Özbek, Thomas Hofmann, Srdjan Milosavljevic, Volker Schartinger, Wolfgang Freysinger

Published in: International Journal of Computer Assisted Radiology and Surgery | Issue 1/2020

Abstract

Purpose 

A robotic intraoperative laser guidance system with hybrid optic-magnetic tracking for skull base surgery is presented. It provides in situ augmented reality guidance for microscopic interventions at the lateral skull base with minimal mental overhead and workload for surgeons, who work without a monitor and without dedicated pointing tools.

Methods 

Three components were developed: a registration tool (Rhinospider), a hybrid magneto-optic-tracked robotic feedback control scheme and a modified robotic end-effector. Rhinospider optimizes the registration of patient and preoperative CT data by excluding user errors in fiducial localization with magnetic tracking. The hybrid controller uses the microscope's integrated HD camera for robotic control, with a guidance beam shining on a dual plate setup that avoids magnetic field distortions. A robotic needle insertion platform (iSYS Medizintechnik GmbH, Austria) was modified to position a laser beam with high precision in a surgical scene compatible with microscopic surgery.

Results 

System accuracy was evaluated quantitatively at various target positions on a phantom. The accuracy found is 1.2 mm ± 0.5 mm. Errors are primarily due to magnetic tracking. This application accuracy seems suitable for most surgical procedures in the lateral skull base. The system was evaluated qualitatively during a mastoidectomy of an anatomic head specimen and was judged useful by the surgeon.

Conclusion 

A hybrid robotic laser guidance system with direct visual feedback is proposed for navigated drilling and intraoperative structure localization. The system provides visual cues directly on/in the patient anatomy, reducing standard limitations of AR visualizations such as depth perception. The custom-built end-effector for the iSYS robot is transparent to the use of surgical microscopes and compatible with magnetic tracking. The cadaver experiment showed that guidance was accurate and that the end-effector is unobtrusive. This laser guidance has the potential to aid the surgeon in finding the optimal mastoidectomy trajectory in more difficult interventions.

Introduction

Navigated surgery in the lateral skull base is technologically and surgically challenging due to the complexity and small size of the surgical targets in the temporal bone (like the round window of the cochlea). Surgical microscopes are standard in ENT (ear, nose and throat) surgery, yet navigated surgical stereo microscopes are rare [1] due to tedious setups and a rather large impact on the surgical workflow. Furthermore, clinical acceptance is limited by the need for additional staff to run the navigation equipment. In addition, precision optical tracking frequently suffers from obstructed lines of sight between the camera and the tracked rigid bodies [2].
Our own 25+ years of experience shows that standard pointer and monitor navigation in the lateral skull base is more desirable than a navigated microscope. An additional display for navigation poses a major mental overhead for the surgeon. Navigating with magnetic tracking might be compatible with surgical stereo microscopes [3] without an extra monitor. Moreover, experience has pointed out the need for a minimal user interface showing only the pertinent information directly in the intraoperative scene. We present a system for visualizing information to the surgeon intraoperatively without a monitor and without dedicated pointing tools.
Such a system can provide minimal information with maximum relevance to the surgeon directly in the anatomy: the access path and hit/miss of the target. Similar approaches are known, such as the ARSyS-Tricorder [4] or the Probaris system [5], none of which has found widespread use in daily surgical routine, presumably due to the additional constraints they introduce to the surgical workflow (e.g., eye calibration or wearing AR glasses intraoperatively). Considering attention shift [6, 7] and depth perception [8], spatial AR systems have an advantage over conventional displays or see-through AR systems [9–11]. More recently, microscope- and instrument-mounted projectors have been used to provide spatial augmented reality guidance with image projection [12–14], and instrument-mounted displays [15] to ease instrument alignment.
Our approach (“CIGuide”, Cochlear Implant Guide) projects a laser beam aligned to the surgical access path and the target in the anatomy as visual cues directly in the surgical area [1618]. Neither extra workflow constraints nor mental loads are placed on the surgeon, giving surgically acceptable accuracy with magnetic tracking.

Methods

In this section, we describe the hardware and software components of the prototype guidance system, the proposed workflow and the CIGuide implementation.

Components

iSYS-1 robot

The system (Fig. 1) is a modified iSYS-1 robot [19], originally certified for needle insertion guidance with real-time fluoroscopic closed loop feedback [20].

Custom end-effector

For laser guidance while using a surgical microscope, a custom-built end-effector (Fig. 2; ACMIT GmbH, Wiener Neustadt, built by Sistro GmbH, Hall i. T., both in Austria) replaces the standard needle guide extension (NGE) of the iSYS-1 robot.
It provides robust positioning and orienting of the guidance laser beam, causes minimal magnetic field distortion (medical grade-1 titanium) and does not affect the microscope's field of view.
The lower shaft houses a laser emitter (532 nm, collimated to a 1 mm spot at 15 cm distance; Roithner LaserTechnik GmbH, Austria) and a 30 mm × 30 mm borosilicate glass plate (BK7, S/D 40/20; Dr. Sztatecsny GmbH, Korneuburg, Austria) with a 3-mm-radius circular central gold coating.
The end-effector is designed for sterilization. In a real surgical setting, all but the transparent mirror can be covered in a sterile sleeve to maintain sterility.

Patient registration with Rhinospider

A custom-built sensor assembly (“Rhinospider” [21]) excludes user errors during rigid-body patient-to-tracker registration. The four sensors (5D, 8 mm × 0.5 mm, cylindrical) are isocentrically mounted in titanium spheres (Ø 4 mm) (Fig. 3) and are magnetically tracked (Aurora, Northern Digital Inc., Canada). The asymmetric four sensor/ball combinations allow unique registration. A 6D sensor is used as dynamic reference frame (DRF).
Rhinospider is inserted into the nasopharynx prior to preoperative imaging, serves as a fiducial set both for imaging and for magnetic tracking, and stays in place until the end of surgery. Its design allows automated localization both in CT images and during tracking. A fully automated workflow eliminates user errors and allows high-accuracy registrations, potentially with submillimetric target errors.

Dual plate

The dual plate controller (DPC) aligns the guidance laser with the optimal access trajectory planned in the preoperative CT imagery without affecting the magnetic field. Targeting follows a two-step hybrid magnetic-optical control scheme: it starts with EM tracking, followed by optical augmented reality tracking in the microscope's camera view. The DPC components (Fig. 4) enable laser tracking by observing its reflections. It is built from a transparent acrylic glass upper plate (100 mm × 130 mm × 2 mm) and an opaque PEEK lower plate (100 mm × 130 mm × 5 mm) engraved with a 10 × 6 checkerboard pattern (10 mm × 10 mm square size; Lex Feinmechanik GmbH, Grabenstätt, Germany), forming an optical reference frame. The reflections on the plates uniquely determine the laser axis in the tracker volume, allowing optimal alignment with the preoperatively planned axis.
Four 5D Aurora sensors at well-defined asymmetric positions relative to the checkerboard pattern form a second reference frame. Both frames can be registered uniquely (\(\mathcal{T}_{t,p}\), see Fig. 5). The checkerboard establishes the 3D-to-2D projection \(\mathcal{T}_{PnP}\) between the pattern coordinate system and the camera view, a homography between the pattern reference and image frames [22], which can be decomposed into translation, rotation and projection:
$$\mathcal{T}_{PnP} = \mathcal{T}_{\mathrm{proj}} \circ \mathcal{T}_{t,r},$$
where \(\mathcal {T}_\mathrm{proj}\) is the camera projection transform (including nonlinear distortions) and \(\mathcal {T}_{t,r}\) is the rigid body transformation between the pattern and camera space [23, 24].
For fixed plate and microscope positions, \(\mathcal {T}_{PnP} \circ \mathcal {T}_{t,p}\) bidirectionally connects microscope view and tracker coordinate frames.
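
To make this mapping concrete, the following minimal sketch estimates \(\mathcal{T}_{PnP}\) from the engraved checkerboard using OpenCV [31]. The pattern dimensions follow the DPC description above (a 10 × 6 checkerboard of 10 mm squares has a 9 × 5 inner-corner grid); the calibration inputs and function names are illustrative assumptions, not the CIGuide implementation.

```python
# Minimal sketch: estimating T_PnP from the DPC checkerboard (illustrative).
import numpy as np
import cv2

pattern_size = (9, 5)   # inner corners of the 10 x 6 checkerboard (cols, rows)
square_mm = 10.0

# 3D corner coordinates in the pattern reference frame (z = 0 plane).
obj_pts = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
obj_pts[:, :2] = np.mgrid[0:pattern_size[0],
                          0:pattern_size[1]].T.reshape(-1, 2) * square_mm

def estimate_T_PnP(image, camera_matrix, dist_coeffs):
    """Estimate the rigid part T_{t,r} between pattern and camera space.

    The projection part T_proj is fixed by camera_matrix and dist_coeffs."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, pattern_size)
    if not found:
        return None
    corners = cv2.cornerSubPix(
        gray, corners, (5, 5), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    # solvePnP returns the rotation (Rodrigues vector) and translation.
    ok, rvec, tvec = cv2.solvePnP(obj_pts, corners, camera_matrix, dist_coeffs)
    return (rvec, tvec) if ok else None
```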

CIGuide software

A prototype system (CIGuide) featuring planning, intraoperative navigation and robot control was built. Preoperative planning was implemented as a Slicer [25] module, the rest as a separate set of modules [26] based on open-source libraries [27–31]. The planning module is used to transfer the surgical access path as intended by the surgeon to the intraoperative surgical scene. Entry point, target and path define the intraoperative path to be visualized by the laser and followed by the surgeon.

Workflow

The most important steps of the workflow are shown in Fig. 6.

Intraoperative registration with Rhinospider

Intraoperative registration [32] between the patient (Rhinospider) and the patient's radiological data requires corresponding pairs of image fiducials and tracker sensors. Rhinospider sensor balls are detected automatically in CT data with a GPU-accelerated (OpenCL, ITK) method [18]. Fifty temporally consistent sensor location readings are averaged while the patient maintains a fixed position relative to the tracker. Sensors and fiducials are paired by finding the permutation with minimum fiducial registration error [33]. This registration workflow with standard radiological CT imagery (0.75-mm slice thickness) can reach submillimetric application accuracy in regions close to the sensors, such as the cerebello-pontine angle or the lateral skull base [21].
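
The pairing itself is small enough to sketch. The following minimal example, assuming an Arun/Horn-style SVD solution for the rigid registration [32] and illustrative function names, tries all orderings of the four CT fiducials and keeps the one with minimal FRE [33]:

```python
# Minimal sketch: permutation-based fiducial pairing (illustrative names).
import itertools
import numpy as np

def rigid_register(A, B):
    """Least-squares rigid transform mapping points A (Nx3) onto B (Nx3)."""
    ca, cb = A.mean(0), B.mean(0)
    H = (A - ca).T @ (B - cb)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the SVD solution.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cb - R @ ca
    return R, t

def fre(A, B, R, t):
    """Root-mean-square fiducial registration error."""
    return np.sqrt(np.mean(np.sum((A @ R.T + t - B) ** 2, axis=1)))

def pair_fiducials(ct_fiducials, tracker_fiducials):
    """Try all orderings of the CT fiducials; keep the one with minimal FRE.

    Both inputs are Nx3 numpy arrays; with four fiducials there are only
    4! = 24 permutations to test."""
    best = None
    for perm in itertools.permutations(range(len(ct_fiducials))):
        A = ct_fiducials[list(perm)]
        R, t = rigid_register(A, tracker_fiducials)
        err = fre(A, tracker_fiducials, R, t)
        if best is None or err < best[0]:
            best = (err, perm, R, t)
    return best  # (FRE, permutation, rotation, translation)
```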

Target designation

At the “Initial Robot Positioning” step, the operator positions the iSYS-1 robot base within the scene and fixes it to the operating table. The laser (viz., the end-effector) is manually positioned at the planned position by directly observing the guidance beam on the patient. The robot has a limited working volume and should have a roughly optimal pose to successfully reach the planned trajectory during robot control. Once fixed, a few reference movements suffice to decide whether the target location is reachable or not. If not, the robot position needs to be improved.
Next, the DPC is placed on the patient and target designation starts. As an aid, a live AR view shows where the guidance laser should hit both plates (Fig. 4). The designation proceeds in four steps (a geometric sketch follows the list):
1. Transform the preoperatively planned target axis into pattern space, \(\mathcal{T}_{t,p} \circ \mathcal{T}_{t,i}^{-1}\).
2. Calculate the intersections of the planned axis with the two plates, \(t_\mathrm{lp}\) and \(t_\mathrm{up}\).
3. Project the intersection points (\(t_\mathrm{lp}\), \(t_\mathrm{up}\)) into the live camera image.
4. Position the end-effector such that the laser hits the projected loci on both plates.
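
A minimal sketch of steps 1–3, assuming the planned axis is already expressed in tracker coordinates (i.e., mapped through \(\mathcal{T}_{t,i}^{-1}\)), \(\mathcal{T}_{t,p}\) is available as a rotation/translation pair and the plates are planes of constant z in the pattern frame; the plate heights are illustrative values, not the real DPC geometry:

```python
# Minimal sketch: target designation geometry (illustrative values and names).
import numpy as np
import cv2

def intersect_plane(origin, direction, plane_z):
    """Intersection of the line origin + s*direction with the plane z = plane_z
    (assumes the beam is not parallel to the plates, direction[2] != 0)."""
    s = (plane_z - origin[2]) / direction[2]
    return origin + s * direction

def designation_targets(axis_point, axis_dir, R_tp, t_tp, rvec, tvec, K, dist,
                        z_lower=0.0, z_upper=28.0):  # plate heights: assumed
    # Step 1: transform the planned axis from tracker into pattern space.
    p = R_tp @ axis_point + t_tp
    d = R_tp @ axis_dir
    # Step 2: intersect the axis with the lower and upper plates.
    t_lp = intersect_plane(p, d, z_lower)
    t_up = intersect_plane(p, d, z_upper)
    # Step 3: project both loci into the live camera image via T_PnP.
    pts = np.vstack([t_lp, t_up]).astype(np.float32)
    img_pts, _ = cv2.projectPoints(pts, rvec, tvec, K, dist)
    return img_pts.reshape(-1, 2)  # pixel targets for the AR overlay
```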
 

Robot control

Next, a two-step iterative closed-loop controller based on visual feedback moves the robot to the desired location [34].
The two main steps executed by the controller are as follows (a sketch of the refinement loop follows the list):
1. Reference movement step: based on visual feedback, the end-effector is moved to a few predefined locations to decide whether the target is reachable from the current position. If not, the robot must be repositioned.
2. Iterative refinement step: based on the difference between the observed and desired locations of the reflections, the robot reduces the feedback error with a correctional move. This is iterated until the feedback error falls below a predefined threshold or the maximum number of iterations is reached.
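
A minimal sketch of the refinement loop, assuming a hypothetical robot interface with a relative-move command and an image-space Jacobian pseudo-inverse estimated during the reference movements; the gain, threshold and iteration cap are illustrative, not the tuned CIGuide controller [34]:

```python
# Minimal sketch: damped image-space feedback loop (hypothetical robot API).
import numpy as np

def refine(robot, observe_spots, desired_px, gain=0.5,
           threshold_px=1.0, max_iter=20):
    """Drive the observed laser reflections onto the desired pixel targets."""
    for _ in range(max_iter):
        observed_px = observe_spots()        # laser spots on both plates, (2, 2)
        error = (desired_px - observed_px).ravel()
        if np.linalg.norm(error) < threshold_px:
            return True                      # target alignment reached
        # Map the image-space error to robot axes through the Jacobian
        # pseudo-inverse estimated during the reference movement step.
        robot.move_relative(gain * (robot.jacobian_pinv @ error))
    return False                             # iteration cap reached
```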
 

Refraction correction

Refraction in the upper plate requires correcting the measured positions on the lower plate before they can be used to determine the true location and orientation of the beam in space (Snell's law) [18].
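
A minimal sketch of such a correction, assuming a parallel-plate model with the plate normal along z; the plate thickness and the refractive index of acrylic are illustrative values, not the exact CIGuide implementation:

```python
# Minimal sketch: Snell's-law offset of a beam crossing the upper plate.
import numpy as np

def refraction_offset(direction, thickness=2.0, n=1.49,
                      normal=np.array([0.0, 0.0, 1.0])):
    """Lateral displacement of a beam after passing a parallel plate."""
    d = direction / np.linalg.norm(direction)
    cos_i = abs(d @ normal)
    sin_i = np.sqrt(1.0 - cos_i**2)
    sin_r = sin_i / n                       # Snell's law, air -> plate
    cos_r = np.sqrt(1.0 - sin_r**2)
    # In-plane unit vector of the incidence plane.
    lateral = d - (d @ normal) * normal
    if np.linalg.norm(lateral) < 1e-12:
        return np.zeros(3)                  # normal incidence: no shift
    lateral /= np.linalg.norm(lateral)
    # Parallel-plate displacement: t * (sin(i) - cos(i) * tan(r)).
    shift = thickness * (sin_i - cos_i * sin_r / cos_r)
    return shift * lateral

# The measured lower-plate spot would be shifted back by this offset before
# the beam axis is reconstructed from the two reflection points.
```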

Evaluation

Experimental setup

The accuracy of the proposed system was evaluated on a plastic skull phantom with inserted Rhinospider sensors and titanium screws at various locations. For each location, a target axis was defined in the preoperative planning module (Fig. 7). The plastic skull was then positioned randomly in the optimal working volume of the Aurora device so as to allow viewing it with the microscope. To compensate for illumination losses in the stereo microscope, an external high-resolution camera (uEye UI-1245LE, IDS GmbH; Tamron 13VM550ASII CCTV optics, Tamron Europe GmbH; both Germany) was used to observe the dual plate. A small plastic target plate (10 mm × 10 mm) was 3D printed with a cross-shaped indentation to fit the head of the target screw (Fig. 8). During the evaluation runs, the plate was put on top of each target to provide a reference plane.

Evaluation procedure

The whole procedure was repeated ten times for each of the five targets. At each iteration, two images were captured with different exposure times: a reference image with the laser off, in which the cross indentation is clearly visible, and a measurement image with a short exposure time and the laser turned on. The measurement image was thresholded, eroded to a few pixels showing the beam's center and then overlaid as a white pixel layer on the normally exposed image. For each target position, the center of the laser spot and the top, bottom, left and right endpoints of the engraved cross were marked manually on calibrated and undistorted camera images; the millimetric displacement between the center of the cross and the center of the laser spot was then estimated from the pixel distance using the camera calibration.
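
A minimal sketch of the spot extraction and the pixel-to-millimeter conversion with OpenCV [31]; the threshold value, kernel size and use of image moments for the centroid are illustrative assumptions:

```python
# Minimal sketch: laser-spot centroid and pixel-to-mm conversion.
import numpy as np
import cv2

def laser_spot_center(measurement_gray, thresh=200):
    """Threshold the short-exposure image, erode it, return the centroid."""
    _, mask = cv2.threshold(measurement_gray, thresh, 255, cv2.THRESH_BINARY)
    mask = cv2.erode(mask, np.ones((3, 3), np.uint8), iterations=2)
    m = cv2.moments(mask, binaryImage=True)
    if m["m00"] == 0:
        return None                      # no spot found in this image
    return np.array([m["m10"] / m["m00"], m["m01"] / m["m00"]])  # (x, y) px

def pixel_to_mm(distance_px, plate_size_px, plate_size_mm=10.0):
    """Convert a pixel distance to mm via the known 10 mm target plate."""
    return distance_px * plate_size_mm / plate_size_px
```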

Predicting guidance uncertainties at the target

The uncertainty of the approach at a given target, including the dual plate location, was estimated with a Monte Carlo simulation of probable guidance trajectories, based on [18, 33, 35]. The prediction method is also used to visualize the expected uncertainty of a given setup before the actual robot control is executed (Fig. 9). After the experimental evaluation, the validity of these predictions was checked against the measurements.
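
A minimal sketch of such a simulation, assuming isotropic Gaussian noise on the two reflection points that define the beam axis; the noise level and sample count are illustrative and stand in for the full error model of [18, 33, 35]:

```python
# Minimal sketch: Monte Carlo prediction of the target spread (illustrative).
import numpy as np

def predict_target_spread(p_lower, p_upper, target_z, sigma=0.3,
                          n_samples=10000):
    """Sample perturbed beam axes and report the spread of their target hits."""
    rng = np.random.default_rng(0)
    hits = []
    for _ in range(n_samples):
        a = p_lower + rng.normal(0.0, sigma, 3)  # noisy lower-plate reflection
        b = p_upper + rng.normal(0.0, sigma, 3)  # noisy upper-plate reflection
        d = b - a
        s = (target_z - a[2]) / d[2]             # extend beam to target plane
        hits.append((a + s * d)[:2])
    hits = np.asarray(hits)
    return hits.std(axis=0)                      # per-axis std. deviation (mm)
```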

Cadaver experiment

For evaluation, the system was used for guidance in a cadaver mastoidectomy. First, the mandible and the cranial remains of the neck were removed from an anatomic specimen to give access to the choana from posterior. A Rhinospider assembly with a 6D DRF was fixed to the nasal mucosa with superglue (Fig. 10). Standard clinical cranial CT images (1-mm slice thickness) were acquired.
During preoperative planning, the head of the incus, which can easily and precisely be identified in patient and radiological imagery, was designated as a target.
Patient registration was repeated four times with slightly different orientations relative to the magnetic field emitter. The RMS fiducial registration error was \(0.84 \pm 0.13\) mm. The registration was qualitatively validated by touching implanted 2-mm titanium screws with the navigation system's probe and checking the error in the CT images (Fig. 11). After successful validation of the registration, the surgeon performed the drilling steps of the mastoidectomy with the microscope while the guidance beam was active.

Results

The resulting target accuracies from the quantitative evaluation on the plastic skull are presented in Table 1. The system reaches an average target error of 1.2 mm ± 0.5 mm, which is close to the limit achievable with the magnetic tracking system in use.
The overall predicted standard deviation of the target accuracy for the tested plate locations was ± 0.52 mm, which agrees well with the measured experimental uncertainty.
System setup, including control iterations after patient registration, added approximately 15 minutes to the intraoperative workflow. On the cadaver, the target accuracy at the incus was evaluated subjectively by the surgeon and the assistant after the drilling step (Fig. 12). The accuracy of the guidance beam at the end of the experiment was estimated to be 2 mm. Overall, the surgeon stated that the guidance did not obstruct the view during the mastoidectomy and that it can be helpful in more complicated cases.
Table 1  Measured target accuracies in the plastic skull experiment (mean ± standard deviation, in mm)

Target number    Error ± standard deviation (mm)
S1               1.44 ± 0.45
S2               1.24 ± 0.56
S3               1.16 ± 0.55
S4               1.19 ± 0.51
S5               0.95 ± 0.43
S̄                1.2 ± 0.5

S1: inner ear canal, S2: eminentia arcuata, S3: horizontal course of the internal carotid artery, S4: clivus, S5: apex of the orbit. S̄ is the mean value of the other positions.

Discussion and conclusions

It is concluded that magnetic tracking offers an easier approach to intraoperative tracking and user-error-free registration, as it does not require an uninterrupted line of sight. Sensors are small enough to be positioned close to the relevant surgical structures inside the body. Rhinospider technology is similar to nasal stents (e.g., http://www.alaxo.com/alaxolito_eng.html, Alaxo, Germany), which patients tolerate very well; they ease nasal breathing in cases of nasal congestion and during sports.
Once registration and plate position are determined, a feedback controller uses HD camera tracking for robotic laser beam alignment. Thus, the robotic platform inside the magnetic field does not need to be magnetically tracked, which simplifies the robot design. Optical tracking is far more accurate and allows positioning the robot platform far from the region of interest, or even outside the working volume of the tracker.
This hybrid tracking approach enables direct tracking of the guidance laser beam, resulting in significantly better target accuracy than direct magnetic tracking of the robotic end-effector itself. Other designs of the control plate and of the sensor mountings attached to it could further reduce the target error. Further phantom experiments with anatomic specimens and surgeons performing “real” interventions are planned to determine the system's behavior under more realistic conditions.
The system shows a promising potential in our initial tests to be a laser guidance platform that is easy to use, is built from standard elements and can be utilized during surgery without major additional workload on the medical personnel. The system allows in situ visualization of information with a fairly small impact on the surgeon’s mental workload and can easily be integrated into existing operating theatres and workflows.
The CIGuide system as presented builds on Rhinospider technology applied to the nasopharynx. This poses no limitation for microscopic interventions of any kind at the lateral skull base. The insertion of the short-term (less than one day) registration device in the nasopharynx has shown promising guidance in our laboratory investigation. Endoscopic interventions at the anterior skull base, the pituitary, or beyond are not intended as surgeries to benefit from this technology. The authors are aware of the limits of using one laser beam as a guidance aid. Preliminary work with a second laser to encode the spatial target position as an intuitive surgical visualization is under way; due to the complexity of the issue, it is foreseen to be published separately.

Acknowledgements

Open access funding provided by University of Innsbruck and Medical University of Innsbruck.

Compliance with ethical standards

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

This article does not contain any studies with human participants or animals performed by any of the authors. This article does not contain patient data.
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


References
1. Edwards PJ, King AP, Maurer CR Jr, de Cunha DA, Hawkes DJ, Gaston RP, Fenlon MR, Jusczyzck A, Strong AJ, Chandler CL, Gleeson MJ (2000) Design and evaluation of a system for microscope-assisted guided interventions (MAGI). IEEE Trans Med Imaging 19(11):1082–1093
2. Caversaccio MD, Freysinger W (2003) Computer-assistance for intraoperative navigation in ENT surgery. Minim Invasive Ther Allied Technol 12(1–2):36–51
3. Kral F, Puschban EJ, Riechelmann H, Freysinger W (2013) Comparison of optical and electromagnetic tracking for navigated lateral skull base surgery. Int J Med Robot Comput Assist Surg 9(2):247–252
4. Goebbels G, Troche K, Braun M, Ivanovic A, Grab A, von Lübtow K, Sader R, Zeilhofer F, Albrecht K, Praxmarer K (2003) ARSyS-Tricorder - Entwicklung eines Augmented Reality Systems für die intraoperative Navigation in der MKG Chirurgie. In: Fahlbusch R (ed) Zweite Jahrestagung der Deutschen Gesellschaft für Computer- und Roboterassistierte Chirurgie (CURAC), Nürnberg. http://publica.fraunhofer.de/documents/N-54460.html
5. Hoppe (2004) Projektorbasierte erweiterte Realität in der rechnergestützten Chirurgie. Ph.D. thesis, Karlsruhe University
6. Léger É, Drouin S, Louis Colins D, Popa T, Kersten-Oertel M (2017) Quantifying attention shifts in augmented reality image-guided neurosurgery. Healthc Technol Lett 4(5):188–192
7. Meola A, Cutolo F, Carbone M, Cagnazzo F, Ferrari M, Ferrari V (2017) Augmented reality in neurosurgery: a systematic review. Neurosurg Rev 40(4):537–548
8. Kytö M, Mäkinen A, Tossavainen T, Oittinen PT (2014) Stereoscopic depth perception in video see-through augmented reality within action space. J Electron Imaging 23(1):011006
9. Mobasheri MH, Johnston M, Syed UM, King D, Darzi A (2015) The uses of smartphones and tablet devices in surgery: a systematic review of the literature. Surgery 158(5):1352–1371
10. Cutolo F, Meola A, Carbone M, Sinceri S, Cagnazzo F, Denaro E, Esposito N, Ferrari M, Ferrari V (2017) A new head-mounted display-based augmented reality system in neurosurgical oncology: a study on phantom. Comput Assist Surg 22(1):39–53
11. Cabrilo I, Bijlenga P, Schaller K (2014) Augmented reality in the surgery of cerebral arteriovenous malformations: technique assessment and considerations. Acta Neurochir 156(9):1769–1774
12. Gavaghan K, Oliveira-Santos T, Peterhans M, Reyes M, Kim H, Anderegg S, Weber S (2012) Evaluation of a portable image overlay projector for the visualization of surgical navigation data: phantom studies. Int J CARS 7:547–556
13. Tabrizi LB, Mahwash M (2015) Augmented reality-guided neurosurgery: accuracy and intraoperative application of an image projection technique. J Neurosurg 123:206–211
14. Zeng B, Meng F, Ding H, Wang G (2017) A surgical robot with augmented reality visualization for stereoelectroencephalography electrode implantation. Int J CARS 12:1355–1368
15. Herrlich M, Tavakol P, Black D, Wenig D, Rieder C, Malaka R, Kikinis R (2017) Instrument-mounted displays for reducing cognitive load during surgical navigation. Int J CARS 12:1599–1605
16. Bardosi Z, Plattner C, Özbek Y, Hofmann T, Milosavljevic S, Freysinger W (2017) CIGuide: a hybrid-tracked robotic laser guidance platform for intraoperative guidance. In: CARS 31st international congress and exhibition, Barcelona, June 20–24
17. Bardosi Z, Plattner C, Özbek Y, Hofmann T, Milosavljevic S, Freysinger W (2017) CIGuide: an intraoperative hybrid-tracker robotic laser guidance platform. In: 61st Austrian ENT Congress, Wien, September 13–17
18. Bardosi Z (2018) Predicting accuracies in spatial augmented reality. Ph.D. thesis, Innsbruck
19. Kettenbach J, Kara L, Toporek G, Fuerst M, Kronreif G (2014) A robotic needle-positioning and guidance system for CT-guided puncture: ex vivo results. Minim Invasive Ther Allied Technol 23:271–278
20. Gabloner T (2015) Closed loop controller design for robot-based needle orientation in radiofrequency ablation. MSc thesis, MCI, Mechatronics Mechanical Engineering, Innsbruck, Austria
21. Bardosi ZR, Özbek Y, Plattner C, Freysinger W (2013) Auf dem Weg zum Heiligen Gral der 3D-Navigation: submillimetrische Anwendungsgenauigkeit im Felsenbein. In: Freysinger W (ed) 12th annual conference of the German Society for Computer- and Robot-Assisted Surgery (CURAC), pp 155–158
22. Zhang Z (2016) Camera calibration: a personal retrospective. Mach Vis Appl 27(7):963–965
23. Gao XS, Hou XR, Tang J, Chang HF (2003) Complete solution classification for the perspective-three-point problem. IEEE Trans Pattern Anal Mach Intell 25(8):930–943
24. Lepetit V, Moreno-Noguer F, Fua P (2009) EPnP: an accurate O(n) solution to the PnP problem. Int J Comput Vis 81:155–166
25. Fedorov A, Beichel R, Kalpathy-Cramer J, Finet J, Fillion-Robin J-C, Pujol S, Bauer C, Jennings D, Fennessy FM, Sonka M, Buatti J, Aylward SR, Miller JV, Pieper S, Kikinis R (2012) 3D Slicer as an image computing platform for the quantitative imaging network. Magn Reson Imaging 30(9):1323–1341
26. Ganglberger F, Özbek Y, Freysinger W (2013) The use of common toolkits (CTK) in computer-assisted surgery—a demo application. In: Proceedings of the 12th annual conference of the German Society for Computer- and Robot-Assisted Surgery (CURAC), pp 161–164
27. Schroeder W, Martin K, Lorensen B (2006) The visualization toolkit, 4th edn. Kitware. ISBN 978-1-930934-19-1
28. Yoo TS, Ackerman MJ, Lorensen WE, Schroeder W, Chalana V, Aylward S, Metaxas D, Whitaker R (2002) Engineering and algorithm design for an image processing API: a technical report on ITK—the Insight Toolkit. In: Westwood J (ed) Proceedings of Medicine Meets Virtual Reality. IOS Press, Amsterdam, pp 586–592
29. Enquobahrie A, Cheng P, Gary K, Ibanez L, Gobbi D, Lindseth F, Yaniv Z, Aylward S, Jomier J, Cleary K (2007) The image-guided surgery toolkit IGSTK: an open source C++ software toolkit. J Digit Imaging 20(1):21–33
30. Tokuda J, Fischer GS, Papademetris X, Yaniv Z, Ibanez L, Cheng P, Liu H, Blevins J, Arata J, Golby AJ, Kapur T, Pieper S, Burdette EC, Fichtinger G, Tempany CM, Hata N (2009) OpenIGTLink: an open network protocol for image-guided therapy environment. Int J Med Robot 5(4):423–434
31. Bradski G (2000) The OpenCV library. Dr. Dobb's J Softw Tools 25:120–125
32. Horn BKP (1987) Closed-form solution of absolute orientation using unit quaternions. J Opt Soc Am 4:629–642
33. Fitzpatrick JM, West JB (2001) The distribution of target registration error in rigid-body point-based registration. IEEE Trans Med Imaging 20(9):917–927
34. Hofmann T (2016) Fusing magnetic and optical sensor data for optimized axis alignment of medicine robot. MSc thesis, MCI, Innsbruck, Austria
DOI: https://doi.org/10.1007/s11548-019-02066-1