European Urology

Volume 56, Issue 2, August 2009, Pages 332-338

Kidney Cancer
Augmented Reality: A New Tool To Improve Surgical Accuracy during Laparoscopic Partial Nephrectomy? Preliminary In Vitro and In Vivo Results

https://doi.org/10.1016/j.eururo.2009.05.017

Abstract

Background

Use of an augmented reality (AR)–based soft tissue navigation system in urologic laparoscopic surgery is an evolving technique.

Objective

To evaluate a novel soft tissue navigation system developed to enhance the surgeon's perception and to provide decision-making guidance directly before initiation of kidney resection for laparoscopic partial nephrectomy (LPN).

Design, setting, and participants

Custom-designed navigation aids, a mobile C-arm capable of cone-beam imaging, and a standard personal computer were used. The feasibility and reproducibility of inside-out tracking principles were evaluated in a porcine model with an artificially created intraparenchymal tumor in vitro. The same algorithm was then incorporated into clinical practice during LPN.

Interventions

Evaluation of a fully automated inside-out tracking system was repeated in exactly the same way for 10 different porcine renal units. Additionally, 10 patients underwent retroperitoneal LPNs under manual AR guidance by one surgeon.

Measurements

The navigation errors and image-acquisition times were determined in vitro. The mean operative time, time to locate the tumor, and surgical margin status were assessed in vivo.

Results and limitations

The system was able to navigate and superimpose the virtually created images onto the real-time images with an error margin of only 0.5 mm, and fully automated initial image acquisition took 40 ms. The mean operative time was 165 min (range: 135–195 min), and the mean time to locate the tumor was 20 min (range: 13–27 min). None of the cases required conversion to open surgery. Definitive histology revealed tumor-free margins in all 10 cases.

Conclusions

This novel AR tracking system proved to be functional, with a reasonable margin of error and image-to-image registration time. Overlaying pre- or intraoperative imaging onto real-time videoendoscopic images will simplify laparoscopic procedures and increase their precision.

Introduction

Advances in imaging modalities have led to increasing knowledge of each patient's individual anatomy. Using this valuable information, computer-assisted surgery promises to facilitate surgical planning and to increase surgical precision [1]. It is most widely used in neurosurgery, otolaryngology, and orthopedics, in which the target organs are assumed to be rigid and to have a constant spatial relationship to anatomical landmarks [2]. In soft tissue surgery especially, organ shift and tissue deformation caused by motion reduce the value of the image data. To overcome the problem of tissue motion, our navigation approach uses custom-designed navigation aids that are inserted into the organ and tracked by a conventional monocular endoscope. Organ motion is thus captured in real time during the intervention and can be compensated for inherently.
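As a concrete illustration of this landmark-based motion compensation, the minimal sketch below estimates the rigid transform between the navigation-aid positions known from the 3D image data and their currently tracked positions using the Kabsch algorithm. The paper does not disclose its implementation, so the function, coordinates, and simulated shift here are hypothetical.

    # Hypothetical sketch: rigid registration (Kabsch algorithm) between the
    # navigation-aid positions segmented from the 3D image data and their
    # currently tracked positions; the resulting transform compensates organ motion.
    import numpy as np

    def rigid_transform(src: np.ndarray, dst: np.ndarray):
        """Least-squares rigid transform (R, t) mapping src (N,3) onto dst (N,3)."""
        src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
        H = (src - src_c).T @ (dst - dst_c)      # cross-covariance matrix
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:                 # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = dst_c - R @ src_c
        return R, t

    # aids_3d: aid positions in the preoperative 3D image (mm, invented values)
    aids_3d = np.array([[0, 0, 0], [30, 0, 0], [0, 25, 0],
                        [0, 0, 20], [15, 15, 10]], dtype=float)
    aids_now = aids_3d + np.array([2.0, -1.5, 0.5])  # simulated organ shift
    R, t = rigid_transform(aids_3d, aids_now)
    residual = np.linalg.norm(aids_3d @ R.T + t - aids_now, axis=1).max()
    print(f"max residual: {residual:.3f} mm")

Re-estimating this transform for every frame is what allows the overlay to move with the organ rather than with the operating table.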

In this paper, we present our preliminary results with image-guided surgery using a fully automated system in a porcine model, as well as the clinical application of the developed AR algorithms during laparoscopic partial nephrectomy (LPN).

Overview

Currently, optical or magnetic tracking systems are typically used to track surgical instruments and the patient during surgical navigation. In contrast, inside-out tracking relies only on real-time processing of the endoscopic video; no external tracking systems are involved.

Inside-out tracking follows the endoscope by relating the spatial configuration of custom-developed navigation aids, known from three-dimensional (3D) imaging data, to their two-dimensional (2D) projections in the …
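In computer-vision terms, recovering a camera pose from known 3D landmark positions and their 2D image projections is a perspective-n-point (PnP) problem. The sketch below illustrates the principle with OpenCV's solvePnP as a stand-in for the authors' unpublished algorithm; the aid coordinates, pixel positions, and calibration matrix are invented for illustration.

    # Illustrative PnP sketch: recover the endoscope pose from the known 3D
    # configuration of the navigation aids and their detected 2D projections.
    import numpy as np
    import cv2

    object_pts = np.array([[0, 0, 0], [30, 0, 0], [0, 25, 0],   # aid positions in the
                           [0, 0, 20], [15, 15, 10]], float)    # organ frame (mm)
    image_pts = np.array([[320, 240], [420, 235], [315, 150],   # detected 2D aid centers
                          [330, 260], [370, 195]], float)       # in the video frame (px)
    K = np.array([[800, 0, 320],    # intrinsic matrix from endoscope calibration
                  [0, 800, 240],
                  [0, 0, 1]], float)

    ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, None)
    if ok:
        R, _ = cv2.Rodrigues(rvec)  # rotation matrix of the organ-to-camera pose
        print("endoscope position in the organ frame (mm):", (-R.T @ tvec).ravel())

Because the pose is recovered from the video itself, no external optical or magnetic tracker is required, which is the essence of the inside-out principle.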

In vitro

The system was able to navigate and superimpose the virtually created images onto the real-time images with an error margin of only 0.5 mm (range: 0.2–0.7 mm), and fully automated initial image acquisition took 40 ms per frame (25 frames per second) with five needles inserted into the kidney. In the case of organ motion during manipulation, the augmented picture with the detected navigation aids could automatically follow the videoendoscopic picture using the navigation aids as landmarks; in this way, shifts …
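One common way to realize such a motion-following overlay, sketched below under the assumption that the per-frame pose (rvec, tvec) has already been re-estimated from the navigation aids as above, is simply to re-project the segmented tumor model into every video frame; the function and its arguments are illustrative, not the authors' implementation.

    # Hypothetical per-frame overlay step: re-project the 3D tumor contour with
    # the freshly estimated pose so the augmentation follows organ motion.
    import numpy as np
    import cv2

    def draw_tumor_overlay(frame, tumor_pts_3d, rvec, tvec, K):
        """Project the (N,3) float tumor contour into the frame and draw it."""
        pts_2d, _ = cv2.projectPoints(tumor_pts_3d, rvec, tvec, K, None)
        contour = pts_2d.reshape(-1, 1, 2).astype(np.int32)
        cv2.polylines(frame, [contour], isClosed=True,
                      color=(0, 0, 255), thickness=2)  # red contour (BGR)
        return frame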

Discussion

In the literature, data about AR-guided minimally invasive urological operations are scarce [9]. Marescaux et al. [10] reported the first real-time AR-assisted laparoscopic adrenalectomy. Mozer et al. [11] then described their AR-guided system for the placement of percutaneous renal access, and Ukimura et al. [12] and Ukimura and Gill [13] reviewed their clinical experience with real-time transrectal US–guided laparoscopic prostatectomy and LPN by means of preoperative CT images (Table 2).

In …

Conclusions

We have described a functional and biological model for AR using fully automated tracking with a reasonable error margin and image-to-image registration time. We were also able to create AR in the clinical setting using the same algorithms described and used in the in vitro stage. Overlaying pre- or intraoperative imaging onto real-time videoendoscopic images will simplify laparoscopic procedures and increase their precision.
