Tissue Tracking and Registration for Image-Guided Surgery

5 Author(s): Yip, M.C. (Electrical and Computer Engineering Department, University of British Columbia, Vancouver, Canada); Lowe, D.G.; Salcudean, S.E.; Rohling, R.N.; and one additional author

Abstract:

Vision-based tracking of tissue is a key component for enabling augmented reality during a surgical operation. Conventional tracking techniques in computer vision rely on identifying strong edge features or distinctive textures in a well-lit environment; however, endoscopic tissue images lack strong edge features, are poorly lit, and exhibit a high degree of specular reflection. Therefore, prior work on acquiring densely populated 3-D features to describe tissue surface profiles has required complex image processing techniques and has been limited in providing stable, long-term tracking or real-time processing. In this paper, we present an integrated framework for accurately tracking tissue in surgical stereo-camera video at real-time speeds. We use a combination of the STAR feature detector and binary robust independent elementary features (BRIEF) to acquire salient features that can be persistently tracked at high frame rates. The features are then used to build a densely populated map of the deformations of the tissue surface in 3-D. We evaluate the method against popular feature algorithms on in vivo animal study video sequences, and we also apply the proposed method to human partial nephrectomy video sequences. We extend the salient feature framework to support region tracking, in order to maintain the spatial correspondence of a tracked region of tissue, or of a registered medical image, to the surrounding tissue. In vitro tissue studies show registration accuracies of 1.3–3.3 mm using a rigid-body transformation method.
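The abstract names the building blocks but gives no implementation, so the sketch below is only a rough illustration of that style of pipeline, assuming OpenCV's contrib module (opencv-contrib-python): STAR (CenSurE) keypoints described with BRIEF and matched frame-to-frame by Hamming distance, plus a standard Kabsch/SVD least-squares rigid-body fit of the kind a rigid-body registration step might use. Function names, structure, and thresholds here are illustrative assumptions, not the authors' code.

import cv2
import numpy as np

# STAR (CenSurE) detector and BRIEF descriptor are provided by OpenCV's
# contrib package (pip install opencv-contrib-python).
star = cv2.xfeatures2d.StarDetector_create()
brief = cv2.xfeatures2d.BriefDescriptorExtractor_create()
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)  # Hamming distance for binary descriptors

def detect_and_describe(gray):
    """Detect STAR keypoints and compute BRIEF descriptors on a grayscale frame."""
    keypoints = star.detect(gray, None)
    keypoints, descriptors = brief.compute(gray, keypoints)
    return keypoints, descriptors

def match_frames(gray_prev, gray_curr):
    """Match salient features between two consecutive frames."""
    kp1, d1 = detect_and_describe(gray_prev)
    kp2, d2 = detect_and_describe(gray_curr)
    if d1 is None or d2 is None:
        return [], kp1, kp2
    matches = sorted(matcher.match(d1, d2), key=lambda m: m.distance)
    return matches, kp1, kp2

def rigid_fit(P, Q):
    """Least-squares rigid-body (R, t) aligning 3-D point set P onto Q
    (both N x 3 arrays) via the Kabsch/SVD method."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

In a stereo setup, matched left/right keypoints would additionally be triangulated to obtain the 3-D point sets passed to rigid_fit; that triangulation, the dense deformation map, and the region-tracking extension described in the abstract are not sketched here.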

Published in:

IEEE Transactions on Medical Imaging (Volume: 31, Issue: 11)