Integrating Deflection Models and Image Feedback for Real-Time Flexible Needle Steering

4 Author(s)
Momen Abayazid; Roy J. Roesthuis; Rob Reilink; Sarthak Misra (MIRA Institute for Biomedical Technology and Technical Medicine, University of Twente, 7522 NB Enschede, The Netherlands)

Needle insertion procedures are commonly used for diagnostic and therapeutic purposes. In this paper, an image-guided control system is developed to robotically steer flexible needles with an asymmetric tip. Knowledge of needle deflection is required for accurate steering. Two different models to predict needle deflection are presented: the first is a kinematics-based model, and the second is based on the mechanics of needle-tissue interaction. Both models predict the deflection of needles that undergo multiple bends. The maximum targeting errors of the kinematics-based and the mechanics-based models for a 110-mm insertion distance using a φ 0.5-mm needle are 0.8 and 1.7 mm, respectively. The kinematics-based model is used in the proposed image-guided control system. The control system accounts for target motion during the insertion procedure by detecting the target position in each image frame. Five experimental cases are presented to validate the real-time control system using both camera and ultrasound images as feedback. The experimental results show that the targeting errors of camera- and ultrasound-image-guided steering toward a moving target are 0.35 and 0.42 mm, respectively. The targeting accuracy of the algorithm is sufficient to reach the smallest lesions (φ 2 mm) that can be detected using state-of-the-art ultrasound imaging systems.
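The kinematics-based model mentioned above can be illustrated in simplified form. A bevel-tip flexible needle, inserted without rotation, follows a path of approximately constant curvature; rotating the needle 180° about its axis flips the bending direction, which is how paths with multiple bends arise. The following is a minimal 2D sketch of that idea, not the paper's implementation; the function name, the curvature value, and the segment-plan representation are all illustrative assumptions.

```python
import math

def steer_needle_2d(kappa, step, plan):
    """Propagate a bevel-tip needle tip in 2D under a constant-curvature
    kinematic model (illustrative sketch, not the paper's model).

    plan: list of (length_mm, direction) segments, direction = +1 or -1;
    a 180-degree bevel rotation flips the sign of the curvature.
    Returns the list of tip poses (x, y, theta).
    """
    x, y, theta = 0.0, 0.0, 0.0
    path = [(x, y, theta)]
    for length, direction in plan:
        s = 0.0
        while s < length:
            ds = min(step, length - s)
            theta += direction * kappa * ds   # bevel bends the path
            x += ds * math.cos(theta)         # advance along the tip heading
            y += ds * math.sin(theta)
            s += ds
            path.append((x, y, theta))
    return path

# Example: 55 mm with the bevel one way, then 55 mm after a 180-degree
# rotation -- a 110-mm insertion whose path has two opposite bends.
path = steer_needle_2d(kappa=0.005, step=1.0, plan=[(55, +1), (55, -1)])
x_end, y_end, theta_end = path[-1]
```

Under this model, the equal and opposite segments cancel the heading change, so the tip exits the second bend parallel to its initial heading but laterally offset; a closed-loop controller such as the paper's would instead choose rotation instants from image feedback of the target position.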

Published in:

IEEE Transactions on Robotics (Volume: 29, Issue: 2)