Computer-aided surgery systems provide visual guidance to surgeons by showing the real-time pose of surgical instruments overlaid on preoperative medical images of the patient. Surgical instrument poses are localized in space mainly using optical tracking systems (OTSs) and electromagnetic tracking systems (EMTSs). OTSs require a clear line of sight, which is difficult to ensure in today's crowded operating rooms. EMTSs, on the other hand, are less accurate and suffer from magnetic field distortion in the presence of metal objects. In this paper, we propose a sensor fusion algorithm that compensates for the drawbacks of both OTS and EMTS and achieves robust tracking of surgical instruments. Spatial alignment of OTS and EMTS data is achieved through a calibration procedure. The proposed sensor fusion algorithm uses an unscented Kalman filter (UKF), an extension of the standard Kalman filter based on deterministic sampling of nonlinear functions. A quaternion representation of rotations is used to avoid the singularities of other parameterizations (e.g., Euler angles). When optical markers are occluded, our algorithm uses the EMTS to estimate the position of the hidden marker(s) and feeds it into the UKF. The fusion algorithm also maintains an error matrix between the OTS- and EMTS-measured poses, providing an up-to-date estimate of the electromagnetic distortion; this estimate is used to correct EMTS measurements before they are used to compensate for optical line-of-sight occlusions. The proposed sensor fusion (SF) method is compared to a predictive UKF based on optical data alone (NSF). Our results show that the SF method effectively compensates for short marker occlusions (nine samples), providing a continuous estimate of the instrument pose with an error significantly lower than that of the NSF method and reaching clinically acceptable accuracy. Furthermore, the proposed algorithm increases the accuracy of the EMTS in the presence of magnetic field distortion.
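The distortion-correction idea described above can be sketched in a few lines: while both sensors are valid, a rigid error transform mapping the EMTS pose onto the OTS pose is refreshed; during an optical occlusion, the last error estimate is applied to the EMTS pose as a stand-in for the missing optical measurement before it is fed to the filter. This is a minimal illustrative sketch, not the paper's implementation; the class and function names are hypothetical, and poses are represented here as 4x4 homogeneous matrices for simplicity rather than the quaternion state used in the UKF.

```python
import numpy as np


def pose_to_matrix(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation matrix R
    and a length-3 translation vector t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T


class DistortionCorrector:
    """Running estimate of the rigid error transform between the optical
    (OTS) and electromagnetic (EMTS) poses of the tracked instrument."""

    def __init__(self):
        # Identity until the first paired OTS/EMTS measurement arrives.
        self.error = np.eye(4)

    def update(self, T_ots, T_emts):
        # Error transform mapping the EMTS pose onto the OTS pose:
        #   T_ots = error @ T_emts  =>  error = T_ots @ inv(T_emts)
        self.error = T_ots @ np.linalg.inv(T_emts)

    def corrected_emts(self, T_emts):
        # Distortion-corrected EMTS pose, usable as a substitute
        # measurement while the optical markers are hidden.
        return self.error @ T_emts
```

In a tracking loop, `update` would run whenever both sensors report a valid pose, and `corrected_emts` would supply the filter's measurement during optical dropouts; a full system would additionally smooth the error estimate over time rather than overwrite it at every sample.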