Approximate inference in hidden Markov models using iterative active state selection

Authors: C. M. Vithanage (Dept. of Mathematics, Univ. of Bristol, Bristol, UK); C. Andrieu; R. J. Piechocki

We consider the inferential task of computing the marginal posterior probability mass functions of the state variables, and of pairs of consecutive state variables, of a hidden Markov model. This can be performed exactly and efficiently using a message-passing scheme such as the Bahl-Cocke-Jelinek-Raviv (BCJR) algorithm. We present a novel iterative reduced-complexity variation of the BCJR algorithm that, like the M-BCJR algorithm, uses reduced-support approximations for the forward and backward messages. Forward/backward message computation is based on the concept of expectation propagation, which yields an algorithm similar to the M-BCJR algorithm but with the active-state selection criterion changed from the filtered distribution of the state variables to their beliefs. By allowing possibly different supports for the forward and backward messages, we derive identical forward and backward recursions that can be iterated. Simulation results for trellis-based equalization of a wireless communication system confirm improved performance over the M-BCJR algorithm.
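To make the reduced-support idea concrete, here is a minimal sketch of forward-backward smoothing for an HMM in which each message is truncated to its M most probable states, in the spirit of the M-BCJR algorithm the abstract builds on. This is a hypothetical illustration, not the authors' method: their algorithm selects active states by beliefs via expectation propagation and iterates the recursions, whereas this sketch truncates each message by its own mass in a single pass. The function name and all variables are ours.

```python
import numpy as np

def truncated_forward_backward(pi, A, B, obs, M):
    """Approximate HMM smoothing with reduced-support messages.

    pi  : (S,) initial state distribution
    A   : (S, S) transition matrix, A[i, j] = P(s_{t+1}=j | s_t=i)
    B   : (S, K) emission matrix, B[i, k] = P(obs=k | state=i)
    obs : length-T observation sequence (integer symbols)
    M   : number of "active" states kept in each message
    Returns the (T, S) matrix of approximate marginal posteriors.
    """
    T, S = len(obs), len(pi)

    def keep_top_M(v):
        # Zero all but the M largest entries, then renormalize --
        # the reduced-support approximation of a message.
        if M < S:
            v = v.copy()
            v[np.argsort(v)[:-M]] = 0.0  # drop the S - M smallest
        return v / v.sum()

    # Forward (alpha) recursion on the truncated support.
    alpha = np.zeros((T, S))
    alpha[0] = keep_top_M(pi * B[:, obs[0]])
    for t in range(1, T):
        alpha[t] = keep_top_M((alpha[t - 1] @ A) * B[:, obs[t]])

    # Backward (beta) recursion, truncated independently, so its
    # support may differ from the forward one.
    beta = np.zeros((T, S))
    beta[-1] = np.ones(S) / S
    for t in range(T - 2, -1, -1):
        beta[t] = keep_top_M(A @ (B[:, obs[t + 1]] * beta[t + 1]))

    # Combine messages into approximate marginal posteriors.
    gamma = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)
```

Because the forward and backward supports are allowed to differ, the combined posterior is supported only on their intersection; the paper's contribution is an iterated, belief-based selection of those supports rather than this one-shot filtered-mass truncation.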

Published in: IEEE Signal Processing Letters (Volume: 13, Issue: 2)