Asymptotically catastrophic convolutional codes

2 Author(s)

The minimum distance growth rate of unmerged codewords in a convolutional code is shown to depend upon the minimum average weight per branch, w_{0}, in the encoder state diagram. An upper bound on w_{0} is obtained for a large class of rate 1/2 codes which includes many of the best known classes of rate 1/2 codes. The bound is shown to be tight for short constraint length codes. A class of codes is defined to be asymptotically catastrophic if w_{0} approaches zero for large constraint lengths. Several classes of rate 1/2 codes are shown to be asymptotically catastrophic. These include classes containing codes known to have large free distance. It is argued that the free distance alone is not a sufficient criterion to determine a code's performance with either Viterbi or sequential decoding. A code with a low distance growth rate will yield a high bit error probability and will not perform well with truncated Viterbi decoding.
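As an illustration of the quantity discussed in the abstract (a sketch, not code from the paper): for a short constraint length code, w_{0} can be found directly by enumerating the cycles of the encoder state diagram, excluding the all-zero self-loop, and taking the minimum average branch weight. The snippet below does this for the familiar rate 1/2, constraint length 3 encoder with generator polynomials (7, 5) in octal; the function names and the brute-force approach are illustrative choices, not the method of the paper.

```python
from itertools import product

K = 3                      # constraint length (memory m = K - 1 = 2)
G = (0b111, 0b101)         # rate-1/2 generator polynomials, (7, 5) octal

m = K - 1

def step(state, u):
    """One encoder step: return (next_state, branch_weight)."""
    reg = (u << m) | state                       # shift register: newest bit first
    weight = sum(bin(reg & g).count('1') & 1     # parity of each tapped output
                 for g in G)                     # branch weight = output Hamming weight
    return reg >> 1, weight

# Build the state-diagram edges, dropping the weight-0 self-loop at state 0
# (cycles through that loop correspond to merged, not unmerged, codewords).
edges = {}
for s, u in product(range(2 ** m), (0, 1)):
    t, w = step(s, u)
    if not (s == 0 and t == 0):
        edges.setdefault(s, []).append((t, w))

def min_avg_weight():
    """Minimum average weight per branch over all cycles (attained on a simple cycle)."""
    best = float('inf')

    def dfs(start, node, weight, length, visited):
        nonlocal best
        for t, w in edges[node]:
            if t == start:
                best = min(best, (weight + w) / (length + 1))
            elif t not in visited and t > start:  # count each cycle once, from its smallest state
                dfs(start, t, weight + w, length + 1, visited | {t})

    for s in range(2 ** m):
        dfs(s, s, 0, 0, {s})
    return best

w0 = min_avg_weight()
print(w0)   # minimum average weight per branch, w_0
```

For this encoder the minimum is achieved on the two-branch cycle between states 01 and 10, giving w_{0} = 1/2. Brute-force enumeration is only viable for short constraint lengths, which is exactly the regime where the abstract says the upper bound on w_{0} is tight; for long codes a minimum mean cycle algorithm (e.g. Karp's) would be the standard tool.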

Published in:

IEEE Transactions on Information Theory (Volume 26, Issue 3)