To investigate the cost of employing fix-free codes instead of the well-established prefix-free codes, we present upper bounds on the redundancy of optimal fix-free codes and on their gap to the corresponding Huffman codes. Unlike the conventional approach, which is based on Shannon-like suboptimal codes, we examine the redundancy of a better Huffman-like code. An algorithm is proposed for deriving optimal binary codeword lengths with Kraft sum 5/8 for a given probability distribution; the existence of a fix-free code with these codeword lengths is guaranteed by Yekhanin's theorem. It is shown that the redundancy of this fix-free code is at most 0.8 bit greater than that of the Huffman code and does not exceed 8/3 − log 3 ≈ 1.0817 bits. This upper bound is 4/3 − log(5/3) ≈ 0.5964 bit less than the best known one. If the 3/4-conjecture is proved sometime in the future, the presented upper bound on the redundancy of optimal fix-free codes (i.e., 1.0817 bits) would improve by only 5/3 − log 3 ≈ 0.0817 bit. In addition to these general bounds, all known upper bounds in terms of the largest, the smallest, and an arbitrary given symbol probability are substantially improved.
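The quantities the abstract compares — Huffman codeword lengths, the Kraft sum, and redundancy (expected codeword length minus entropy, in bits) — can be illustrated with a short sketch. This is the standard binary Huffman construction only, not the paper's algorithm for lengths with Kraft sum 5/8; it merely shows how the redundancy of a given length assignment would be measured.

```python
import heapq
from math import log2

def huffman_lengths(probs):
    """Codeword lengths of an optimal binary prefix-free (Huffman) code."""
    # Heap items: (probability, tie-breaking counter, symbol indices in subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        # Merge the two least probable subtrees; every symbol inside
        # either subtree gets one bit deeper.
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

def kraft_sum(lengths):
    """Kraft sum of a binary length assignment: sum of 2^(-l)."""
    return sum(2.0 ** -l for l in lengths)

def redundancy(probs, lengths):
    """Expected codeword length minus source entropy, in bits."""
    avg = sum(p * l for p, l in zip(probs, lengths))
    entropy = -sum(p * log2(p) for p in probs if p > 0)
    return avg - entropy

probs = [0.4, 0.3, 0.2, 0.1]
lengths = huffman_lengths(probs)   # [1, 2, 3, 3]
print(kraft_sum(lengths))          # 1.0 for a complete Huffman code
print(redundancy(probs, lengths))  # roughly 0.054 bit
```

A Huffman code saturates the Kraft inequality (sum exactly 1); the fix-free construction discussed in the abstract instead targets the smaller sum 5/8, the threshold at which Yekhanin's theorem guarantees that a fix-free code with those lengths exists, at the price of the bounded extra redundancy quantified above.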