
Asymptotic properties of data compression and suffix trees


Author(s):
Szpankowski, W.; Dept. of Comput. Sci., Purdue Univ., West Lafayette, IN, USA

Recently, Wyner and Ziv (see ibid., vol. 35, pp. 1250-1258, 1989) have proved that the typical length of a repeated subword found within the first n positions of a stationary ergodic sequence is (1/h) log n in probability, where h is the entropy of the alphabet. This finding was used to obtain several insights into certain universal data compression schemes, most notably the Lempel-Ziv data compression algorithm. Wyner and Ziv have also conjectured that their result can be extended to a stronger almost sure convergence. In this paper, we settle this conjecture in the negative in the so-called right domain asymptotic, that is, during a dynamic phase of expanding the database. We prove, under an additional assumption involving mixing conditions, that the length of a typical repeated subword oscillates almost surely (a.s.) between (1/h1) log n and (1/h2) log n, where 0 < h2 < h ⩽ h1 < ∞. We also show that the length of the nth block in the Lempel-Ziv parsing algorithm exhibits a similar behavior. We relate our findings to some problems on digital trees, namely the asymptotic behavior of a (noncompact) suffix tree built from suffixes of a random sequence. We prove that the height and the shortest feasible path in a suffix tree are typically (1/h2) log n (a.s.) and (1/h1) log n (a.s.), respectively.
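The sketch below is not from the paper; it is a minimal Python illustration, under assumed definitions, of the two quantities the abstract discusses for a memoryless (Bernoulli) source: the repeated-subword length (here taken as the longest prefix of the sequence after position n that already occurs in the first n symbols) compared against (1/h) log n, and the block lengths produced by Lempel-Ziv incremental parsing. Helper names such as repeated_prefix_length and lz78_parse are ours, not the paper's.

```python
# Illustrative sketch (assumptions noted in the lead-in), not the paper's construction.
import math
import random

def entropy(p):
    """Per-symbol entropy in nats of a Bernoulli(p) source."""
    return -(p * math.log(p) + (1 - p) * math.log(1 - p))

def repeated_prefix_length(x, n):
    """Length of the longest prefix of x[n:] that occurs as a substring of x[:n]."""
    window = "".join(x[:n])
    suffix = "".join(x[n:])
    k = 0
    while k < len(suffix) and suffix[: k + 1] in window:
        k += 1
    return k

def lz78_parse(x):
    """Lempel-Ziv incremental parsing: each block is the shortest
    substring not previously seen as a block."""
    seen, blocks, current = set(), [], ""
    for symbol in x:
        current += symbol
        if current not in seen:
            seen.add(current)
            blocks.append(current)
            current = ""
    return blocks

if __name__ == "__main__":
    p, n = 0.3, 20000
    h = entropy(p)
    x = ["1" if random.random() < p else "0" for _ in range(2 * n)]
    print("repeated-subword length D_n :", repeated_prefix_length(x, n))
    print("(1/h) log n                 :", round(math.log(n) / h, 2))
    blocks = lz78_parse(x[:n])
    print("length of last LZ block     :", len(blocks[-1]))
```

Run over many independent sequences, both printed lengths concentrate around (1/h) log n, while the paper's point is that along a single sequence they do not settle there almost surely but oscillate between (1/h1) log n and (1/h2) log n.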

Published in:

IEEE Transactions on Information Theory (Volume: 39, Issue: 5)