Computation and Estimation of Generalized Entropy Rates for Denumerable Markov Chains

3 Author(s)
Ciuperca, G. (Lab. de Probabilité, Combinatoire et Statistique, Univ. de Lyon, Villeurbanne, France); Girardin, V.; Lhote, L.

We study entropy rates of random sequences for general entropy functionals, including the classical Shannon and Rényi entropies and the more recent Tsallis and Sharma-Mittal ones. In the first part, we obtain an explicit formula for the entropy rate for a large class of entropy functionals, as soon as the process satisfies a regularity property known in dynamical systems theory as the quasi-power property. Independent and identically distributed sequences of random variables naturally satisfy this property. Markov chains are proven to satisfy it, too, under simple explicit conditions on their transition probabilities. All the entropy rates under study are thus shown to be either infinite or zero, except at a threshold where they equal the Shannon or Rényi entropy rate up to a multiplicative constant. In the second part, we focus on the estimation of the marginal generalized entropy and entropy rate for parametric Markov chains. Estimators with good asymptotic properties are built through a plug-in procedure using a maximum likelihood estimate of the parameter.
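As a point of reference for the functionals named in the abstract, the following sketch computes the Shannon, Rényi, and Tsallis entropies of a discrete distribution and illustrates the standard fact that the latter two recover the Shannon entropy in the limit as their order tends to 1. This is an illustrative aside only; the paper's quasi-power conditions, the Sharma-Mittal family, and the entropy-rate formulas themselves are not reproduced here.

```python
import math

def shannon(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i (natural log)."""
    return -sum(x * math.log(x) for x in p if x > 0)

def renyi(p, alpha):
    """Rényi entropy of order alpha != 1:
    (1 / (1 - alpha)) * log(sum_i p_i^alpha)."""
    return math.log(sum(x ** alpha for x in p)) / (1.0 - alpha)

def tsallis(p, q):
    """Tsallis entropy of order q != 1:
    (1 - sum_i p_i^q) / (q - 1)."""
    return (1.0 - sum(x ** q for x in p)) / (q - 1.0)

p = [0.5, 0.25, 0.25]
# Both generalized entropies tend to the Shannon entropy as the order -> 1.
print(shannon(p), renyi(p, 1.000001), tsallis(p, 1.000001))
```

For an i.i.d. sequence with marginal distribution `p`, the Shannon entropy rate is simply `shannon(p)`; the paper's contribution concerns what happens to such rates for general functionals and for denumerable Markov chains.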

Published in:

IEEE Transactions on Information Theory (Volume 57, Issue 7)