Data Processing Theorems and the Second Law of Thermodynamics

Author: Neri Merhav, Department of Electrical Engineering, Technion—Israel Institute of Technology, Haifa, Israel

We draw relationships between the generalized data processing theorems of Zakai and Ziv (1973 and 1975) and the dynamical version of the second law of thermodynamics, a.k.a. the Boltzmann H-Theorem, which asserts that the Shannon entropy, H(X_t), pertaining to a finite-state Markov process {X_t}, is monotonically nondecreasing as a function of time t, provided that the steady-state distribution of this process is uniform across the state space (which is the case when the process designates an isolated system). It turns out that both the generalized data processing theorems and the Boltzmann H-Theorem can be viewed as special cases of a more general principle concerning the monotonicity (in time) of a certain generalized information measure applied to a Markov process. This gives rise to a new look at the generalized data processing theorem, which suggests exploiting certain degrees of freedom that may lead to better bounds, for a given choice of the convex function that defines the generalized mutual information. Indeed, we demonstrate an example of a joint source-channel coding setup in which this idea yields an improved lower bound on the distortion, relative to both the 1973 Ziv-Zakai lower bound and the lower bound obtained from the ordinary data processing theorem.
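The monotonicity claim in the abstract is easy to check numerically: for a finite-state Markov chain whose transition matrix is doubly stochastic (so that the uniform distribution is the steady state), the Shannon entropy of the state distribution can only grow or stay the same from one step to the next. The sketch below, with an arbitrarily chosen 3-state doubly stochastic matrix (not taken from the paper), illustrates this:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) in nats, ignoring zero-probability states."""
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

# A doubly stochastic transition matrix (rows AND columns sum to 1),
# so the uniform distribution is the steady state -- the condition
# under which the H-Theorem guarantees monotonicity.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3],
              [0.3, 0.2, 0.5]])

p = np.array([0.9, 0.05, 0.05])  # a skewed initial distribution
entropies = []
for _ in range(20):
    entropies.append(shannon_entropy(p))
    p = p @ P  # one step of the chain: p_{t+1} = p_t P

# H(X_t) is monotonically nondecreasing along the trajectory,
# approaching log(3), the entropy of the uniform steady state.
assert all(a <= b + 1e-12 for a, b in zip(entropies, entropies[1:]))
```

If the stationary distribution were not uniform (i.e., P merely stochastic rather than doubly stochastic), the entropy need not be monotone; the relative entropy to the stationary distribution is the quantity that decreases in general.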

Published in:

IEEE Transactions on Information Theory (Volume: 57, Issue: 8)