Interactive Source Coding for Function Computation in Collocated Networks

Authors: Nan Ma (Dept. of Electr. Eng. & Comput. Sci., Univ. of California, Berkeley, CA, USA); P. Ishwar; P. Gupta

Abstract:

A problem of interactive function computation in a collocated network is studied in a distributed block source coding framework. With the goal of computing samples of a desired function of the sources at the sink, the source nodes exchange messages through a sequence of error-free broadcasts. For any function of independent sources, a computable characterization of the set of all feasible message coding rates (the rate region) is derived in terms of single-letter information measures. In the limit as the number of messages tends to infinity, the infinite-message minimum sum rate, viewed as a functional of the joint source probability mass function, is characterized as the least element of a partially ordered family of functionals having certain convex-geometric properties. This characterization leads to a family of lower bounds on the infinite-message minimum sum rate and a simple criterion for testing the optimality of any achievable infinite-message sum rate. An iterative algorithm for evaluating the infinite-message minimum sum-rate functional is proposed and demonstrated through an example: computing the minimum function of three Bernoulli sources. Based on the characterizations of the rate regions, it is shown that when computing symmetric functions of binary sources, the sink inevitably learns certain additional information that is not required for computing the function. This conceptual understanding leads to new, improved bounds on the minimum sum rate. The new bounds are shown to be orderwise better than cut-set bounds as the network scales. The scaling law of the minimum sum rate is explored for different classes of symmetric functions and source parameters.
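The minimum-function example from the abstract lends itself to a toy illustration. The Python sketch below is a hypothetical per-sample protocol, not the paper's block coding scheme or its iterative algorithm: nodes broadcast their bits in turn over the error-free channel, and everyone stops as soon as a 0 is seen, since min(x1, ..., xm) = 0 is then already determined. It shows, in miniature, why interaction can save communication when computing the minimum of Bernoulli sources.

```python
import random

def interactive_min(sources):
    """Illustrative (hypothetical) protocol: nodes take turns broadcasting
    one bit each over an error-free channel; all nodes stop as soon as a
    0 is observed, because the minimum is then determined.

    Returns (computed minimum, number of bits broadcast).
    """
    bits_sent = 0
    for x in sources:
        bits_sent += 1          # one error-free broadcast
        if x == 0:              # min is already 0; no more bits needed
            return 0, bits_sent
    return 1, bits_sent

random.seed(0)
p = 0.5                         # Bernoulli parameter, chosen arbitrarily
trials = 10000
total_bits = 0
for _ in range(trials):
    srcs = [int(random.random() < p) for _ in range(3)]
    m, b = interactive_min(srcs)
    assert m == min(srcs)       # protocol always computes the minimum
    total_bits += b
print(f"average bits broadcast: {total_bits / trials:.3f}")
```

For any p < 1, early stopping makes the average number of broadcasts strictly less than the three bits a non-interactive exchange of all source bits would cost; the abstract's results quantify such savings in the block coding rate region sense.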

Published in:

IEEE Transactions on Information Theory (Volume: 58, Issue: 7)