Abstract:
A coupling of two distributions P_X and P_Y is a joint distribution P_XY with marginal distributions equal to P_X and P_Y. Given marginals P_X and P_Y and a real-valued function f of the joint distribution P_XY, what is its minimum over all couplings P_XY of P_X and P_Y? We study the asymptotics of such coupling problems with different f's and with X and Y replaced by X^n = (X_1, ..., X_n) and Y^n = (Y_1, ..., Y_n), where X_i and Y_i are i.i.d. copies of random variables X and Y with distributions P_X and P_Y, respectively. These include the maximal coupling, minimum distance coupling, maximal guessing coupling, and minimum entropy coupling problems. We characterize the limiting values of these coupling problems as n tends to infinity. We show that they typically converge at least exponentially fast to their limits. Moreover, for the problems of maximal coupling and minimum excess-distance probability coupling, we also characterize (or bound) the optimal convergence rates (exponents). Furthermore, for the maximal guessing coupling problem, we show that it is equivalent to the distribution approximation problem. Therefore, some existing results for the latter problem can be used to derive the asymptotics of the maximal guessing coupling problem. We also study the asymptotics of the maximal guessing coupling problem for two general sources and a generalization of this problem, named the maximal guessing coupling through a channel problem. We apply the preceding results to several new information-theoretic problems, including exact intrinsic randomness, exact resolvability, channel capacity with input distribution constraint, and perfect stealth and secrecy communication.
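To illustrate one of the coupling problems the abstract names, the following sketch computes the value of the maximal coupling for two finite distributions. It relies on the standard identity that the largest achievable Pr[X = Y] over all couplings of P_X and P_Y equals the sum over x of min(P_X(x), P_Y(x)); the particular distributions used are hypothetical example values, not data from the paper.

```python
def maximal_coupling_value(p, q):
    """Max over couplings P_XY of Pr[X = Y], for distributions p and q
    given as probability vectors over a common finite alphabet.
    Uses the identity: max Pr[X = Y] = sum_x min(p(x), q(x))."""
    assert len(p) == len(q), "distributions must share an alphabet"
    return sum(min(px, qx) for px, qx in zip(p, q))

# Hypothetical example: two distributions on a 3-letter alphabet.
p = [0.5, 0.3, 0.2]
q = [0.2, 0.5, 0.3]
print(maximal_coupling_value(p, q))  # approximately 0.7
```

The paper's asymptotic questions then concern how such quantities behave when p and q are replaced by the n-fold product distributions of X^n and Y^n.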
Published in: IEEE Transactions on Information Theory ( Volume: 65, Issue: 3, March 2019)