An "entropy increasing to the maximum" result analogous to the entropic central limit theorem (Barron 1986; Artstein 2004) is obtained in the discrete setting. This involves the thinning operation and a Poisson limit. Monotonic convergence in relative entropy is established for general discrete distributions, while monotonic increase of Shannon entropy is proved for the special class of ultra-log-concave distributions. Overall we extend the parallel between the information-theoretic central limit theorem and law of small numbers explored by Kontoyiannis (2005) and HarremoEumls (2007, 2008, 2009). Ingredients in the proofs include convexity, majorization, and stochastic orders.