Abstract:
Consider the entropy function region for discrete random variables Xi, i ∈ N, and partition N into N1 and N2 with 0 ≤ |N1| ≤ |N2|. An entropy function h is called (N1, N2)-symmetrical if for all A, B ⊂ N, h(A) = h(B) whenever |A ∩ Ni| = |B ∩ Ni|, i = 1, 2. We prove that for |N1| = 0 or 1, the closure of the (N1, N2)-symmetrical entropy function region is completely characterized by Shannon-type information inequalities. Applications of this work include threshold secret sharing and distributed data storage, where symmetry exists in the structure of the problem.
Let N = {1, …, n}. For a (discrete) random vector X = (Xi : i ∈ N), for each A ⊆ N, let XA = (Xi : i ∈ A) and h(A) = H(XA), where h(∅) = 0 by convention. For a fixed X, h can be regarded as a set function from 2^N to ℝ with h(∅) = 0, and hence h is called the entropy function of X.
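The definitions above can be made concrete with a small computational sketch. The following is not part of the paper: the helper names `marginal_entropy` and `entropy_function` are ours, and the distribution (three i.i.d. uniform bits) is chosen only because its entropy function h(A) = |A| makes the (N1, N2)-symmetry condition easy to verify, here with N1 = ∅ and N2 = N.

```python
from itertools import product, chain, combinations
from math import log2

def marginal_entropy(pmf, A):
    """Entropy H(X_A) of the marginal on the coordinates in A.

    pmf maps full outcomes (tuples) to probabilities.
    For A = (), the marginal is a point mass, so H = 0,
    matching the convention h(emptyset) = 0."""
    marg = {}
    for outcome, p in pmf.items():
        key = tuple(outcome[i] for i in sorted(A))
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * log2(p) for p in marg.values() if p > 0)

def entropy_function(pmf, n):
    """The entropy function h: 2^N -> R with h(A) = H(X_A)."""
    subsets = chain.from_iterable(combinations(range(n), r) for r in range(n + 1))
    return {frozenset(A): marginal_entropy(pmf, A) for A in subsets}

# Example: X1, X2, X3 i.i.d. uniform bits, so h(A) = |A|.
pmf = {x: 1 / 8 for x in product((0, 1), repeat=3)}
h = entropy_function(pmf, 3)

# (N1, N2)-symmetry check with N1 = {} and N2 = {0, 1, 2}:
# h(A) must equal h(B) whenever |A ∩ N2| = |B ∩ N2|, i.e. |A| = |B|.
for A in h:
    for B in h:
        if len(A) == len(B):
            assert abs(h[A] - h[B]) < 1e-12
```

For |N1| = 1 the check would instead compare subsets agreeing in both |A ∩ N1| and |A ∩ N2|; a fully exchangeable distribution such as the one above satisfies every such symmetry.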