Information processing in biologically motivated Boolean networks has attracted interest in recent information-theoretic research. One measure of this processing capability is the well-known mutual information. Using Fourier analysis, we show that, among Boolean functions with fixed expectation, canalizing functions maximize the mutual information between a single input variable and the outcome of the function. A similar result can be obtained for the mutual information between a set of input variables and the output. Furthermore, if the expectation of the function is not fixed, the mutual information is maximized by a function that depends only on this single variable, i.e., the dictatorship function. We prove our findings for Boolean functions with uniformly distributed as well as product-distributed input variables.
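The quantity at the heart of the abstract can be illustrated numerically. The sketch below (an illustration, not the paper's proof technique, which is Fourier-analytic) computes the mutual information I(X_i; f(X)) for a Boolean function f with uniformly distributed inputs, and compares a canalizing function, the dictatorship function, and XOR; the specific example functions are our own choices for illustration.

```python
import itertools
import math

def mutual_information(f, i, n):
    """I(X_i; f(X)) in bits for Boolean f on {0,1}^n with uniform inputs."""
    # Build the joint distribution of (X_i, f(X)) by exhaustive enumeration.
    joint = {}
    for x in itertools.product([0, 1], repeat=n):
        key = (x[i], f(x))
        joint[key] = joint.get(key, 0.0) + 1.0 / 2**n
    # Marginals of X_i and of the output f(X).
    px = {v: sum(p for (a, _), p in joint.items() if a == v) for v in (0, 1)}
    py = {v: sum(p for (_, b), p in joint.items() if b == v) for v in (0, 1)}
    # I(X;Y) = sum p(x,y) log2( p(x,y) / (p(x) p(y)) ).
    return sum(p * math.log2(p / (px[a] * py[b]))
               for (a, b), p in joint.items() if p > 0)

# f is canalizing in x0: once x0 = 1, the output is forced to 1.
canalizing = lambda x: x[0] | (x[1] & x[2])
# Dictatorship: the output equals x0, so I(X0; f) = 1 bit.
dictator = lambda x: x[0]
# XOR: each single input is independent of the output, so I(X0; f) = 0.
xor = lambda x: x[0] ^ x[1] ^ x[2]

print(mutual_information(canalizing, 0, 3))
print(mutual_information(dictator, 0, 3))
print(mutual_information(xor, 0, 3))
```

The dictatorship function attains the unconstrained maximum of one bit, the canalizing variable retains a strictly positive share of the information, and XOR shows the opposite extreme, where no single input reveals anything about the output.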