One of the earliest information-theoretic interpretations of neural and cognitive processing is recoding to remove statistical dependency while avoiding, or at least minimizing, the information loss such a transformation entails. Since then, many articles have used entropies and relative entropies (including mutual information) to describe various aspects of information in the brain. However, two widespread and well-known neurobiological observations cannot be explained by information measures alone. First, on average, binary neuronal signaling occurs an order of magnitude below the information-optimizing rate. Second, quantal synaptic failures, a natural form of randomization, are by far the greatest source of noise in neocortical neurons. Considering information measures in the context of energy efficiency explains both observations. From the viewpoint of energy consumption, neural processing is expensive: the adult human brain accounts for 20% or more of total energy use, and in young children the brain can account for nearly 50% of caloric intake. There is thus compelling motivation to hypothesize that the microscopic parameterizations of the nervous system are optimized for energy use.
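The first observation above can be illustrated numerically. The following is a minimal sketch, not taken from the source: it assumes a binary neuron whose information per symbol is the Shannon entropy H(p) of its firing probability p, and a hypothetical energy model in which each symbol costs a fixed rest amount (`cost_ratio`, an assumed parameter) plus a unit cost per spike. Maximizing bits per symbol alone gives p = 0.5, while maximizing bits per unit energy drives p well below that.

```python
import math

def entropy_bits(p):
    """Shannon entropy of a binary (spike / no-spike) signal, in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def energy_optimal_rate(cost_ratio):
    """Firing probability maximizing bits per unit energy, assuming each
    symbol costs `cost_ratio` at rest plus 1 unit per spike (grid search)."""
    ps = [i / 10000.0 for i in range(1, 10000)]
    return max(ps, key=lambda p: entropy_bits(p) / (cost_ratio + p))

# Information alone peaks at p = 0.5; with spikes costly relative to
# rest, the energy-efficient optimum is roughly an order of magnitude lower.
print(energy_optimal_rate(0.02))
```

Under this toy cost model the efficient firing probability lands near 0.05 rather than 0.5, qualitatively matching the sub-optimal signaling rates the abstract describes; the exact optimum depends entirely on the assumed rest-to-spike cost ratio.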