Given enough physical constraints, the format of optimal computation may resolve into a rather small set of options, which we call design specifications. Our interest centers on computational problems so intensive, relative to the time and energy available, that they can be solved only probabilistically. Here we consider just information and energy in one particular computational format, called neural-like (NL) and characterized as massively parallel, analog computation. Within this format, we consider only the design of a single NL element and the nature of its inputs. First, we provide a specific mathematical format for a simple NL element. We consider this format to be minimal and generic and, therefore, extendable to structures composed of several NL compartments. Second, the information and energy constraints are linked, via Shannon's entropy, to classical results from mathematical statistics, yielding design specifications that go beyond our initial description of an NL element and its inputs. Critically, for an NL element to preserve all of its relevant input information at minimal energetic cost, it must transform its inputs so as to create and communicate a minimal sufficient statistic. The assumptions associated with producing such a statistic then become new design specifications for NL computing.
Date of Conference: 1-6 July 2012
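
The abstract's central claim is that an NL element should communicate a minimal sufficient statistic of its inputs. A minimal sketch of what sufficiency means, under an assumed toy model (not taken from the paper): if an element's binary inputs are i.i.d. Bernoulli(p), the input sum is a minimal sufficient statistic for p, so any two input patterns with the same sum carry identical likelihood information and the element loses nothing by transmitting only the sum.

```python
from fractions import Fraction
from math import prod

def bernoulli_likelihood(x, p):
    """Likelihood of a binary input vector x under i.i.d. Bernoulli(p).

    The product depends on x only through sum(x), which is why the
    sum is a (minimal) sufficient statistic in this toy model.
    """
    return prod(p if xi else 1 - p for xi in x)

# Hypothetical 4-input element: two different input patterns with the
# same sum yield exactly the same likelihood for every value of p.
a = (1, 0, 1, 0)
b = (0, 1, 0, 1)  # different pattern, same sum
for p in (Fraction(1, 5), Fraction(1, 2), Fraction(9, 10)):
    assert bernoulli_likelihood(a, p) == bernoulli_likelihood(b, p)
```

Exact rational arithmetic (`Fraction`) is used so the equality check is not clouded by floating-point rounding; the design point is only that the sum preserves all parameter-relevant information while being far cheaper to communicate than the full input pattern.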