This paper presents an implementation-independent measure of the amount of information processing performed by (part of) an adaptive system, a measure that depends on the goal the overall system must achieve. This new measure gives rise to a theoretical framework under which several classical supervised and unsupervised learning algorithms fall and from which new, efficient learning algorithms can be derived. In the context of neural networks, the information-theoretic framework strives to design neurally inspired structures from which complex functionality should emerge. Yet classical measures of information have not explicitly accounted for some of the fundamental concepts in brain theory and neural computation, namely that optimal coding depends on the specific task(s) to be solved by the system, and that goal-orientedness also depends on extracting relevant information from the environment so as to affect it in the desired way. We present a new information processing measure that accounts for both the extraction of relevant information and the reduction of spurious information with respect to the task to be solved by the system. Because this measure is implementation-independent, it can be used to analyze and design a variety of adaptive systems. Specifically, we show its application to learning perceptrons, decision trees, and linear autoencoders.
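The abstract does not give the measure's definition, but the relevant-vs-spurious decomposition it describes can be loosely illustrated with standard mutual information. The sketch below is a hypothetical toy example (the variable names and the decomposition `I(T;X) = relevant + spurious` split are assumptions, not the paper's actual measure): a code that keeps only the task-relevant bit of its input scores high on relevant information and zero on spurious information, while an identity code carries an extra bit of task-irrelevant (spurious) information.

```python
from collections import Counter
from math import log2

def mutual_information(pairs):
    """I(A;B) in bits, estimated from a list of (a, b) samples."""
    n = len(pairs)
    joint = Counter(pairs)
    pa = Counter(a for a, _ in pairs)
    pb = Counter(b for _, b in pairs)
    return sum(
        (c / n) * log2((c / n) / ((pa[a] / n) * (pb[b] / n)))
        for (a, b), c in joint.items()
    )

# Toy task: each input x = (y, z) carries a task-relevant bit y
# and a task-irrelevant noise bit z, uniformly distributed.
data = [((y, z), y) for y in (0, 1) for z in (0, 1) for _ in range(25)]
x = [xi for xi, _ in data]
y = [yi for _, yi in data]

t1 = [xi[0] for xi in x]  # code keeping only the relevant bit
t2 = x                    # identity code, keeps the noise bit too

# Relevant info: I(T;Y).  Spurious info (assumed split): I(T;X) - I(T;Y).
rel1 = mutual_information(list(zip(t1, y)))
spu1 = mutual_information(list(zip(t1, x))) - rel1
rel2 = mutual_information(list(zip(t2, y)))
spu2 = mutual_information(list(zip(t2, x))) - rel2

print(rel1, spu1)  # 1.0 bit relevant, 0.0 bits spurious
print(rel2, spu2)  # 1.0 bit relevant, 1.0 bit spurious
```

Both codes preserve the full bit of task-relevant information, but only the identity code pays for an extra spurious bit, which is the kind of trade-off a task-dependent information processing measure is meant to expose.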