It is well known that the behaviour of a neural network built with classical summing neurons, as in a multilayer perceptron, depends strongly on the activation functions of its neurons. Many authors have proposed activation functions with free parameters, which should allow one to reduce the size of the network by trading connection complexity for activation-function complexity. Since many neural network implementations are based on digital hardware, where the selected activation function is realised through a lookup table (LUT), it is interesting to study neural networks whose neurons have adaptable LUT-based activation functions. In this way, after learning, the neurons present arbitrary activation functions that can also be implemented efficiently with digital technologies. In this paper a preliminary study of the adaptive LUT-based neuron (L-neuron) is presented, together with some experimental results on canonical problems.
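To make the idea concrete, the following is a minimal sketch of an adaptable LUT-based activation function: the input range is quantised into bins, each bin holds one trainable output value, and a simple per-entry update rule adapts the table during learning. The class name, bin layout, initialisation, and update rule are all illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

class LUTActivation:
    """Sketch of an adaptable lookup-table (LUT) activation function.

    The input range [x_min, x_max] is split into `size` bins; each bin
    stores one trainable output value. All names and the update rule
    are assumptions for illustration, not the paper's formulation.
    """

    def __init__(self, size=16, x_min=-4.0, x_max=4.0):
        self.size = size
        self.x_min, self.x_max = x_min, x_max
        # Initialise the table with a sigmoid-like shape so that,
        # before any adaptation, the neuron behaves conventionally.
        centers = np.linspace(x_min, x_max, size)
        self.table = 1.0 / (1.0 + np.exp(-centers))

    def _index(self, x):
        # Map an input to its bin index, clamping to the table range
        # (inputs outside [x_min, x_max] saturate at the end entries).
        t = (x - self.x_min) / (self.x_max - self.x_min)
        return int(np.clip(t * (self.size - 1), 0, self.size - 1))

    def forward(self, x):
        # The activation is simply a table read, as it would be in
        # a digital hardware implementation.
        return self.table[self._index(x)]

    def update(self, x, grad, lr=0.1):
        # Adapt only the entry that produced the output: a simple
        # gradient-style table update, assumed here for illustration.
        self.table[self._index(x)] -= lr * grad
```

After training, the table entries form an arbitrary sampled activation function that can be stored directly as a LUT in digital hardware; a smoother variant could interpolate between adjacent entries.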