Neural networks with digital LUT activation functions

Authors:

F. Piazza (Dipartimento di Elettronica e Autom., Ancona Univ., Italy); A. Uncini; M. Zenobi

Abstract:

It is well known that the behaviour of a neural network built with classical summing neurons, as in a multilayer perceptron, depends heavily on the activation functions of the neurons involved. Many authors have proposed activation functions with free parameters, which should allow the size of the network to be reduced by trading connection complexity for activation-function complexity. Since many neural network implementations are based on digital hardware, where the selected activation function is performed through a lookup table (LUT), it is interesting to study neural networks whose neurons have adaptable LUT-based activation functions. In this way, after learning, the neurons present arbitrary activation functions that can also be implemented efficiently in digital technologies. In this paper a preliminary study of the adaptive LUT-based neuron (L-neuron) is presented, together with some experimental results on canonical problems.
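
The listing below is a minimal Python/NumPy sketch of the idea described in the abstract: an activation function stored as a lookup table whose entries are trainable. All names (LUTActivation, n_entries, limit) and the plain gradient-descent update are illustrative assumptions; the paper's actual table parameterisation and learning rule may differ. Inputs are clipped to a fixed range, mapped to a fractional table index, and the output is linearly interpolated between neighbouring entries so that those entries receive gradients during training.

# Minimal sketch of an adaptable LUT-based activation (illustrative only;
# the class name, table size and update rule are assumptions, not the
# paper's actual formulation).
import numpy as np

class LUTActivation:
    """Activation function stored as a lookup table with trainable entries."""

    def __init__(self, n_entries=32, limit=4.0, rng=None):
        rng = rng or np.random.default_rng(0)
        self.limit = limit
        # Start from a tanh-like shape (an arbitrary initialisation).
        grid = np.linspace(-limit, limit, n_entries)
        self.table = np.tanh(grid) + 0.01 * rng.standard_normal(n_entries)

    def forward(self, x):
        # Clip to the table's range and map to a fractional index.
        x = np.clip(x, -self.limit, self.limit)
        pos = (x + self.limit) / (2 * self.limit) * (len(self.table) - 1)
        lo = np.floor(pos).astype(int)
        hi = np.minimum(lo + 1, len(self.table) - 1)
        frac = pos - lo
        self._cache = (lo, hi, frac)          # kept for the backward pass
        # Linear interpolation between the two neighbouring entries.
        return (1 - frac) * self.table[lo] + frac * self.table[hi]

    def backward(self, grad_out, lr=0.5):
        # Route the output gradient to the two entries used per sample
        # and take a plain gradient step (a stand-in update rule).
        lo, hi, frac = self._cache
        grad_table = np.zeros_like(self.table)
        np.add.at(grad_table, lo, (1 - frac) * grad_out)
        np.add.at(grad_table, hi, frac * grad_out)
        self.table -= lr * grad_table

if __name__ == "__main__":
    # Toy usage: adapt the table entries so the activation approximates
    # sin(x), mimicking how an L-neuron's activation shape could be learned.
    act = LUTActivation()
    xs = np.random.default_rng(1).uniform(-4.0, 4.0, size=256)
    for _ in range(500):
        y = act.forward(xs)
        err = y - np.sin(xs)
        act.backward(2.0 * err / len(xs), lr=2.0)   # gradient of the MSE
    print("final MSE:", float(np.mean((act.forward(xs) - np.sin(xs)) ** 2)))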

Published in:

Proceedings of the 1993 International Joint Conference on Neural Networks (IJCNN '93-Nagoya), Volume 2

Date of Conference:

25-29 Oct. 1993