
FPGA Realization of Activation Function for Artificial Neural Networks


Abstract:

Implementation of an artificial neural network (ANN) in hardware is needed to fully exploit its inherent parallelism. The presented work focuses on configuring a field-programmable gate array (FPGA) to realize the activation function used in an ANN. The computation of the nonlinear activation function (AF) is one of the factors that constrain the area or the computation time. The most popular AF is the log-sigmoid function, which can be realized in digital hardware in several ways: equation approximation, a lookup-table (LUT) based approach, and piecewise linear (PWL) approximation, to mention a few. A two-fold approach to optimizing the resource requirement is presented here. First, fixed-point (FXP) computation, which needs minimal hardware compared with floating-point (FLP) computation, is adopted. Second, the PWL approximation of the AF, with higher precision, is shown to consume less silicon area than the LUT-based AF. Experimental results are presented for the computation.
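To illustrate the kind of scheme the abstract describes, the sketch below combines the two ideas: fixed-point (Q8.8) arithmetic and a PWL approximation of the log-sigmoid. The breakpoints and slopes here follow the widely used PLAN-style approximation (powers-of-two slopes, so each multiply reduces to a shift in hardware); they are illustrative assumptions, not the coefficients used in the paper.

```python
import math

# Q8.8 fixed-point format: integers with 8 fractional bits.
FRAC_BITS = 8
SCALE = 1 << FRAC_BITS

def to_fxp(x):
    """Convert a float to its Q8.8 integer representation."""
    return int(round(x * SCALE))

def sigmoid_pwl_fxp(x_fxp):
    """Piecewise-linear log-sigmoid on Q8.8 inputs.

    Illustrative PLAN-style segments; slopes 1/4, 1/8, 1/32 become
    right-shifts in hardware, so no multiplier is needed.
    """
    neg = x_fxp < 0
    x = -x_fxp if neg else x_fxp          # exploit sigmoid(-x) = 1 - sigmoid(x)
    if x >= to_fxp(5.0):
        y = SCALE                         # saturate at 1.0
    elif x >= to_fxp(2.375):
        y = (x >> 5) + to_fxp(0.84375)    # slope 1/32
    elif x >= to_fxp(1.0):
        y = (x >> 3) + to_fxp(0.625)      # slope 1/8
    else:
        y = (x >> 2) + to_fxp(0.5)        # slope 1/4
    return SCALE - y if neg else y

def sigmoid_pwl(x):
    """Float wrapper, for comparing against the exact sigmoid."""
    return sigmoid_pwl_fxp(to_fxp(x)) / SCALE
```

Over the input range [-8, 8] this approximation stays within a few hundredths of the exact `1 / (1 + math.exp(-x))`, which conveys the trade-off the paper quantifies: modest precision loss in exchange for shift-and-add hardware instead of a large LUT or a floating-point unit.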
Date of Conference: 26-28 November 2008
Date Added to IEEE Xplore: 08 December 2008
Print ISBN: 978-0-7695-3382-7

Conference Location: Kaohsiung, Taiwan

