Neural network simulation is a comparatively recent development. Neural networks, with their remarkable ability to derive meaning from complicated or imprecise data, can extract patterns and detect trends that are too complex to be noticed by either humans or other computational techniques. One of their key advantages is adaptive learning: the ability to learn and perform tasks based on the data provided for training or on initial experience. The focus of this paper is the implementation of one such learning algorithm, known as Forward-Only Computation, using analog blocks such as analog multipliers, tan-sigmoid function circuits, and subtractors. The paper explains the reasons for choosing the Forward-Only Computation algorithm and the ways in which it overcomes the problems faced by the Error Back-Propagation (EBP) algorithm and by second-order algorithms such as the Levenberg-Marquardt (LM) algorithm. The multiplier used here is a double-balanced Gilbert multiplier, and the activation function is the tan-sigmoid function. A second implementation of the tan-sigmoid circuit is also incorporated, in which it serves as a multiplier with one differential input and one single-ended input (applied as a bias). The neural network has been simulated in HSPICE for a period of 2 ms, and convergence of the weights has been observed.