
Low Power Neuromorphic Analog System Based on Sub-Threshold Current Mode Circuits



Abstract:

Hardware implementation of brain-inspired algorithms such as reservoir computing, neural population coding, and deep learning (DL) networks is useful for edge computing devices. The need for hardware implementation of neural network algorithms arises from their high resource utilization, in the form of processing and power requirements, which makes them difficult to integrate with edge devices. In this paper, we propose a non-spiking, four-quadrant, current-mode neuron model with a generalized design that can be used for population coding, echo-state networks (which use a reservoir network), and DL networks. The model is implemented in the analog domain with transistors operating in the sub-threshold region for low power consumption, and is simulated using 180 nm technology. The proposed neuron model is configurable and versatile in terms of non-linearity, which enables the design of a system in which different neurons have different activation functions. The neuron model is more robust in the case of population coding and echo-state networks (ESNs), as we turn random device mismatches to our advantage. The proposed model takes current as input and produces current as output; hence, neurons can easily be cascaded to implement deep layers. The system was tested on the classic XOR gate classification problem, exercising 10 hidden neurons in a population coding architecture. Further, derived activation functions of the proposed neuron model have been used to build a dynamical system, an input-controlled oscillator, using ESNs.
Date of Conference: 26-29 May 2019
Date Added to IEEE Xplore: 01 May 2019
Print ISBN:978-1-7281-0397-6
Print ISSN: 2158-1525
Conference Location: Sapporo, Japan

I. Introduction

Neural network algorithms are useful for tasks such as audio-visual classification [1]-[3] and learning dynamic control [4]. Hardware implementation of such learning algorithms can improve performance in the fields of robotics and edge devices. Sub-threshold analog design of such systems yields better power and area characteristics than digital design, making it suitable for larger architectures and power-constrained applications such as edge devices. Neural population coding is inspired by various cortical regions [5]-[10]. By considering the response of an ensemble of neurons [11], classification and regression tasks can be performed. In echo-state networks (ESNs), a reservoir of neurons is used to process temporal data [12]. Moreover, architectures like population coding and ESNs use random, fixed weights in the initial layers, which reduces the amount of memory required to store these weights [13]-[15], making them more hardware friendly. Several deep learning architectures have also evolved over time [16], [17], and variations of these have been proposed to make them more reliable and efficient [18]-[22]. To cater to these evolving architectures, we propose a hardware neuron model that can be generalized and adapted to variations in architecture. Various neuron models exist in the literature [23]-[27], as do works that exploit random device mismatches to realize the random, fixed weights used in population coding [13], [14]. Our design is a four-quadrant current-mode neuron that can be cascaded for deep learning architectures. The activation function of the proposed neuron model approximates the 'tanh' curve and can be controlled. This flexibility helps the architecture learn better, especially in the case of population coding and ESNs, where the randomness arising from device mismatches alone may not be enough.
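The population coding scheme described above can be sketched in software: a layer of fixed, randomly weighted tanh-like neurons (standing in for the mismatch-derived random weights of the analog circuit) followed by a trained linear readout, applied to the XOR task from the abstract. This is a behavioral sketch, not the paper's circuit; the Gaussian weight distributions, the per-neuron gain `g` modeling the configurable non-linearity, and the bipolar ±1 signal encoding are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR inputs and targets in a bipolar +/-1 encoding, loosely mirroring a
# four-quadrant current-mode signal that can swing positive or negative.
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
y = np.array([-1.0, 1.0, 1.0, -1.0])

# Population of 10 hidden neurons with random, fixed input weights and
# biases -- a software stand-in for the random device mismatches the
# paper exploits (weight distributions are assumed, not from the paper).
n_hidden = 10
W_in = rng.normal(0.0, 1.0, size=(2, n_hidden))
b = rng.normal(0.0, 1.0, size=n_hidden)

# A per-neuron gain g models the configurable non-linearity: each neuron
# realizes a differently scaled tanh-shaped activation.
g = rng.uniform(0.5, 2.0, size=n_hidden)

H = np.tanh(g * (X @ W_in + b))  # hidden-population responses

# Only the linear readout is trained (here by least squares), as in
# population-coding/ESN schemes with fixed random early layers.
w_out, *_ = np.linalg.lstsq(H, y, rcond=None)

pred = np.sign(H @ w_out)
print(pred)  # classifies all four XOR patterns correctly
```

With 10 random tanh features and only 4 training patterns, the readout can fit XOR exactly; in the analog system the same role is played by a trained output stage reading the mismatched neuron population.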
