Abstract:
Benefiting from area and power efficiency, memristors enable neural-network analog-to-digital converters (ADCs) that break through the limitations of conventional ADCs. Although several memristive ADC (mADC) architectures have been proposed recently, research is still at an early stage: it is mainly simulation-based and requires numerous target labels to train the synapse weights. In this paper, we propose a pipelined Hopfield neural network mADC architecture and experimentally demonstrate that such an mADC is capable of self-adaptive weight tuning. The proposed training algorithm is an unsupervised method derived from the random weight change (RWC) algorithm, modified to reduce the complexity of the error-feedback circuit and make it more hardware-friendly. The synapse matrix can be mapped onto a 1T1R crossbar array. In simulation, the proposed 8-bit two-stage pipelined mADC achieves a 7.69 fJ/conv FOM, 7.90 ENOB, 0.1 LSB INL, and 0.1 LSB DNL. The experimental performance reaches a 1.56 pJ/conv FOM, 7.59 ENOB, 0.21 LSB INL, and 0.29 LSB DNL, limited mainly by the comparator's switching time.
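The abstract's training method builds on the random weight change (RWC) algorithm, which needs no gradients or target labels: all weights are perturbed by a small random ±step, the perturbation is reused as long as the loss keeps falling, and a fresh random direction is drawn otherwise. The sketch below illustrates this baseline RWC loop on a generic loss function; it is a simplified software illustration, not the paper's modified, circuit-level variant, and the function and parameter names (`rwc_train`, `step`, `iters`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def rwc_train(loss_fn, w, step=1e-2, iters=5000):
    """Baseline random-weight-change (RWC) training sketch.

    Perturb all weights by a random +/-step vector; keep reusing the
    same perturbation while it lowers the loss, otherwise draw a new
    random direction. Unsupervised in the sense that only a scalar
    loss signal is needed -- no gradients, no per-sample labels.
    """
    delta = step * rng.choice([-1.0, 1.0], size=w.shape)
    best = loss_fn(w)
    for _ in range(iters):
        w_new = w + delta
        l_new = loss_fn(w_new)
        if l_new < best:
            # Improvement: accept the step and reuse the same delta.
            w, best = w_new, l_new
        else:
            # No improvement: draw a new random +/-step direction.
            delta = step * rng.choice([-1.0, 1.0], size=w.shape)
    return w, best
```

Because the update rule only compares scalar loss values and flips random signs, the error-feedback hardware reduces to a comparator plus a random-signal source, which is why RWC-style rules are attractive for on-chip weight tuning.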
Published in: IEEE Journal on Emerging and Selected Topics in Circuits and Systems ( Volume: 12, Issue: 4, December 2022)
- IEEE Keywords
- Index Terms
- Neural Network
- Power Efficiency
- Training Algorithm
- Feedback Circuit
- Hopfield Neural Network
- Crossbar Array
- Tuning Capability
- Synapse Weights
- Loss Function
- Impedance
- Steady State
- Gradient Descent
- Levels Of Resistance
- Random Generation
- Resistant Varieties
- Multilayer Perceptron
- I-V Curves
- Dynamic Performance
- Gradient Descent Algorithm
- Top Electrode
- Random Perturbations
- Operational Amplifier
- Random Signal
- Hardware Environment
- Texas Instruments
- Value Of The Loss Function
- Digital Code
- Probe Station
- Properties Of Devices
- Memristive Devices
- Author Keywords