
MF-DSNN: An Energy-efficient High-performance Multiplication-free Deep Spiking Neural Network Accelerator



Abstract:

Inspired by the structure of the brain, Spiking Neural Networks (SNNs) are computing models that communicate and compute through spikes. Well-trained SNNs exhibit high sparsity in both weights and activations, distributed both spatially and temporally. Compared to conventional artificial neural networks (ANNs), this sparsity presents both opportunities and challenges for energy-efficient SNN inference. Specifically, the high sparsity can significantly reduce inference delay and energy consumption; however, the temporal dimension greatly complicates the design of spiking accelerators. In this paper, we propose a unique solution for sparse spiking neural network acceleration. First, we adopt a temporal coding scheme called FS coding, which differs from the rate coding used in traditional SNNs; because of the nature of FS coding, our design eliminates the need for multiplication. Second, we parallelize the computation required by each neuron at every time point to minimize accesses to the weight data. Third, we fuse multiple spikes into one new spike to reduce inference delay and energy consumption. Our proposed architecture delivers better performance and energy efficiency at lower cost. Our experiments show that, running MobileNet-V2 on the ImageNet dataset, MF-DSNN achieves 6× to 22× improvements in energy efficiency over state-of-the-art artificial neural network accelerators, with an accuracy degradation of less than 0.9% and less silicon area.
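
The abstract does not spell out the FS-coding arithmetic, but in few-spikes (FS) coding as introduced by Stöckl and Maass, a spike at time step t carries an implicit power-of-two scale, so a weighted sum reduces to shifts and adds rather than multiplies. The sketch below is a minimal illustration under that assumption; `fs_encode`, `mf_accumulate`, and all parameters are hypothetical names for exposition, not taken from the paper.

```python
# Minimal sketch of multiplication-free accumulation under FS coding.
# Assumption: a spike at step t carries an implicit scale of 2**-(t+1),
# as in few-spikes coding; real hardware realizes the scaling as a
# bit shift of the weight, never a multiply.

def fs_encode(x, num_steps=8):
    """Greedily encode an activation x in [0, 1) as binary spikes s_t,
    so that x is approximately sum_t s_t * 2**-(t+1)."""
    spikes = []
    residual = x
    for t in range(num_steps):
        scale = 2.0 ** -(t + 1)
        fire = residual >= scale
        spikes.append(1 if fire else 0)
        if fire:
            residual -= scale
    return spikes

def mf_accumulate(weights, spike_trains):
    """Dot product without activation multiplies: whenever an input
    spikes at step t, add its weight scaled by 2**-(t+1). In hardware
    this scaling is a wire shift, so no multiplier is needed."""
    acc = 0.0
    for w, spikes in zip(weights, spike_trains):
        for t, s in enumerate(spikes):
            if s:                            # spike present at step t
                acc += w * 2.0 ** -(t + 1)   # shift-and-add in hardware
    return acc

# Usage: the shift-add result matches a conventional multiply-accumulate
# up to FS quantization error.
weights = [0.5, -0.25, 0.75]
acts = [0.6, 0.3, 0.9]
trains = [fs_encode(a) for a in acts]
print(mf_accumulate(weights, trains))            # approximate result
print(sum(w * a for w, a in zip(weights, acts))) # exact reference
```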
Date of Conference: 11-13 June 2023
Date Added to IEEE Xplore: 07 July 2023
Conference Location: Hangzhou, China

