C-DNN: A 24.5-85.8TOPS/W Complementary-Deep-Neural-Network Processor with Heterogeneous CNN/SNN Core Architecture and Forward-Gradient-Based Sparsity Generation


Abstract:

Spiking-Neural-Networks (SNNs) have been studied for a long time and have recently been shown to achieve the same accuracy as Convolutional-Neural-Networks (CNNs). By using CNN-to-SNN conversion, SNNs become a promising candidate for ultra-low-power AI applications [1]. For example, compared to BNNs or XOR-nets, SNNs provide lower power consumption and higher accuracy [2]. This is because SNNs perform spike-based event-driven operation with high spike sparsity, unlike a CNN's frame-driven operation. Fig. 22.5.1 shows that the energy consumption of an SNN fluctuates across layers depending on each layer's spike sparsity, whereas a CNN shows comparatively little variation. SNNs also offer low-power training by generating a Forward-Gradient (FG), which is computed from the time difference between a pre-spike and a post-spike, similar to STDP in a biological neuron [3]. However, SNN accuracy is lower than that of a CNN, and SNN supervised training, such as back-propagation through time (BPTT), also shows low accuracy. Conversely, CNNs can achieve high accuracy through back-propagation (BP) training, but this requires heavy computation due to iterative BP and gradient generation (GG). CNNs and SNNs have been distinct research areas; however, they have complementary advantages, and there is a ground-breaking possibility that they can be combined complementarily to perform energy-efficient inference and training with high accuracy.
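The two SNN mechanisms the abstract relies on can be illustrated in a minimal sketch: a leaky integrate-and-fire (LIF) layer whose output spike sparsity drives event-driven energy savings, and an STDP-like forward gradient derived from the pre/post spike time difference. This is not the paper's implementation; the rate-coded input, the leak/threshold constants, and the exponential timing kernel are all illustrative assumptions.

```python
import numpy as np

def lif_layer(inputs, weights, threshold=1.0, leak=0.9, steps=16):
    """Simulate a leaky integrate-and-fire layer over `steps` timesteps.

    Returns the binary spike train and its spike sparsity (fraction of
    zeros), the quantity that makes SNN energy vary layer by layer.
    Rate coding and the constants here are illustrative assumptions.
    """
    rng = np.random.default_rng(0)
    n_in, n_out = weights.shape
    v = np.zeros(n_out)                       # membrane potentials
    spikes = np.zeros((steps, n_out))
    for t in range(steps):
        # Rate-code the input: spike with probability equal to intensity.
        in_spikes = (rng.random(n_in) < inputs).astype(float)
        v = leak * v + in_spikes @ weights    # event-driven accumulation
        fired = v >= threshold
        spikes[t] = fired
        v[fired] = 0.0                        # reset after a spike
    sparsity = 1.0 - spikes.mean()
    return spikes, sparsity

def forward_gradient(t_pre, t_post, tau=5.0):
    """STDP-like forward gradient from spike timing (illustrative kernel):
    positive (potentiating) when the pre-spike precedes the post-spike,
    negative otherwise, decaying with the time difference."""
    dt = t_post - t_pre
    return np.sign(dt) * np.exp(-abs(dt) / tau)
```

For example, `forward_gradient(2.0, 5.0)` is positive (pre before post) while `forward_gradient(5.0, 2.0)` is negative, mirroring the timing-dependent weight update the abstract attributes to FG-based training.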
Date of Conference: 19-23 February 2023
Date Added to IEEE Xplore: 23 March 2023
Conference Location: San Francisco, CA, USA
