
StereoSpike: Depth Learning With a Spiking Neural Network


Figure: Network architecture of the proposed StereoSpike model, an ultra-light SNN for the dense estimation of depth from event camera data. Single timestep spiking neurons, high...


Abstract:

Depth estimation is an important computer vision task, useful in particular for navigation in autonomous vehicles, or for object manipulation in robotics. Here, we propose to solve it using StereoSpike, an end-to-end neuromorphic approach, combining two event-based cameras and a Spiking Neural Network (SNN) with a modified U-Net-like encoder-decoder architecture. More specifically, we used the Multi Vehicle Stereo Event Camera Dataset (MVSEC). It provides a depth ground truth, which was used to train StereoSpike in a supervised manner, using surrogate gradient descent. We propose a novel readout paradigm to obtain a dense analog prediction (the depth of each pixel) from the spikes of the decoder. We demonstrate that this architecture generalizes very well, even better than its non-spiking counterparts, leading to near state-of-the-art test accuracy. To the best of our knowledge, it is the first time that such a large-scale regression problem is solved by a fully spiking neural network. Finally, we show that very low firing rates (<5%) can be obtained via regularization, with a minimal cost in accuracy. This means that StereoSpike could be efficiently implemented on neuromorphic chips, opening the door for low-power and real-time embedded systems.
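
The sketch below is a minimal, hypothetical illustration of the two ingredients named in the abstract: spiking layers trained with a surrogate gradient, and an analog readout that converts decoder spikes into a dense per-pixel prediction. It is written in plain PyTorch; the layer sizes, the 4-channel event input, and the 1x1-convolution readout are assumptions made for illustration and do not reproduce the authors' exact StereoSpike model.

# Minimal sketch (not the authors' code): a surrogate-gradient spiking layer in
# plain PyTorch, plus an illustrative "potential readout" that turns decoder
# spikes into a dense analog map. Layer sizes and the exact readout are assumptions.
import torch
import torch.nn as nn


class SpikeFn(torch.autograd.Function):
    """Heaviside spike in the forward pass, fast-sigmoid surrogate in the backward pass."""

    @staticmethod
    def forward(ctx, v, threshold=1.0, slope=10.0):
        ctx.save_for_backward(v)
        ctx.threshold, ctx.slope = threshold, slope
        return (v >= threshold).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # Surrogate derivative: d(spike)/dv ~ 1 / (1 + slope * |v - threshold|)^2
        surrogate = 1.0 / (1.0 + ctx.slope * (v - ctx.threshold).abs()) ** 2
        return grad_output * surrogate, None, None


class IFConv(nn.Module):
    """Convolution followed by a single-timestep integrate-and-fire neuron."""

    def __init__(self, c_in, c_out, stride=1):
        super().__init__()
        self.conv = nn.Conv2d(c_in, c_out, 3, stride=stride, padding=1, bias=False)

    def forward(self, x):
        v = self.conv(x)          # membrane potential after one integration step
        return SpikeFn.apply(v)   # binary spike map


class TinyStereoSNN(nn.Module):
    """Toy encoder-decoder: spiking everywhere except the analog depth readout."""

    def __init__(self):
        super().__init__()
        self.enc1 = IFConv(4, 16, stride=2)   # 4 channels = ON/OFF events x 2 cameras (assumed)
        self.enc2 = IFConv(16, 32, stride=2)
        self.up = nn.Upsample(scale_factor=2, mode="nearest")
        self.dec1 = IFConv(32, 16)
        self.dec2 = IFConv(16, 16)
        # Non-firing readout: a 1x1 convolution accumulates decoder spikes into
        # an analog potential that is taken directly as the dense depth map.
        self.readout = nn.Conv2d(16, 1, 1)

    def forward(self, events):
        s = self.enc2(self.enc1(events))
        s = self.dec1(self.up(s))
        s = self.dec2(self.up(s))
        return self.readout(s)    # analog prediction, one value per pixel


if __name__ == "__main__":
    model = TinyStereoSNN()
    frame = torch.rand(1, 4, 64, 64)       # fake event histogram
    depth_gt = torch.rand(1, 1, 64, 64)     # fake ground-truth depth
    loss = nn.functional.mse_loss(model(frame), depth_gt)
    loss.backward()                         # gradients flow through the surrogate
    print(loss.item())

In this toy version, training works because the non-differentiable spike is given a smooth surrogate derivative, while the final layer never fires and simply integrates spikes into a real-valued output, which is one plausible way to read an analog quantity out of a spiking decoder.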
Published in: IEEE Access (Volume: 10)
Page(s): 127428 - 127439
Date of Publication: 02 December 2022
Electronic ISSN: 2169-3536
