Abstract:
The connectivity structure of neural networks has significant implications for neural information processing, and much experimental effort has been devoted to clarifying the structure of neural networks in the brain, i.e., both the graph structure and the weight structure of synaptic connections. A traditional view of neural information processing holds that neurons compute in a highly parallel and distributed manner, in which the cooperation of many weak synaptic inputs is necessary to activate a single neuron. Recent experiments, however, have shown that not all synapses in cortical circuits are weak; some are extremely strong (several tens of times larger than the average weight). In fact, the weights of excitatory synapses between cortical excitatory neurons often obey a lognormal distribution with a long tail of strong synapses. Here, we review some of our recent work on computation with such sparsely distributed synaptic weights and discuss the possible implications of this synaptic principle for neural computation by spiking neurons. We demonstrate that internal noise emerges from long-tailed distributions of synaptic weights and produces a stochastic resonance effect in the reverberating synaptic pathways formed by strong synapses. We show a spike-timing-dependent plasticity rule and other mechanisms that produce such weight distributions. A possible hardware realization of lognormally connected networks is also presented.
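As an illustrative aside (not taken from the paper), the short Python sketch below samples synaptic weights from a lognormal distribution to show how a long tail of unusually strong synapses arises; the log-scale parameters mu and sigma are assumed values chosen only for demonstration.

    # Illustrative sketch: mu and sigma are assumed demonstration values,
    # not parameters reported in the paper.
    import numpy as np

    rng = np.random.default_rng(seed=0)
    mu, sigma = -0.7, 1.0                      # assumed log-scale parameters
    weights = rng.lognormal(mean=mu, sigma=sigma, size=10_000)

    mean_w = weights.mean()
    print(f"mean weight:         {mean_w:.3f}")
    print(f"maximum weight:      {weights.max():.3f} (~{weights.max() / mean_w:.0f}x mean)")
    print(f"synapses > 10x mean: {(weights > 10 * mean_w).mean():.2%}")

With these assumed parameters, only a small fraction of sampled weights exceed ten times the mean, mirroring the heavy tail of strong synapses described above.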
Published in: Proceedings of the IEEE (Volume: 102, Issue: 4, April 2014)