Following the usual approximation scheme, we extend the spike-rate perceptron to a more biologically plausible, so-called extended spike-rate perceptron with renewal-process inputs, which employs both first- and second-order statistics, i.e., the means, variances, and correlations of the synaptic input. We show that such a perceptron, even a single neuron, can perform complex nonlinear tasks such as the XOR problem, which cannot be solved by a traditional single-layer perceptron. This perceptron offers two significant advantages over the spike-rate perceptron: it approximates the synaptic input more accurately, and it incorporates variance into the error representation. Our aim is to open up the possibility of carrying out stochastic computation in neuronal networks.
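The intuition behind the XOR claim can be sketched with a toy construction (an illustrative assumption, not the paper's exact model): if each active binary input contributes a Poisson spike count, the mean of the summed input is linear in the inputs, but a readout that also sees the input *variance* can respond selectively to the case where exactly one input is active. The encoding, weights, and bump-shaped transfer function below are all hypothetical choices made for this sketch.

```python
import numpy as np

RATE = 20.0  # assumed firing rate (Hz) of an "active" Poisson input
T = 1.0      # assumed observation window (s)

def input_moments(x1, x2, w1=1.0, w2=1.0):
    """Mean and variance of the summed synaptic input for binary inputs.

    Each active input is modelled as a Poisson spike count over T, so its
    mean and variance both equal RATE * T.  For independent renewal inputs
    the moments superpose: mu = sum_i w_i mu_i, var = sum_i w_i^2 var_i.
    """
    mu = (w1 * x1 + w2 * x2) * RATE * T
    var = (w1**2 * x1 + w2**2 * x2) * RATE * T
    return mu, var

def xor_unit(x1, x2):
    """A single unit reading out the input variance.

    With equal weights the total variance is 0, RATE*T, or 2*RATE*T for
    zero, one, or two active inputs; XOR is true exactly when the variance
    equals RATE*T.  A mean-only (first-order) readout is linear in the
    inputs and therefore cannot make this separation.
    """
    _, var = input_moments(x1, x2)
    # Gaussian bump centred on the one-input-active variance, thresholded
    activation = np.exp(-((var - RATE * T) ** 2) / (2 * (0.3 * RATE * T) ** 2))
    return int(activation > 0.5)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", xor_unit(x1, x2))
```

Running the loop reproduces the XOR truth table (0, 1, 1, 0), showing how access to second-order input statistics gives a single unit the nonlinearity that a mean-rate perceptron lacks.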