Computing Gradient Vector and Jacobian Matrix in Arbitrarily Connected Neural Networks


Abstract:

This paper describes a new algorithm with neuron-by-neuron computation methods for the gradient vector and the Jacobian matrix. The algorithm can handle networks with arbitrarily connected neurons. The training speed is comparable with that of the Levenberg-Marquardt algorithm, which is currently considered by many to be the fastest algorithm for neural network training. More importantly, it is shown that the computation of the Jacobian, which is required for second-order algorithms, has a computational complexity similar to that of the gradient computation used by first-order learning methods. This new algorithm is implemented in the newly developed software, Neural Network Trainer, which has the unique capability of handling arbitrarily connected networks. Such networks, with connections across layers, can be more efficient than commonly used multilayer perceptron networks.
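As a rough illustration of the complexity claim, the sketch below is not the authors' neuron-by-neuron implementation; the cascade topology, the tanh activation, and all function names are assumptions made for the example. It evaluates a small arbitrarily connected network neuron by neuron (every neuron sees the inputs plus all earlier neurons' outputs) and fills one Jacobian row per network output with a single backward sweep, i.e., at roughly the cost of one gradient backpropagation.

```python
import numpy as np

def cascade_forward(weights, x):
    """Forward pass, neuron by neuron, in topological order.

    weights[k] holds neuron k's weights: one per signal already available
    to it (bias, network inputs, outputs of neurons 0..k-1).
    """
    signals = [1.0] + list(x)          # bias + network inputs
    nets, outs = [], []
    for w in weights:
        net = float(np.dot(w, signals))
        out = np.tanh(net)
        nets.append(net)
        outs.append(out)
        signals.append(out)            # later neurons may connect here
    return np.array(nets), np.array(outs), np.array(signals)

def jacobian_row(weights, x, out_index):
    """One backward sweep: derivatives of output neuron `out_index`
    with respect to every weight, for one training pattern.

    This is one row of the Jacobian used by second-order methods such
    as Levenberg-Marquardt, and it costs about the same as one gradient
    backpropagation, which is the abstract's complexity observation.
    """
    nets, outs, signals = cascade_forward(weights, x)
    n_in = len(x) + 1                           # bias + inputs
    delta = np.zeros(len(weights))              # d out / d net[k]
    delta[out_index] = 1.0 - outs[out_index] ** 2   # tanh derivative
    # Propagate backwards through the cross-layer connections.
    for k in range(out_index - 1, -1, -1):
        s = 0.0
        for j in range(k + 1, len(weights)):
            s += delta[j] * weights[j][n_in + k]    # weight from k to j
        delta[k] = (1.0 - outs[k] ** 2) * s
    # d out / d w[k][i] = delta[k] * (input signal i of neuron k)
    return [delta[k] * signals[: n_in + k] for k in range(len(weights))]

# Tiny example: 3 cascaded neurons, 2 inputs; the last neuron is the output.
rng = np.random.default_rng(0)
weights = [rng.standard_normal(3), rng.standard_normal(4), rng.standard_normal(5)]
row = jacobian_row(weights, x=[0.5, -1.0], out_index=2)
print([r.shape for r in row])   # one derivative per weight of each neuron
```

Stacking such rows over all patterns and outputs yields the full Jacobian, so building it scales like running the first-order gradient computation once per network output.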
Published in: IEEE Transactions on Industrial Electronics (Volume: 55, Issue: 10, October 2008)
Page(s): 3784-3790
Date of Publication: 19 August 2008

