Comparison of Search Optimization Algorithms in Two-Stage Artificial Neural Network Training for Handwritten Digits Recognition


Abstract:

Backpropagation is the most widely used method for training artificial neural networks. However, backpropagation has a tendency to become trapped in locally optimal solutions. This paper compares the ability of the Barebones Fireworks Algorithm, Particle Swarm Optimization, and Cooperative Particle Swarm Optimization to improve upon an artificial neural network trained with backpropagation. Both the learning ability of the search algorithms and the simulations are hindered by the high dimensionality of the artificial neural network. An analysis of the simulation results shows that the Barebones Fireworks Algorithm outperforms the other two algorithms.
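The two-stage approach the abstract describes can be illustrated with a minimal sketch: a population-based search (here, a basic global-best Particle Swarm Optimization in pure Python) is seeded in a small neighborhood of weights already obtained by backpropagation, then refines them further. This is an assumption-laden illustration, not the paper's implementation; the function name `pso_refine`, all hyperparameter values, and the toy loss are hypothetical, and the paper's Barebones Fireworks and Cooperative PSO variants differ in their update rules.

```python
import random

def pso_refine(loss, w0, n_particles=20, iters=200,
               inertia=0.7, c1=1.5, c2=1.5, spread=0.1, seed=0):
    """Stage-two refinement: PSO search seeded around weights w0,
    which stand in for a backpropagation-trained weight vector."""
    rng = random.Random(seed)
    dim = len(w0)
    # Initialize particles in a small neighborhood of the pretrained weights.
    pos = [[w + rng.uniform(-spread, spread) for w in w0]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]            # personal best positions
    pbest_f = [loss(p) for p in pos]       # personal best losses
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]   # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Standard velocity update: inertia + cognitive + social terms.
                vel[i][d] = (inertia * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            f = loss(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f
```

On a toy quadratic loss, seeding the swarm near a decent starting point lets it escape the neighborhood only if a better region is found, mirroring the idea of using search to improve on a backpropagation solution rather than replace it.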
Published in: 2020 SoutheastCon
Date of Conference: 28-29 March 2020
Date Added to IEEE Xplore: 13 November 2020

Conference Location: Raleigh, NC, USA

