Abstract:
A new global optimization technique and its applications, in particular to neural networks, are presented. The algorithm is compared with other global optimization algorithms such as the gradient descent method, the Monte Carlo method, genetic algorithms, and other commercial packages. Observations of its convergence accuracy, convergence speed, and ease of use show that this new optimization technique is worthy of further study. Some of the advantages of this new optimization technique are given.
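The paper does not reproduce the new algorithm here, but one of its baselines, the Monte Carlo method, is straightforward to illustrate. The sketch below is an assumption-laden illustration (function names, bounds, sample budget, and the Rastrigin test function are all choices made for this example, not taken from the paper): it draws random points from a box-shaped feasible set and keeps the deepest minimum found.

```python
# Illustrative Monte Carlo random-search baseline (not the paper's algorithm).
# All names, bounds, and the sample budget below are assumptions for this sketch.
import math
import random


def monte_carlo_minimize(objective, bounds, n_samples=10_000, seed=0):
    """Minimize `objective` over the box `bounds` by uniform random sampling."""
    rng = random.Random(seed)
    best_x, best_f = None, float("inf")
    for _ in range(n_samples):
        # Draw a random point inside the feasible set.
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        f = objective(x)
        if f < best_f:
            # Keep the deepest minimum seen so far.
            best_x, best_f = x, f
    return best_x, best_f


if __name__ == "__main__":
    # A multimodal test function (Rastrigin) standing in for a neural-network loss.
    def rastrigin(x):
        return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

    x_star, f_star = monte_carlo_minimize(rastrigin, bounds=[(-5.12, 5.12)] * 2)
    print(f"best point: {x_star}, best value: {f_star:.4f}")
```

Because each sample is independent, this kind of baseline parallelizes trivially, but its convergence accuracy degrades quickly with dimension, which is the sort of trade-off the paper's comparisons address.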
Date of Conference: 03-06 June 1996
Date Added to IEEE Xplore: 06 August 2002
Print ISBN: 0-7803-3210-5
Keywords (IEEE Keywords and Index Terms):
- Global Optimization
- Neural Network
- Computation Time
- Convergence Rate
- Parallelization
- Monte Carlo
- Increase In Computation Time
- Convergence Accuracy
- Global Optimization Algorithm
- Monte Carlo Simulation
- Members Of Population
- Random Points
- Feasible Set
- Random Element
- Gray Code
- Deep Minimum