An Improved Adagrad Gradient Descent Optimization Algorithm


Abstract:

Gradient descent optimization algorithms are central to deep learning. To obtain a more stable convergence process and to reduce overfitting over multiple epochs, we propose an improved Adagrad gradient descent optimization algorithm. Our approach is evaluated on both the Reuters dataset and the IMDB dataset against several gradient descent optimization algorithms. The results show that our approach converges more stably and reduces overfitting over multiple epochs.
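For context, the standard Adagrad update that the proposed method builds on scales each parameter's step size by the inverse square root of its accumulated squared gradients. The abstract does not describe the authors' specific modification, so the sketch below shows only the baseline Adagrad rule; the function name and the toy objective are illustrative.

```python
import numpy as np

def adagrad_step(params, grads, accum, lr=0.01, eps=1e-8):
    """One baseline Adagrad update: each parameter's effective learning
    rate shrinks as its squared-gradient history accumulates."""
    accum += grads ** 2                            # G_t = G_{t-1} + g_t^2
    params -= lr * grads / (np.sqrt(accum) + eps)  # theta -= lr * g / (sqrt(G) + eps)
    return params, accum

# Toy usage: minimize f(x) = x^2, whose gradient is 2x.
x = np.array([5.0])
G = np.zeros_like(x)
for _ in range(200):
    g = 2.0 * x
    x, G = adagrad_step(x, g, G, lr=0.5)
print(x)  # moves toward the minimum at 0
```

Because the accumulator G only grows, Adagrad's effective learning rate decays monotonically; this aggressive decay is a common motivation for variants that, like the paper's proposal, aim for a more stable convergence process.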
Date of Conference: 30 November 2018 - 02 December 2018
Date Added to IEEE Xplore: 24 January 2019
Conference Location: Xi'an, China