Abstract:
In this study, we used the AdaGrad gradient descent method as the optimizer for image deep learning and compared it with the Adam gradient descent method. After processing a large database of more than six thousand through-silicon via (TSV) images, AdaGrad showed faster convergence and lower generalization error than Adam. These results help artificial intelligence make image-judgment management more accurate and faster.
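The optimizer compared here, AdaGrad, adapts the step size per parameter by dividing the learning rate by the square root of the accumulated squared gradients. A minimal NumPy sketch of one AdaGrad update step (not the paper's actual training code; the function name, learning rate, and toy objective are illustrative assumptions):

```python
import numpy as np

def adagrad_step(theta, grad, G, lr=0.5, eps=1e-8):
    """One AdaGrad update: accumulate squared gradients, scale each
    parameter's step by 1 / sqrt(accumulated squared gradient)."""
    G = G + grad ** 2                             # per-parameter accumulator
    theta = theta - lr * grad / (np.sqrt(G) + eps)
    return theta, G

# Toy example: minimize f(x) = x^2 (gradient 2x) starting from x = 5
theta = np.array([5.0])
G = np.zeros_like(theta)
for _ in range(500):
    grad = 2 * theta
    theta, G = adagrad_step(theta, grad, G)
print(theta)
```

Adam differs by using exponential moving averages of the gradient and its square with bias correction, which keeps step sizes from shrinking monotonically the way AdaGrad's do.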
Date of Conference: 28-30 September 2020
Date Added to IEEE Xplore: 23 November 2020