Nesterov's accelerated gradient and momentum as approximations to regularised update descent

Abstract:

We present a unifying framework for adapting the update direction in gradient-based iterative optimization methods. As natural special cases we re-derive classical momentum and Nesterov's accelerated gradient method, lending a new intuitive interpretation to the latter algorithm. We show that a new algorithm, which we term Regularised Gradient Descent, can converge more quickly than either Nesterov's algorithm or the classical momentum algorithm.
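The abstract names two classical baselines, momentum and Nesterov's accelerated gradient, which the paper re-derives as special cases of its framework. As a point of reference, here is a minimal Python sketch of those two standard update rules; it does not reproduce the paper's Regularised Gradient Descent algorithm, whose update rule is not given on this page, and the function names, step size, and momentum coefficient are illustrative choices only.

```python
import numpy as np

def momentum_step(theta, v, grad_fn, lr=0.01, mu=0.9):
    """Classical (heavy-ball) momentum: gradient evaluated at the current point."""
    v = mu * v - lr * grad_fn(theta)
    return theta + v, v

def nesterov_step(theta, v, grad_fn, lr=0.01, mu=0.9):
    """Nesterov's accelerated gradient: gradient evaluated at the look-ahead point."""
    v = mu * v - lr * grad_fn(theta + mu * v)
    return theta + v, v

# Toy usage: minimise a simple ill-conditioned quadratic f(x) = 0.5 * x^T A x.
A = np.diag([1.0, 10.0])
grad = lambda x: A @ x

theta, v = np.array([5.0, 5.0]), np.zeros(2)
for _ in range(100):
    theta, v = nesterov_step(theta, v, grad)
print(theta)  # approaches the minimiser at the origin
```

The only difference between the two rules is where the gradient is evaluated: momentum uses the current iterate, while Nesterov's method uses the point reached after applying the momentum term, which is the behaviour the paper reinterprets within its regularised-update framework.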
Date of Conference: 14-19 May 2017
Date Added to IEEE Xplore: 03 July 2017
Electronic ISSN: 2161-4407
Conference Location: Anchorage, AK, USA