Abstract:
We present a unifying framework for adapting the update direction in gradient-based iterative optimization methods. As natural special cases we re-derive classical momentum and Nesterov's accelerated gradient method, lending a new intuitive interpretation to the latter algorithm. We show that a new algorithm, which we term Regularised Gradient Descent, can converge more quickly than either Nesterov's algorithm or the classical momentum algorithm.
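For reference, a minimal sketch of the standard update rules for the two classical methods the abstract names; the notation (step size \eta, momentum coefficient \mu) is ours, not the paper's, and the specific form of Regularised Gradient Descent is given only in the paper itself, so it is not reproduced here.

```latex
% Assumed standard forms of the two classical methods named in the abstract.
% Notation (\eta: step size, \mu: momentum coefficient) is illustrative, not the paper's.
\begin{align}
  v_{t+1} &= \mu v_t - \eta \nabla f(x_t), &
  x_{t+1} &= x_t + v_{t+1} && \text{(classical momentum)} \\
  v_{t+1} &= \mu v_t - \eta \nabla f(x_t + \mu v_t), &
  x_{t+1} &= x_t + v_{t+1} && \text{(Nesterov's accelerated gradient)}
\end{align}
```

The only difference between the two is where the gradient is evaluated: at the current iterate for classical momentum, and at the look-ahead point x_t + \mu v_t for Nesterov's method, which is the step the abstract's reinterpretation concerns.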
Date of Conference: 14-19 May 2017
Date Added to IEEE Xplore: 03 July 2017
Electronic ISSN: 2161-4407