Comparative study of stochastic gradient-free algorithms for system optimization

Author(s):

Chin, D.C.; Applied Physics Laboratory, Johns Hopkins University, Laurel, MD, USA

Abstract:

Stochastic approximation (SA) algorithms can be used in system optimization problems for which only noisy measurements of the loss function are available and measurements of its gradient are not. This paper studies three types of SA algorithms in a multivariate Kiefer-Wolfowitz setting, which uses only noisy measurements of the loss function (i.e., no loss-function gradient measurements). The algorithms considered are the standard finite-difference SA (FDSA) and two accelerated algorithms, the random-directions SA (RDSA) and the simultaneous-perturbation SA (SPSA). RDSA and SPSA use randomized gradient approximations based on (generally) far fewer loss-function measurements per iteration than FDSA. This paper describes the asymptotic error distribution for a class of RDSA algorithms and compares the RDSA, SPSA, and FDSA algorithms theoretically and numerically. Based on the theoretical and numerical results, SPSA is the preferred algorithm.
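To make the measurement savings concrete, the following is a minimal Python sketch of the SPSA recursion in its standard form (as popularized by Spall); it is not taken from this paper, and the gain-sequence constants and the noisy quadratic test loss are illustrative assumptions. Each iteration estimates the full p-dimensional gradient from just two noisy loss measurements, whereas a finite-difference scheme would require 2p.

```python
import numpy as np

def spsa_gradient(loss, theta, c, rng):
    """One SPSA gradient estimate from only two noisy loss measurements."""
    # Simultaneous perturbation: each component is an independent +/-1 Bernoulli draw.
    delta = rng.choice([-1.0, 1.0], size=theta.shape)
    y_plus = loss(theta + c * delta)
    y_minus = loss(theta - c * delta)
    # All p gradient components reuse the same two measurements.
    return (y_plus - y_minus) / (2.0 * c * delta)

def spsa_minimize(loss, theta0, a=0.1, c=0.1, alpha=0.602, gamma=0.101,
                  n_iter=1000, seed=0):
    """Basic SPSA iteration: theta_{k+1} = theta_k - a_k * g_hat(theta_k)."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float).copy()
    for k in range(n_iter):
        a_k = a / (k + 1) ** alpha   # decaying step-size gain
        c_k = c / (k + 1) ** gamma   # decaying perturbation size
        g_hat = spsa_gradient(loss, theta, c_k, rng)
        theta -= a_k * g_hat
    return theta

# Example: minimize a noisy quadratic loss (illustrative test problem only).
if __name__ == "__main__":
    noise_rng = np.random.default_rng(1)
    noisy_quadratic = lambda th: float(np.sum(th ** 2) + 0.01 * noise_rng.normal())
    print(spsa_minimize(noisy_quadratic, theta0=np.ones(5)))
```

An RDSA variant differs only in the perturbation distribution (e.g., uniformly random directions) and in how the two measurements are combined, while FDSA perturbs one coordinate at a time and therefore needs 2p measurements per iteration.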

Published in:

American Control Conference, 1994 (Volume: 3)

Date of Conference:

29 June-1 July 1994