Optimizations

An optimization method solves the problem of determining the minimum (maximum) of an objective (purpose) function U. In general, optimization problems ask for the global minimum. The point of global minimum is found among the points of local minimum and can be unique or multiple. This chapter first considers minimization along a direction. The Powell algorithm gives a procedure to determine n conjugate directions without using the matrix ∇²U(x). Next, the chapter discusses the methods of gradient type, which are characterized by the use of the gradient ∇U(x) of the function to be optimized, and the methods of Newton type, which use the Hessian matrix ∇²U(x). Other covered topics are linear programming, numerical methods for problems of convex programming, quadratic programming, dynamic programming, and Pontryagin's maximum principle. These are followed by numerical examples and some applications.
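A minimal sketch may help contrast the gradient-type and Newton-type iterations mentioned above. The quadratic test objective, the fixed step size, and the iteration counts below are assumptions chosen for illustration; they are not taken from the chapter.

```python
# Illustrative sketch: one gradient-type and one Newton-type iteration for
# minimizing a smooth objective U(x). Here U(x) = 0.5 x^T A x - b^T x, so the
# minimizer is x* = A^{-1} b (an assumed example, not the chapter's).
import numpy as np

A = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite
b = np.array([1.0, 1.0])

def grad_U(x):
    # Gradient of the example objective: ∇U(x) = A x - b
    return A @ x - b

def hess_U(x):
    # Hessian of the example objective: ∇²U(x) = A (constant for a quadratic)
    return A

# Gradient-type method: step against ∇U(x) with a fixed step size
x_grad = np.zeros(2)
for _ in range(100):
    x_grad = x_grad - 0.1 * grad_U(x_grad)

# Newton-type method: rescale the gradient direction by the inverse Hessian
x_newton = np.zeros(2)
for _ in range(5):
    x_newton = x_newton - np.linalg.solve(hess_U(x_newton), grad_U(x_newton))

print(x_grad, x_newton, np.linalg.solve(A, b))  # all approach the minimizer A^{-1} b
```

For this quadratic objective the Newton iteration reaches the minimizer in a single step, while the gradient iteration converges linearly at a rate set by the step size and the eigenvalues of A; the chapter's Powell algorithm addresses the case where ∇²U(x) is not used at all.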