A method of optimization solves the problem of determining the minimum (maximum) of an objective (purpose) function U. In optimization problems, the global minimum is generally of interest; it is found among the local minima and can be unique or multiple. This chapter first considers minimization along a direction. The Powell algorithm gives a procedure to determine n conjugate directions without using the Hessian matrix ∇²U(x). Next, the chapter discusses the methods of gradient type, which are characterized by the use of the gradient of the function to be optimized, ∇U(x), and the methods of Newton type, which use the Hessian matrix ∇²U(x). Other covered topics are linear programming, numerical methods for problems of convex programming, quadratic programming, dynamic programming, and Pontryagin's maximum principle. These are followed by numerical examples and some applications.
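As a small illustration of the gradient-type methods mentioned above, the sketch below minimizes a simple quadratic objective by repeatedly stepping against ∇U(x). The objective, the fixed step size, and the tolerance are all assumptions chosen for the example; the chapter's own algorithms (line minimization, Powell, Newton) are more elaborate.

```python
import numpy as np

def gradient_descent(grad_U, x0, step=0.1, tol=1e-8, max_iter=10_000):
    """Minimize U by stepping against its gradient ∇U(x).

    A minimal fixed-step sketch; practical gradient methods choose
    the step by minimization along the descent direction.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_U(x)
        if np.linalg.norm(g) < tol:   # near-stationary point: stop
            break
        x = x - step * g              # move opposite the gradient
    return x

# Assumed example objective: U(x) = (x0 - 1)^2 + 2*(x1 + 3)^2,
# whose gradient is computed by hand below.
grad_U = lambda x: np.array([2.0 * (x[0] - 1.0), 4.0 * (x[1] + 3.0)])
x_min = gradient_descent(grad_U, x0=[0.0, 0.0])
print(x_min)  # approximately [1, -3]
```

For this convex quadratic the unique local minimum is also the global one; for multimodal objectives the iteration only reaches the local minimum in whose basin the starting point x0 lies.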