Mathematical derivatives can be approximated or computed by techniques including symbolic differentiation, divided differences, and automatic differentiation. Automatic differentiation (AD) computes fast, accurate derivatives of a function, such as its Jacobian, Hessian, and higher-order tensor terms. One of its most important applications is improving optimization algorithms by supplying the required derivative information efficiently. In this paper, AD algorithms for computing the Hessian and tensor terms are given, and their computational complexity is investigated. These algorithms are then applied to Chebyshev's method, which requires the evaluation of tensor terms. The experimental results show that AD can be used efficiently in optimization methods.
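The role of AD in a Chebyshev-type iteration can be illustrated with a minimal sketch: a forward-mode AD value carrying a function's value together with its first and second derivatives, driving the scalar Chebyshev iteration for f(x) = 0. The `Dual2` class and `chebyshev_root` function below are illustrative assumptions, not the paper's algorithms, which address the multivariate Hessian and tensor case.

```python
class Dual2:
    """Forward-mode AD value tracking (f(x), f'(x), f''(x)) of an expression."""
    def __init__(self, v, d1=0.0, d2=0.0):
        self.v, self.d1, self.d2 = v, d1, d2

    def __add__(self, o):
        o = o if isinstance(o, Dual2) else Dual2(o)
        return Dual2(self.v + o.v, self.d1 + o.d1, self.d2 + o.d2)
    __radd__ = __add__

    def __sub__(self, o):
        o = o if isinstance(o, Dual2) else Dual2(o)
        return Dual2(self.v - o.v, self.d1 - o.d1, self.d2 - o.d2)

    def __mul__(self, o):
        o = o if isinstance(o, Dual2) else Dual2(o)
        # Product rule up to second order: (fg)'' = f''g + 2 f'g' + f g''
        return Dual2(self.v * o.v,
                     self.d1 * o.v + self.v * o.d1,
                     self.d2 * o.v + 2.0 * self.d1 * o.d1 + self.v * o.d2)
    __rmul__ = __mul__


def chebyshev_root(f, x0, iters=5):
    """Chebyshev's third-order iteration for f(x) = 0, with f', f'' from AD."""
    x = x0
    for _ in range(iters):
        y = f(Dual2(x, 1.0, 0.0))        # seed: dx/dx = 1, d^2x/dx^2 = 0
        fv, f1, f2 = y.v, y.d1, y.d2
        L = fv * f2 / (f1 * f1)          # curvature correction term
        x = x - (fv / f1) * (1.0 + 0.5 * L)
    return x


root = chebyshev_root(lambda x: x * x - 2.0, 1.0)
print(root)  # converges to sqrt(2)
```

Seeding the input with first derivative 1 and second derivative 0 makes a single evaluation of `f` propagate exact first- and second-order information, so the iteration never resorts to divided differences.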