Abstract:
The accuracy and efficiency of optimization algorithms remain an essential challenge in supervised machine learning. We used gradient descent and Newton's method, each combined with backtracking line search, to determine optimal parameter values in linear regression. The results from both methods are accurate and close to the closed-form solutions obtained by the least squares method. Newton's method with backtracking line search performs better, requiring fewer iterations and less execution time, which suggests it is more robust to the selection of the learning rate alpha. Optimization algorithms with backtracking line search can be applied to diverse regression problems with high precision and efficiency.
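The approach described in the abstract can be illustrated with a minimal sketch: gradient descent and Newton's method, each using an Armijo backtracking line search, fitted to synthetic linear-regression data and compared against the closed-form least squares solution. This is an assumption-laden illustration (the dataset, step-size constants, and iteration count are placeholders, not the paper's), shown here in NumPy.

```python
import numpy as np

# Synthetic linear-regression data (placeholder, not the paper's dataset):
# y = X @ w_true + small noise, with an intercept column.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=50)])
w_true = np.array([2.0, -3.0])
y = X @ w_true + 0.01 * rng.normal(size=50)

def loss(w):
    # Least-squares objective: 0.5 * ||Xw - y||^2
    r = X @ w - y
    return 0.5 * r @ r

def grad(w):
    return X.T @ (X @ w - y)

def backtracking(w, d, alpha=1.0, rho=0.5, c=1e-4):
    # Shrink alpha until the Armijo sufficient-decrease condition holds.
    while loss(w + alpha * d) > loss(w) + c * alpha * (grad(w) @ d):
        alpha *= rho
    return alpha

# Gradient descent with backtracking line search.
w_gd = np.zeros(2)
for _ in range(500):
    d = -grad(w_gd)
    w_gd = w_gd + backtracking(w_gd, d) * d

# Newton's method with backtracking line search; for this quadratic
# loss the Hessian is constant (X^T X) and one full step suffices.
H = X.T @ X
w_nt = np.zeros(2)
for _ in range(5):
    d = np.linalg.solve(H, -grad(w_nt))
    w_nt = w_nt + backtracking(w_nt, d) * d

# Closed-form least-squares solution for comparison.
w_ls, *_ = np.linalg.lstsq(X, y, rcond=None)
```

Both iterative solutions converge to the least-squares minimizer; Newton's method does so in far fewer steps because the full Newton step `alpha = 1` already satisfies the Armijo condition on a quadratic objective, matching the abstract's observation about its tolerance to the learning-rate choice.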
Date of Conference: 28-29 January 2021
Date Added to IEEE Xplore: 05 July 2021