- Title
An efficient gradient method with approximately optimal stepsizes based on regularization models for unconstrained optimization.
- Authors
Liu, Zexian; Chu, Wangli; Liu, Hongwei
- Abstract
It is widely accepted that the stepsize is of great significance to gradient methods. An efficient gradient method with approximately optimal stepsizes, based mainly on regularization models, is proposed for unconstrained optimization. More specifically, if the objective function is not close to a quadratic function on the line segment between the current and latest iterates, a regularization model is carefully exploited to generate the approximately optimal stepsize; otherwise, a quadratic approximation model is used. In addition, when the curvature is non-positive, a special regularization model is developed. The convergence of the proposed method is established under some weak conditions. Extensive numerical experiments indicate that the proposed method is very promising. Given its surprising efficiency, we believe that gradient methods with approximately optimal stepsizes can become strong candidates for large-scale unconstrained optimization.
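The switching rule described in the abstract can be sketched roughly as follows. This is an illustrative reading, not the authors' algorithm: the Barzilai-Borwein stepsize stands in for the quadratic-model stepsize, a cubic model with a fixed regularization parameter `sigma` stands in for the paper's regularization models, and the near-quadratic test with threshold `quad_tol` is an assumption.

```python
import numpy as np

def regularized_stepsize(g, h, sigma):
    """Stepsize from a cubic regularization model along -g:
    minimize m(a) = -a||g||^2 + (h/2) a^2 ||g||^2 + (sigma/6) a^3 ||g||^3
    over a > 0; returns the positive root of m'(a) = 0.
    Well defined even when the curvature estimate h is non-positive."""
    gn = np.linalg.norm(g)
    return (-h + np.sqrt(h * h + 2.0 * sigma * gn)) / (sigma * gn)

def gradient_method(f, grad, x0, sigma=1.0, tol=1e-8, max_iter=1000,
                    quad_tol=1e-4):
    """Illustrative gradient method with approximately optimal stepsizes.
    sigma and quad_tol are assumed fixed here; a practical method would
    adapt them."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1.0 / max(np.linalg.norm(g), 1.0)  # safeguarded initial stepsize
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy, ss = float(s @ y), float(s @ s)
        h = sy / ss  # curvature estimate along s
        # Near-quadratic test: for an exact quadratic with Hessian B,
        # f(x_new) = f(x) + g^T s + 0.5 s^T B s and y = B s.
        quad_pred = f(x) + g @ s + 0.5 * sy
        near_quadratic = abs(f(x_new) - quad_pred) <= quad_tol * max(abs(f(x_new)), 1.0)
        if sy > 0 and near_quadratic:
            alpha = ss / sy  # quadratic model: BB-type stepsize
        else:
            # not near-quadratic, or non-positive curvature:
            # fall back to the regularization-model stepsize
            alpha = regularized_stepsize(g_new, h, sigma)
        x, g = x_new, g_new
    return x
```

On a convex quadratic the near-quadratic test passes and the method reduces to a BB-type gradient method; the regularized branch only activates on genuinely nonquadratic or nonconvex segments.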
- Subjects
CONJUGATE gradient methods; CURVATURE; MATHEMATICAL regularization
- Publication
RAIRO: Operations Research (2804-7303), 2022, Vol 56, Issue 4, p2403
- ISSN
2804-7303
- Publication type
Article
- DOI
10.1051/ro/2022107