- Title
An inexact regularized proximal Newton method for nonconvex and nonsmooth optimization.
- Authors
Liu, Ruyu; Pan, Shaohua; Wu, Yuqia; Yang, Xiaoqi
- Abstract
This paper focuses on the minimization of the sum of a twice continuously differentiable function f and a nonsmooth convex function. An inexact regularized proximal Newton method is proposed in which the Hessian of f is approximated with a regularization term involving the ϱ-th power of the KKT residual. For ϱ = 0, we justify the global convergence of the iterate sequence for the KL objective function and its R-linear convergence rate for the KL objective function of exponent 1/2. For ϱ ∈ (0, 1), by assuming that cluster points satisfy a locally Hölderian error bound of order q on a second-order stationary point set and a local error bound of order q > 1 + ϱ on the common stationary point set, respectively, we establish the global convergence of the iterate sequence and its superlinear convergence rate with order depending on q and ϱ. A dual semismooth Newton augmented Lagrangian method is also developed to seek inexact minimizers of the subproblems. Numerical comparisons with two state-of-the-art methods on ℓ1-regularized Student's t-regressions, group penalized Student's t-regressions, and nonconvex image restoration confirm the efficiency of the proposed method.
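The scheme described in the abstract can be illustrated with a minimal numpy sketch: one step minimizing f(x) + λ‖x‖₁, where the Hessian model is regularized by a multiple of the ϱ-th power of the KKT residual, and the strongly convex subproblem is solved only inexactly (here by proximal gradient iterations rather than the paper's semismooth Newton augmented Lagrangian method). All names (`soft_threshold`, `kkt_residual`, the constants `c` and `rho`) are illustrative assumptions, not the authors' code.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1 (componentwise shrinkage)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def kkt_residual(x, grad_f, lam):
    """||x - prox_{lam||.||_1}(x - grad f(x))||, a standard stationarity measure."""
    return np.linalg.norm(x - soft_threshold(x - grad_f(x), lam))

def prox_newton_step(x, grad_f, hess_f, lam, rho=0.5, c=1.0, inner_iters=200):
    """One inexact regularized proximal Newton step for f(x) + lam*||x||_1.

    G = hess_f(x) + c * r**rho * I mimics the varrho-th-power KKT-residual
    regularization from the abstract. The subproblem
        min_d  g^T d + 0.5 d^T G d + lam * ||x + d||_1
    is solved approximately by proximal gradient iterations.
    """
    g = grad_f(x)
    r = kkt_residual(x, grad_f, lam)
    G = hess_f(x) + c * r**rho * np.eye(x.size)  # regularized Hessian model
    L = np.linalg.norm(G, 2)                     # step size from the spectral norm
    d = np.zeros_like(x)
    for _ in range(inner_iters):                 # inexact subproblem solve
        q_grad = g + G @ d                       # gradient of the quadratic model
        z = x + d - q_grad / L
        d = soft_threshold(z, lam / L) - x
    return x + d
```

On a smooth least-squares loss with an ℓ1 term, repeating this step drives the KKT residual toward zero; the power ϱ controls how fast the regularization vanishes near a stationary point.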
- Subjects
NONSMOOTH optimization; IMAGE reconstruction; CONVEX functions; POINT set theory; DIFFERENTIABLE functions; NEWTON-Raphson method; MATHEMATICAL regularization
- Publication
Computational Optimization & Applications, 2024, Vol 88, Issue 2, p603
- ISSN
0926-6003
- Publication type
Article
- DOI
10.1007/s10589-024-00560-0