- Title
A First-Order Primal-Dual Method for Nonconvex Constrained Optimization Based on the Augmented Lagrangian.
- Authors
Zhu, Daoli; Zhao, Lei; Zhang, Shuzhong
- Abstract
Nonlinearly constrained nonconvex and nonsmooth optimization models play an increasingly important role in machine learning, statistics, and data analytics. In this paper, based on the augmented Lagrangian function, we introduce a flexible first-order primal-dual method, to be called nonconvex auxiliary problem principle of augmented Lagrangian (NAPP-AL), for solving a class of nonlinearly constrained nonconvex and nonsmooth optimization problems. We demonstrate that NAPP-AL converges to a stationary solution at the rate of o(1/k), where k is the number of iterations. Moreover, under an additional error bound condition (to be called HVP-EB in the paper) with exponent θ∈(0,1), we show the global convergence of NAPP-AL. Additionally, if θ∈(0,1/2], then the convergence rate is in fact linear. Finally, we show that the well-known Kurdyka-Łojasiewicz property and the Hölderian metric subregularity imply the aforementioned HVP-EB condition. We demonstrate that under mild conditions, NAPP-AL can also be interpreted as a variant of the forward-backward operator splitting method in this context. Funding: This work was supported by the National Natural Science Foundation of China [Grant 71871140].
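For context, a minimal sketch of the standard augmented Lagrangian on which such primal-dual methods are built (this is the textbook form, not necessarily the exact formulation used in the paper): for a problem min f(x) subject to g(x) = 0,

```latex
% Standard augmented Lagrangian with multiplier \lambda and penalty \rho > 0
L_{\rho}(x, \lambda) \;=\; f(x) \;+\; \langle \lambda,\, g(x) \rangle \;+\; \frac{\rho}{2}\, \| g(x) \|^{2}.
% A generic first-order primal-dual iteration then alternates
% x^{k+1} \approx \arg\min_{x} L_{\rho}(x, \lambda^{k}),
% \lambda^{k+1} = \lambda^{k} + \rho\, g(x^{k+1}).
```

The penalty term distinguishes this from the ordinary Lagrangian and is what enables convergence guarantees in nonconvex settings.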
- Subjects
CHINA; NONSMOOTH optimization; LAGRANGIAN functions; MACHINE learning; STATISTICAL learning
- Publication
Mathematics of Operations Research, 2024, Vol. 49, Issue 1, p. 125
- ISSN
0364-765X
- Publication type
Article
- DOI
10.1287/moor.2022.1350