- Title
Nonsmooth Nonconvex Stochastic Heavy Ball.
- Authors
Le, Tam
- Abstract
Motivated by the widespread use of momentum-based algorithms in deep learning, we study a nonsmooth nonconvex stochastic heavy ball method and show its convergence. Our approach builds upon semialgebraic (definable) assumptions commonly met in practical situations and combines a nonsmooth calculus with a differential inclusion method. Additionally, we provide general conditions on the sample distribution that ensure convergence of the objective function values. Our results are general enough to justify the use of subgradient sampling in modern implementations that heuristically apply rules of differential calculus to nonsmooth functions, such as backpropagation or implicit differentiation. As with the stochastic subgradient method, our analysis highlights that subgradient sampling can make the stochastic heavy ball method converge to artificial critical points. Thanks to the semialgebraic setting, we address this concern by showing that these artifacts are almost surely avoided when initializations are randomized, leading the method to converge to Clarke critical points.
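The abstract refers to the classical heavy ball iteration x_{k+1} = x_k - γ g_k + β (x_k - x_{k-1}), where g_k is a subgradient sampled at x_k. A minimal sketch of that update on a toy nonsmooth stochastic objective is given below; the objective, step size, momentum coefficient, and sample distribution are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sampled_subgradient(x, xi):
    """Subgradient of f_xi(x) = |x[0] - xi| + x[1]**2 at x.

    np.sign returns 0 at the kink, which is a valid element of the
    Clarke subdifferential [-1, 1] there.
    """
    return np.array([np.sign(x[0] - xi), 2.0 * x[1]])

x_prev = x = np.array([2.0, -1.5])
beta, gamma = 0.9, 0.01  # illustrative momentum and step-size values

for k in range(5000):
    xi = rng.normal()               # random sample defining f_xi
    g = sampled_subgradient(x, xi)  # stochastic subgradient at x_k
    # Heavy ball update: x_{k+1} = x_k - gamma * g_k + beta * (x_k - x_{k-1})
    x, x_prev = x - gamma * g + beta * (x - x_prev), x

# With this objective, E[f_xi] is minimized at the Clarke critical
# point (0, 0); the iterates should concentrate near it.
print(x)
```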
- Subjects
Semialgebraic sets; Machine learning; Subgradient methods; Deep learning; Nonsmooth optimization; Differential calculus
- Publication
Journal of Optimization Theory and Applications, 2024, Vol. 201, Issue 2, p. 699
- ISSN
0022-3239
- Publication type
Article
- DOI
10.1007/s10957-024-02408-3