- Title
Distributed stochastic gradient tracking methods with momentum acceleration for non-convex optimization.
- Authors
Gao, Juan; Liu, Xin-Wei; Dai, Yu-Hong; Huang, Yakui; Gu, Junhua
- Abstract
We consider a distributed non-convex optimization problem of minimizing the sum of all local cost functions over a network of agents. This problem often appears in large-scale distributed machine learning, known as non-convex empirical risk minimization. In this paper, we propose two accelerated algorithms, named DSGT-HB and DSGT-NAG, which combine the distributed stochastic gradient tracking (DSGT) method with momentum acceleration techniques. Under appropriate assumptions, we prove that both algorithms converge sublinearly to a neighborhood of a first-order stationary point of the distributed non-convex optimization problem. Moreover, we derive the conditions under which DSGT-HB and DSGT-NAG achieve a network-independent linear speedup. Numerical experiments on a distributed non-convex logistic regression problem with real data sets and on a deep neural network trained on the MNIST database show the superiority of DSGT-HB and DSGT-NAG over DSGT.
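The abstract does not reproduce the update rules, but the DSGT-HB idea can be illustrated with the standard gradient-tracking recursion plus a heavy-ball momentum term. The following is a minimal sketch on a toy quadratic problem; the update form, step size `alpha`, and momentum `beta` are illustrative assumptions, not the paper's analyzed scheme or tuned values, and the gradients here are deterministic rather than stochastic.

```python
import numpy as np

def dsgt_hb(b, W, alpha=0.1, beta=0.5, iters=200):
    """Sketch of gradient tracking with heavy-ball momentum (DSGT-HB-style).

    Toy problem: agent i holds f_i(x) = 0.5 * (x - b_i)^2, so the global
    minimizer of the sum is mean(b). W is a doubly stochastic mixing matrix.
    This is an assumed illustrative form, not the paper's exact algorithm.
    """
    n = b.size
    x = np.zeros(n)          # local iterates, one scalar per agent
    x_prev = x.copy()
    g = x - b                # local gradients at the current iterates
    y = g.copy()             # gradient-tracking variables, initialized to g
    for _ in range(iters):
        # mix with neighbors, step along tracked gradient, add momentum
        x_new = W @ x - alpha * y + beta * (x - x_prev)
        g_new = x_new - b
        # gradient tracking: y estimates the average gradient across agents
        y = W @ y + g_new - g
        x_prev, x, g = x, x_new, g_new
    return x

n = 4
b = np.array([1.0, 2.0, 3.0, 6.0])
W = np.full((n, n), 1.0 / n)     # full-averaging doubly stochastic matrix
x = dsgt_hb(b, W)
# every agent's iterate approaches the global minimizer mean(b) = 3.0
```

DSGT-NAG would differ only in where the momentum extrapolation enters (the gradient is evaluated at the look-ahead point), but since the abstract gives no equations, that variant is omitted here.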
- Subjects
DISTRIBUTED algorithms; COST functions; MACHINE learning; DATABASES; LOGISTIC regression analysis
- Publication
Computational Optimization & Applications, 2023, Vol. 84, Issue 2, p. 531
- ISSN
0926-6003
- Publication type
Article
- DOI
10.1007/s10589-022-00432-5