- Title
Quasi-Newton updating for large-scale distributed learning.
- Authors
Wu, Shuyuan; Huang, Danyang; Wang, Hansheng
- Abstract
Distributed computing is critically important for modern statistical analysis. Herein, we develop a distributed quasi-Newton (DQN) framework with excellent statistical, computational, and communication efficiency. The DQN method requires no Hessian matrix inversion or communication, which considerably reduces its computation and communication complexity. Notably, related existing methods analyse only numerical convergence and require a diverging number of iterations to converge. In contrast, we investigate the statistical properties of the DQN method and theoretically demonstrate that, under mild conditions, the resulting estimator is statistically efficient after a small number of iterations. Extensive numerical analyses demonstrate its finite-sample performance.
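The abstract does not include code, but the two properties it highlights (workers communicate only gradients, and the inverse Hessian is maintained by a closed-form quasi-Newton update rather than matrix inversion) can be illustrated with a minimal sketch. The following is an illustrative BFGS-style distributed loop, not the authors' DQN algorithm; the names (`local_gradient`, `distributed_bfgs`), the least-squares loss, and the shard setup are all assumptions made for the example.

```python
import numpy as np

def local_gradient(theta, X, y):
    # Gradient of the least-squares loss on one worker's data shard:
    # grad = X^T (X theta - y) / n. Only this vector is communicated.
    return X.T @ (X @ theta - y) / len(y)

def distributed_bfgs(shards, theta0, iters=10):
    # Central quasi-Newton loop: workers send gradients only (no Hessian
    # communication), and the inverse-Hessian approximation H is refreshed
    # by the closed-form BFGS update, so no matrix is ever inverted.
    d = theta0.size
    H = np.eye(d)                       # initial inverse-Hessian approximation
    theta = theta0.copy()
    grad = np.mean([local_gradient(theta, X, y) for X, y in shards], axis=0)
    for _ in range(iters):
        theta_new = theta - H @ grad    # quasi-Newton step
        grad_new = np.mean([local_gradient(theta_new, X, y)
                            for X, y in shards], axis=0)
        s, g = theta_new - theta, grad_new - grad
        gs = g @ s
        if gs > 1e-12:                  # curvature condition; skip update otherwise
            rho = 1.0 / gs
            V = np.eye(d) - rho * np.outer(s, g)
            H = V @ H @ V.T + rho * np.outer(s, s)  # BFGS inverse update
        theta, grad = theta_new, grad_new
    return theta

# Toy usage: two simulated shards of a linear model y = X beta + noise.
rng = np.random.default_rng(0)
beta = np.array([1.0, -2.0, 0.5])
shards = []
for _ in range(2):
    X = rng.normal(size=(500, 3))
    shards.append((X, X @ beta + 0.1 * rng.normal(size=500)))
print(distributed_bfgs(shards, np.zeros(3)))  # approaches beta
```

In this sketch each round costs one gradient exchange per worker and one rank-two update at the centre, which is the generic sense in which such schemes avoid Hessian inversion and Hessian communication.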
- Subjects
MATRIX inversion; HESSIAN matrices; STATISTICS; NUMERICAL analysis; DISTRIBUTED computing
- Publication
Journal of the Royal Statistical Society: Series B (Statistical Methodology), 2023, Vol 85, Issue 4, p1326
- ISSN
1369-7412
- Publication type
Article
- DOI
10.1093/jrsssb/qkad059