- Title
Byzantine-robust distributed sparse learning for M-estimation.
- Authors
Tu, Jiyuan; Liu, Weidong; Mao, Xiaojun
- Abstract
In a distributed computing environment, there is usually a small fraction of machines that are corrupted and send arbitrary erroneous information to the master machine. This phenomenon is modeled as a Byzantine failure. Byzantine-robust distributed learning has recently become an important topic in machine learning research. In this paper, we develop a Byzantine-resilient method for the distributed sparse M-estimation problem. When the loss function is non-smooth, it is computationally costly to solve the penalized non-smooth optimization problem directly. To alleviate the computational burden, we construct a pseudo-response variable and transform the original problem into an ℓ1-penalized least-squares problem, which is much more computationally tractable. Based on this idea, we develop a communication-efficient distributed algorithm. Theoretically, we show that the proposed estimator attains a fast convergence rate after only a constant number of iterations. Furthermore, we establish a support recovery result, which, to the best of our knowledge, is the first such result in the literature on Byzantine-robust distributed learning. We demonstrate the effectiveness of our approach in simulations.
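The abstract's core idea, reducing a penalized problem to ℓ1-penalized least squares solved with robust aggregation across machines, can be illustrated with a minimal sketch. This is not the authors' algorithm: it assumes a plain squared loss, synthetic data, coordinate-wise median aggregation of local gradients at the master, and proximal gradient descent (ISTA) for the lasso step; all names and parameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def soft_threshold(z, t):
    # Proximal operator of the l1 norm (the lasso shrinkage step)
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

# Synthetic sparse regression data split across m worker machines
m, n_per, p = 10, 200, 50              # machines, samples per machine, dimension
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]       # sparse true coefficient vector
X = [rng.standard_normal((n_per, p)) for _ in range(m)]
y = [Xi @ beta_true + 0.1 * rng.standard_normal(n_per) for Xi in X]

byzantine = {0, 1}                     # a small fraction of corrupted machines

def local_gradient(i, beta):
    # Byzantine machines send arbitrary erroneous information
    if i in byzantine:
        return 100.0 * rng.standard_normal(p)
    return X[i].T @ (X[i] @ beta - y[i]) / n_per

# Master: ISTA on the l1-penalized least-squares surrogate, aggregating
# the workers' local gradients with the coordinate-wise median so that
# a minority of Byzantine machines cannot derail the update.
beta, lam, step = np.zeros(p), 0.05, 0.05
for _ in range(300):
    grads = np.stack([local_gradient(i, beta) for i in range(m)])
    g = np.median(grads, axis=0)       # robust aggregation at the master
    beta = soft_threshold(beta - step * g, step * lam)

support = np.flatnonzero(np.abs(beta) > 0.1)
print(support)
```

Despite two machines returning pure noise, the median-aggregated iteration still identifies the three-coordinate support of `beta_true`, which is the flavor of support recovery guarantee the paper establishes (under its own, different estimator and conditions).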
- Subjects
NONSMOOTH optimization; DISTRIBUTED computing; DISTRIBUTED algorithms; MACHINE learning
- Publication
Machine Learning, 2023, Vol. 112, Issue 10, p. 3773
- ISSN
0885-6125
- Publication type
Article
- DOI
10.1007/s10994-021-06001-x