- Title
Clustered Federated Learning Based on Momentum Gradient Descent for Heterogeneous Data.
- Authors
Zhao, Xiaoyi; Xie, Ping; Xing, Ling; Zhang, Gaoyuan; Ma, Huahong
- Abstract
Data heterogeneity can significantly degrade the performance of federated learning because the clients' data distributions diverge. An effective way to mitigate this issue is to partition the clients into suitable clusters. However, existing clustered federated learning methods rely solely on gradient descent, which leads to poor convergence performance. To accelerate convergence, this paper proposes clustered federated learning based on momentum gradient descent (CFL-MGD), which integrates momentum and clustering techniques. In CFL-MGD, scattered clients are partitioned into the same cluster when they share the same learning task, and each client in a cluster updates its local model parameters on its own private data via momentum gradient descent. We also present two schemes for global aggregation: gradient averaging and model averaging. To characterize the proposed algorithm theoretically, we prove that CFL-MGD converges at an exponential rate for smooth and strongly convex loss functions. Finally, we validate the effectiveness of CFL-MGD on the CIFAR-10 and MNIST datasets.
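To make the building blocks named in the abstract concrete, the sketch below pairs a few momentum gradient descent steps on each client's private loss with model averaging inside one cluster. It is a minimal sketch under stated assumptions, not the authors' implementation: the function names, hyperparameters, toy quadratic losses, and per-round velocity reset are all illustrative choices.

```python
import numpy as np

def local_momentum_update(w, velocity, grad_fn, lr=0.1, beta=0.9, steps=5):
    """A few momentum gradient descent steps on one client's local loss."""
    for _ in range(steps):
        g = grad_fn(w)                  # gradient of the local loss at w
        velocity = beta * velocity + g  # heavy-ball momentum accumulation
        w = w - lr * velocity           # parameter update
    return w, velocity

def cluster_model_average(client_models):
    """Model averaging: element-wise mean of the local models in a cluster."""
    return np.mean(client_models, axis=0)

# Toy cluster: three clients minimize 0.5 * ||w - t_i||^2 (gradient w - t_i),
# a smooth, strongly convex stand-in for the local losses in the paper.
targets = [np.array([1.0, 2.0]), np.array([1.2, 1.8]), np.array([0.9, 2.1])]
models = [np.zeros(2) for _ in targets]
velocities = [np.zeros(2) for _ in targets]

for _ in range(20):  # communication rounds
    for i, t in enumerate(targets):
        models[i], velocities[i] = local_momentum_update(
            models[i], velocities[i], lambda w, t=t: w - t)
    w_cluster = cluster_model_average(models)     # aggregate within cluster
    models = [w_cluster.copy() for _ in targets]  # broadcast the average back
    velocities = [np.zeros(2) for _ in targets]   # assumed per-round reset

print(w_cluster)  # settles near the mean of the clients' local optima
```

Gradient averaging, the other aggregation scheme named in the abstract, would presumably average the clients' gradients instead of their models before a single momentum update is applied.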
- Subjects
STATISTICAL smoothing; CONVEX functions; DATA distribution
- Publication
Electronics (2079-9292), 2023, Vol 12, Issue 9, p1972
- ISSN
2079-9292
- Publication type
Article
- DOI
10.3390/electronics12091972