- Title
Aggregation for Privately Trained Different Types of Local Models.
- Authors
Chunling Han; Rui Xue
- Abstract
Machine learning has been a thriving topic in recent years, with many practical applications and active research directions. Within machine learning, model aggregation is an important area: the idea is to aggregate a global model from trained local models. However, traditional aggregation methods based on parameter averaging cannot aggregate models of different types and structures, because averaging fails across parameters of different kinds. To address this problem, we propose a new aggregation method suited to different types of local models. To achieve our goal, we transfer knowledge from the local models to the global model. First, we propose differentially private GANs that let local parties generate synthetic data related to their training data. Second, we use the majority of prediction votes from the local models to label those synthetic samples. Finally, we use the labelled synthetic data to train the global model. By combining synthetic data with labels from the local models, knowledge is transferred from the local models to the global model. We evaluate our scheme on the Adult, MNIST, and Fashion-MNIST datasets under different settings; experimental results show that our scheme achieves an accurate global model with low privacy loss. Moreover, its easily implemented building blocks make our scheme efficient and practical for applications.
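The majority-vote labelling step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes each local model is a callable that returns a list of predicted class labels for a batch of synthetic samples, and the function name `majority_vote_labels` is hypothetical.

```python
from collections import Counter

def majority_vote_labels(local_models, synthetic_samples):
    """Label synthetic samples with the majority vote of local model predictions.

    local_models: callables, each mapping a batch of samples to a list of labels.
    synthetic_samples: the batch of GAN-generated samples to be labelled.
    """
    # Collect every local model's predicted label for each synthetic sample.
    all_votes = [model(synthetic_samples) for model in local_models]
    # For each sample (one column across models), keep the most common label.
    return [Counter(sample_votes).most_common(1)[0][0]
            for sample_votes in zip(*all_votes)]
```

The labelled pairs `(synthetic_samples, majority_vote_labels(...))` would then form the training set for the global model, which is how knowledge flows from the heterogeneous local models into a single global one without averaging any parameters.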
- Subjects
MACHINE learning; GENERATIVE adversarial networks; KNOWLEDGE transfer; RANDOM noise theory; DATA security
- Publication
EAI Endorsed Transactions on Security & Safety, 2020, Vol 7, Issue 26, p1
- ISSN
2032-9393
- Publication type
Article
- DOI
10.4108/eai.21-6-2021.170237