A Novel Federated Learning with Bidirectional Adaptive Differential Privacy.

ICPCSEE (1)(2023)

Abstract
With the explosive growth of personal data in the era of big data, federated learning has broad application prospects. To solve the data-island problem and preserve user data privacy, a federated learning model based on differential privacy (DP) has been proposed: participants train on local data and share noise-perturbed parameters with a central server for aggregation. However, this model has two problems. On the one hand, data information can still be compromised when the central server broadcasts parameters, leaving a risk of user privacy leakage; on the other hand, adding too much noise to the parameters degrades the quality of aggregation and ultimately reduces the accuracy of the federated model. Therefore, a novel federated learning approach with bidirectional adaptive differential privacy (FedBADP) is proposed. It adaptively adds noise to the gradients transmitted by both the participants and the central server, protecting data security without sacrificing model accuracy. In addition, considering the hardware limitations of participants' devices, the model samples their gradients to reduce communication overhead, and it uses RMSprop on both the participants and the central server to accelerate convergence and improve model accuracy. Experiments show that the proposed model not only achieves better accuracy but also strengthens user privacy preservation while reducing communication overhead.
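The abstract describes three mechanisms: noise added to gradients in both the uplink (participants to server) and downlink (server to participants), gradient sampling to cut communication cost, and RMSprop to speed up convergence. The sketch below illustrates one such round on a toy objective using NumPy only; the adaptive noise schedule, clipping bound, sampling ratio, and placement of the RMSprop step are assumptions for illustration, not the paper's exact formulas.

```python
# Minimal sketch of a FedBADP-style communication round (NumPy only).
# The noise schedule, clipping bound, keep_ratio, and RMSprop placement
# are illustrative assumptions, not the paper's exact design.
import numpy as np

rng = np.random.default_rng(0)

def clip_and_noise(grad, clip_norm, noise_scale):
    """Clip a gradient to clip_norm, then add Gaussian noise (Gaussian mechanism)."""
    norm = np.linalg.norm(grad)
    grad = grad * min(1.0, clip_norm / (norm + 1e-12))
    return grad + rng.normal(0.0, noise_scale * clip_norm, size=grad.shape)

def sample_coordinates(grad, keep_ratio):
    """Keep a random fraction of coordinates to reduce communication overhead."""
    mask = rng.random(grad.shape) < keep_ratio
    return grad * mask / keep_ratio  # rescale to keep the estimate unbiased

def rmsprop_step(param, grad, state, lr=0.05, beta=0.9, eps=1e-8):
    """One RMSprop update; `state` holds the running mean of squared gradients."""
    state[:] = beta * state + (1 - beta) * grad ** 2
    return param - lr * grad / (np.sqrt(state) + eps)

# One toy setup: 5 clients, each with a quadratic loss 0.5 * ||w - t_i||^2.
dim, n_clients = 10, 5
global_w = np.zeros(dim)
server_state = np.zeros(dim)
targets = [rng.normal(size=dim) for _ in range(n_clients)]

for rnd in range(50):
    # Assumed adaptive schedule: noise shrinks as training progresses.
    noise_scale = 1.0 / (1.0 + rnd)

    # Clients: local gradient -> clip + noise (uplink DP) -> coordinate sampling.
    uploads = []
    for t in targets:
        grad = global_w - t
        grad = clip_and_noise(grad, clip_norm=1.0, noise_scale=noise_scale)
        uploads.append(sample_coordinates(grad, keep_ratio=0.5))

    # Server: aggregate, add downlink noise, then apply RMSprop to the global model.
    agg = np.mean(uploads, axis=0)
    agg = clip_and_noise(agg, clip_norm=1.0, noise_scale=noise_scale)
    global_w = rmsprop_step(global_w, agg, server_state)

print("distance to mean optimum:", np.linalg.norm(global_w - np.mean(targets, axis=0)))
```

In this sketch the "bidirectional" aspect appears as two calls to the noise mechanism per round, once on each client's upload and once on the server's aggregate before it is used to update (and later broadcast) the global model; how the noise scale actually adapts in FedBADP is specified in the paper, not here.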
Keywords
novel federated learning, privacy