Privacy-preserving federated learning based on partial low-quality data

Journal of Cloud Computing (2024)

Abstract
Traditional machine learning requires collecting data from participants for training, which may expose participants' private data to malicious acquisition. Federated learning protects participants' data privacy by moving the training process from a centralized server to terminal devices. However, the server may still obtain participants' private information through inference attacks and other methods. In addition, the data provided by participants varies in quality, and excessive involvement of low-quality data in training can render the model unusable, which is an important issue in current mainstream federated learning. To address these issues, this paper proposes a Privacy-Preserving Federated Learning scheme with Partial Low-Quality Data (PPFL-LQDP). It achieves good training results while allowing participants to contribute partially low-quality data, thereby enhancing the privacy and security of the federated learning scheme. Specifically, we use a distributed Paillier cryptographic mechanism to protect the privacy and security of participants' data during federated training. Additionally, we construct composite evaluation values for the data held by participants to reduce the involvement of low-quality data and thereby minimize its negative impact on the model. Experiments on the MNIST dataset demonstrate that the scheme can complete federated model training with the participation of partial low-quality data while effectively protecting the security and privacy of participants' data. Comparisons with related schemes also show that our scheme has good overall performance.
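To illustrate the kind of additively homomorphic aggregation that Paillier encryption enables in federated learning, here is a minimal Python sketch using the open-source `phe` (python-paillier) library. It is not the paper's distributed Paillier mechanism: a single key pair stands in for the threshold/secret-shared key of the scheme, and the client update values are made up for demonstration.

```python
# Sketch of Paillier-based secure aggregation of client updates.
# NOTE: uses a single key pair; the paper's scheme distributes the private
# key so that no single party can decrypt individual updates.
from phe import paillier

# Key generation (in the distributed setting the private key would be
# shared among parties rather than held by one entity).
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# Hypothetical local model updates from three participants.
client_updates = [0.12, -0.05, 0.31]

# Each participant encrypts its update before sending it to the server.
encrypted_updates = [public_key.encrypt(u) for u in client_updates]

# The server aggregates ciphertexts homomorphically: Paillier is additively
# homomorphic, so the sum of ciphertexts decrypts to the sum of plaintexts.
encrypted_sum = encrypted_updates[0]
for c in encrypted_updates[1:]:
    encrypted_sum = encrypted_sum + c

# Decryption of the aggregate only (individual updates stay hidden from the
# server; in the distributed scheme this step requires multiple key holders).
aggregate = private_key.decrypt(encrypted_sum)
print(aggregate / len(client_updates))  # averaged update
```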
Keywords
Federated learning, Privacy protection, Low-quality data, Distributed homomorphic encryption