TrustDFL: A Blockchain-Based Verifiable and Trusty Decentralized Federated Learning Framework

ELECTRONICS (2024)

Abstract
Federated learning is a privacy-preserving machine learning framework in which multiple data owners collaborate to train a global model under the orchestration of a central server. Trainers submit their local training results to the central server for model aggregation and updating. A busy central server and malicious trainers can introduce, respectively, a single point of failure and model poisoning attacks. To address these issues, this paper proposes a trusty decentralized federated learning framework, called TrustDFL, based on a zero-knowledge proof scheme, blockchain, and smart contracts, which provides enhanced security and higher efficiency for model aggregation. Specifically, Groth16 is applied to generate proofs for local model training, covering both the forward and backward propagation processes. The proofs are attached as payloads to transactions, which are broadcast into the blockchain network and executed by the miners. With the support of smart contracts, the contributions of trainers can be verified automatically under an economic incentive, where the blockchain records all exchanged data as the trust anchor in multi-party scenarios. In addition, IPFS (InterPlanetary File System) is introduced to alleviate the storage and communication overhead incurred by the local and global models. Theoretical analysis and estimation results show that TrustDFL efficiently avoids model poisoning attacks without leaking local secrets, while ensuring the accuracy of the trained global model.
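For illustration, below is a minimal Python sketch of what one trainer round could look like in such a design: train locally, prove the computation with a Groth16-style zk-SNARK, store the model in IPFS, and submit the proof plus the model's content identifier on-chain. All names (trainer_round, groth16_prove, the ipfs and chain clients, etc.) are hypothetical placeholders introduced for this sketch, not APIs from the paper or any specific library.

    # Illustrative sketch only; assumes IPFS and blockchain client objects are
    # supplied by the caller. Placeholder functions raise NotImplementedError.
    import hashlib
    import pickle
    from typing import Any, Dict, Tuple

    def model_hash(model: Any) -> str:
        # Content hash of the serialized model, used as a public input to the proof.
        return hashlib.sha256(pickle.dumps(model)).hexdigest()

    def train_locally(global_model: Any, local_data: Any) -> Tuple[Any, Any]:
        # Placeholder for forward/backward propagation on the trainer's private data.
        # Returns the updated local model and an execution trace (the ZKP witness).
        raise NotImplementedError

    def groth16_prove(proving_key: bytes, public_inputs: Dict[str, Any], witness: Any) -> bytes:
        # Placeholder for a Groth16 prover over a circuit encoding one round of
        # local training (forward and backward propagation).
        raise NotImplementedError

    def trainer_round(global_model_cid: str, local_data: Any,
                      ipfs: Any, chain: Any, proving_key: bytes) -> None:
        # 1. Fetch the current global model from IPFS by its content identifier (CID).
        global_model = ipfs.get(global_model_cid)

        # 2. Train locally; keep the execution trace as the private witness.
        local_model, trace = train_locally(global_model, local_data)

        # 3. Prove correct training without revealing the local dataset.
        proof = groth16_prove(
            proving_key,
            public_inputs={"global_model_cid": global_model_cid,
                           "local_model_hash": model_hash(local_model)},
            witness=trace,
        )

        # 4. Store the large local model off-chain; only its CID goes on-chain.
        local_model_cid = ipfs.add(local_model)

        # 5. Broadcast a transaction carrying the CID and the proof; a smart
        #    contract verifies the proof and credits the trainer's contribution.
        chain.submit_transaction({"local_model_cid": local_model_cid, "proof": proof})

This keeps large model weights off-chain (only content identifiers and proofs are recorded on the ledger), which is the role the abstract assigns to IPFS for reducing storage and communication overhead.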
Keywords
decentralized federated learning, blockchain, verifiability, zero-knowledge proof (ZKP), zk-SNARK