Training Encrypted Models With Privacy-Preserved Data On Blockchain

ICVISP 2019: PROCEEDINGS OF THE 3RD INTERNATIONAL CONFERENCE ON VISION, IMAGE AND SIGNAL PROCESSING (2019)

Abstract
Currently, training neural networks often requires a large corpus of data from multiple parties. However, in many cases data owners are reluctant to share their sensitive data with third parties for modelling. Therefore, Federated Learning (FL) has arisen as an alternative that enables collaborative training of models without sharing raw data, by distributing modelling tasks to multiple data owners. Based on FL, we present a novel and decentralized approach to training encrypted models with privacy-preserved data on Blockchain. In our approach, Blockchain is adopted as the machine learning environment where different actors (i.e., the model provider, the data provider) collaborate on the training task. During the training process, an encryption algorithm is used to protect the privacy of the data and the trained model. Our experiments demonstrate that our approach is practical in real-world applications.
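To make the workflow concrete, below is a minimal sketch of one federated round in which each data owner trains locally and submits only a masked (pairwise-cancelling) model update, so the coordinator sees just the aggregate. This is an illustrative assumption, not the paper's implementation: the abstract does not specify the encryption algorithm, and all names, the additive-masking scheme, and the linear model are placeholders.

```python
# Minimal sketch (assumed, not the paper's scheme): federated averaging with
# additively masked client updates so the aggregator never sees individual
# contributions in the clear.
import numpy as np

rng = np.random.default_rng(0)
DIM = 4  # model size


def local_update(weights, X, y, lr=0.5, epochs=50):
    """Gradient-descent update on a client's private data; returns only the delta."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w - weights


def pairwise_masks(n_clients, dim, seed=42):
    """Random masks that sum to zero across clients (hypothetical masking scheme)."""
    r = np.random.default_rng(seed)
    masks = [r.normal(size=dim) for _ in range(n_clients - 1)]
    masks.append(-np.sum(masks, axis=0))  # last mask cancels the others
    return masks


# Synthetic private datasets held by three data owners.
true_w = rng.normal(size=DIM)
clients = []
for _ in range(3):
    X = rng.normal(size=(50, DIM))
    y = X @ true_w + 0.01 * rng.normal(size=50)
    clients.append((X, y))

global_w = np.zeros(DIM)
masks = pairwise_masks(len(clients), DIM)

# Each client submits only its masked ("encrypted") update.
masked_updates = [
    local_update(global_w, X, y) + m for (X, y), m in zip(clients, masks)
]

# The coordinator (e.g., on-chain logic or an off-chain aggregator) averages
# the masked updates; the masks cancel, revealing only the aggregate update.
global_w += np.mean(masked_updates, axis=0)
print("aggregated weights:", np.round(global_w, 2))
print("true weights:      ", np.round(true_w, 2))
```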
Keywords
Blockchain, Machine Learning, Distributed Learning, Privacy