Privacy preserving distributed training of neural networks

Neural Comput. Appl. (2020)

Citations: 10 | Views: 1
Abstract
Learnae is a system aiming to achieve a fully distributed way of neural network training. It follows a "Vires in Numeris" approach, combining the resources of commodity personal computers. It has a full peer-to-peer model of operation; all participating nodes share the exact same privileges and obligations. Another significant feature of Learnae is its high degree of fault tolerance. All training data and metadata are propagated through the network using resilient gossip protocols. This robust approach is essential in environments with unreliable connections and a frequently changing set of nodes. It is based on a versatile working scheme and supports different roles, depending on the processing power and training data availability of each peer. In this way, it allows an expanded application scope, ranging from powerful workstations to online sensors. To maintain a decentralized architecture, all underlying technology should be fully distributed too. Learnae's coordinating algorithm is platform agnostic, but for the purpose of this research two novel projects have been used: (1) IPFS, a decentralized filesystem, as a means to distribute data in a permissionless environment and (2) IOTA, a decentralized network targeting the world of low-energy "Internet of Things" devices. In our previous work, a first approach was attempted on the feasibility of using distributed ledger technology to collaboratively train a neural network. Now, our research is extended by applying Learnae to a fully deployed computer network and drawing the first experimental results. This article focuses on use cases that require data privacy; thus, only model weights are exchanged, never training data.
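The abstract's privacy mechanism rests on exchanging model weights rather than raw data, with peers merging what they receive by weight averaging. A minimal illustrative sketch of that merging step is below; the function name and flat-list weight representation are assumptions for illustration, not the paper's implementation.

```python
def average_weights(peer_weights):
    """Element-wise average of weight vectors received from peers.

    peer_weights: list of per-peer weight vectors (flat lists of floats).
    Each peer shares only these numbers, never its training examples.
    This is a hypothetical sketch of the weight-averaging idea, not
    Learnae's actual coordinating algorithm.
    """
    n = len(peer_weights)
    # zip(*...) groups the i-th weight from every peer together
    return [sum(vals) / n for vals in zip(*peer_weights)]


# Two peers with two-parameter models
merged = average_weights([[1.0, 4.0], [3.0, 8.0]])
# merged == [2.0, 6.0]
```

In a gossip setting, each node would repeatedly apply such an average to the weight vectors it receives from neighbors, so the models converge without any training data leaving its origin.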
Keywords
Decentralized neural network training, Data privacy, Weight averaging, Distributed ledger technology, IPFS, IOTA