Embedding Communication for Federated Graph Neural Networks with Privacy Guarantees

2023 IEEE 43rd International Conference on Distributed Computing Systems (ICDCS), 2023

Abstract
Graph Neural Networks (GNNs) have been widely used in many Machine Learning (ML) tasks, as they show remarkable performance in modeling graph-structured data. While several distributed GNN frameworks have been proposed to tackle the training of huge graphs, uploading local graph data to a central server for model training is impractical in real-world scenarios due to privacy concerns. Federated Learning (FL) is introduced as an effective technology to address the privacy issue, allowing edge clients to collaboratively train ML models locally. However, GNNs follow a recursive neighborhood aggregation scheme: computing the representation vector, also known as the embedding, of one node requires aggregating the feature vectors of its neighbors. Training the model on local subgraphs alone therefore suffers from information loss and results in accuracy degradation. This paper presents EmbC-FGNN, an efficient Federated Graph Neural Network framework that enables Node Embedding Communication among training clients in a privacy-preserving way. EmbC-FGNN introduces an Embedding Server (ES) to maintain and synchronize shared embeddings among edge workers. It allows training devices to expand their local subgraphs with exchanged embeddings to improve model accuracy without revealing local node features or graph topology. To minimize the communication costs of the ES, we introduce a periodic embedding synchronization strategy that reduces communication frequency. Furthermore, we apply asynchronous training to accelerate convergence. Experimental results on several graph neural networks and datasets demonstrate that EmbC-FGNN improves overall accuracy (by more than 10% on the Reddit dataset) and achieves good round-to-accuracy performance.
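The embedding-exchange idea described in the abstract can be sketched as follows. This is a minimal, hypothetical illustration only: the class names (`EmbeddingServer`, `Client`), the averaging merge rule, and the toy local update are assumptions for the sketch, not the paper's actual protocol. What it shows is the core structure: clients share only embeddings of boundary nodes (never raw features or topology), and synchronization with the server happens periodically rather than every round.

```python
import numpy as np

rng = np.random.default_rng(0)

class EmbeddingServer:
    """Maintains and synchronizes shared boundary-node embeddings (assumed design)."""
    def __init__(self):
        self.store = {}  # node_id -> embedding vector

    def push(self, updates):
        # Merge a client's fresh embeddings into the shared store.
        # Simple running average here; the paper's merge rule may differ.
        for node, emb in updates.items():
            if node in self.store:
                self.store[node] = 0.5 * (self.store[node] + emb)
            else:
                self.store[node] = emb

    def pull(self, node_ids):
        # Return embeddings for the remote neighbors a client requests.
        return {n: self.store[n] for n in node_ids if n in self.store}

class Client:
    """Holds a local subgraph; exchanges only embeddings, never raw features."""
    def __init__(self, boundary_nodes, remote_neighbors, dim=4):
        self.boundary = {n: rng.normal(size=dim) for n in boundary_nodes}
        self.remote_ids = remote_neighbors  # neighbors owned by other clients
        self.remote_embs = {}               # filled in at sync time

    def local_step(self):
        # Stand-in for one local GNN training step updating embeddings.
        for n in self.boundary:
            self.boundary[n] = self.boundary[n] + 0.01

def run(clients, server, rounds=6, sync_period=3):
    """Train locally every round; sync embeddings only every sync_period rounds."""
    sync_events = 0
    for r in range(1, rounds + 1):
        for c in clients:
            c.local_step()
        if r % sync_period == 0:  # periodic, not per-round, synchronization
            sync_events += 1
            for c in clients:
                server.push(c.boundary)
            for c in clients:
                c.remote_embs = server.pull(c.remote_ids)
    return sync_events

server = EmbeddingServer()
a = Client(boundary_nodes=[1, 2], remote_neighbors=[3])
b = Client(boundary_nodes=[3], remote_neighbors=[1])
n_syncs = run([a, b], server, rounds=6, sync_period=3)
print(n_syncs)                 # 2 sync rounds instead of 6
print(sorted(a.remote_embs))   # [3]: client a obtained node 3's embedding
```

With `sync_period=3`, six training rounds trigger only two synchronization rounds, which is the communication saving the periodic strategy targets; each client still ends up with embeddings for its remote neighbors to expand its local subgraph.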
Keywords
Federated learning, Federated Graph Neural Networks, Embedding Server, Asynchronous Training