Graph Decoupling Attention Markov Networks for Semisupervised Graph Node Classification

IEEE Transactions on Neural Networks and Learning Systems (2023)

Cited 14 | Views 72
Abstract
Graph neural networks (GNNs) have become ubiquitous in graph node classification tasks. Most GNN methods update a node's embedding iteratively by aggregating its neighbors' information. However, they often suffer from negative disturbance caused by edges connecting nodes with different labels. One approach to alleviating this negative disturbance is to use attention to learn the aggregation weights, but current attention-based GNNs only consider feature similarity and lack supervision. In this article, we consider the label dependency of graph nodes and propose a decoupling attention mechanism to learn both hard and soft attention. The hard attention is learned on labels to produce a refined graph structure with fewer interclass edges, so that the aggregation's negative disturbance can be reduced. The soft attention learns the aggregation weights from features over the refined graph structure to enhance information gains during message passing. In particular, we formulate our model under the expectation–maximization (EM) framework, where the learned attention guides label propagation in the M-step and feature propagation in the E-step, respectively. Extensive experiments are performed on six well-known benchmark graph datasets to verify the effectiveness of the proposed method.
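The abstract's core idea — prune interclass edges with hard attention on labels, then weight the remaining neighbors with soft attention on features — can be illustrated with a minimal numpy sketch. This is not the paper's method (which is trained end-to-end under an EM framework); it is an assumed, simplified one-step version where `pred_labels` stands in for the labels the model would infer, and soft attention is a plain masked softmax over feature dot-products.

```python
import numpy as np

def decoupled_attention_step(adj, feats, pred_labels):
    """One illustrative aggregation step with decoupled attention.

    Hard attention: keep only edges whose endpoints share the same
    (predicted) label, giving a refined graph with fewer interclass
    edges. Soft attention: a masked softmax over feature similarity
    on the refined graph to weight neighbor aggregation.
    """
    n = adj.shape[0]
    # Hard attention mask: 1 where an edge links same-label nodes.
    same = (pred_labels[:, None] == pred_labels[None, :]).astype(float)
    refined = adj * same
    # Keep self-loops so every node retains its own feature.
    refined = refined + np.eye(n)
    # Soft attention: feature similarity, masked to the refined graph.
    scores = feats @ feats.T
    scores = np.where(refined > 0, scores, -np.inf)
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    weights = np.exp(scores)                     # exp(-inf) -> 0
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ feats
```

In the full model, the hard-attention pruning would be driven by labels propagated in the M-step rather than a fixed `pred_labels` array, and the soft-attention scores would be learned parameters rather than raw dot-products.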
Keywords
Message passing, Task analysis, Markov random fields, Learning systems, Convolution, Feature extraction, Representation learning, Deep learning, Graph convolutional networks, Network representation learning