Deep Heterogeneous Graph Neural Networks via Similarity Regularization Loss and Hierarchical Fusion.

ICDM Workshops (2022)

Abstract
Recently, Graph Neural Networks (GNNs) have emerged as a promising and powerful method for tackling graph-structured data. However, most real-world graph-structured data contains distinct types of objects (nodes) and links (edges), and is therefore called a heterogeneous graph. This heterogeneity and rich semantic information increase the difficulty of handling heterogeneous graphs. Most current heterogeneous graph neural networks (HeteGNNs) can only be built with a very shallow structure. This is caused by a phenomenon called semantic confusion, in which node embeddings become indistinguishable as model depth grows, leading to degraded model performance. In this paper, we address this problem by proposing similarity regularization loss and hierarchical fusion based heterogeneous graph neural networks (SHGNN). The hierarchical fusion strategy fuses the node embeddings produced at each layer, which improves the expressive power of the model, and the similarity regularization loss alleviates the indistinguishability among node embeddings. Our approach can be flexibly and effectively combined with various HeteGNNs. Experimental results on real-world heterogeneous graph-structured data demonstrate the state-of-the-art performance of the proposed approach, which efficiently mitigates the semantic confusion problem.
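The abstract does not spell out how the two components are implemented. The sketch below illustrates one plausible reading in PyTorch, assuming attention-weighted fusion of per-layer node embeddings and a penalty on pairwise cosine similarity between node embeddings; the names HierarchicalFusion, similarity_regularization, sample_size, and lambda_reg are hypothetical illustrations, not taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class HierarchicalFusion(nn.Module):
    """Fuse node embeddings from every HeteGNN layer with learned attention weights
    (one plausible form of the hierarchical fusion strategy described in the abstract)."""

    def __init__(self, dim):
        super().__init__()
        # Shared projection that scores each layer's embedding per node.
        self.attn = nn.Linear(dim, 1, bias=False)

    def forward(self, layer_embeddings):
        # layer_embeddings: list of [num_nodes, dim] tensors, one per HeteGNN layer.
        stacked = torch.stack(layer_embeddings, dim=1)           # [N, L, d]
        scores = self.attn(torch.tanh(stacked)).softmax(dim=1)   # [N, L, 1], weights over layers
        return (scores * stacked).sum(dim=1)                     # [N, d] fused embedding


def similarity_regularization(z, sample_size=1024):
    """Penalize high pairwise cosine similarity so node embeddings stay distinguishable
    (an assumed form of the similarity regularization loss)."""
    idx = torch.randperm(z.size(0))[:sample_size]    # subsample nodes for efficiency
    zn = F.normalize(z[idx], dim=1)
    sim = zn @ zn.t()                                # pairwise cosine similarities
    off_diag = sim - torch.diag_embed(sim.diagonal())  # drop self-similarity terms
    return off_diag.abs().mean()


# Hypothetical training objective: task loss plus the weighted regularizer.
# loss = F.cross_entropy(classifier(fused), labels) + lambda_reg * similarity_regularization(fused)
```

In this reading, fusing all intermediate layers lets deeper stacks keep low-order semantic information, while the regularizer directly discourages the collapse of node embeddings that the paper calls semantic confusion.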
Keywords
similarity regularization loss, networks, hierarchical fusion, deep