BG-HGNN: Toward Scalable and Efficient Heterogeneous Graph Neural Network
arXiv (2024)
Abstract
Many computer vision and machine learning problems are modelled as learning
tasks on heterogeneous graphs, featuring a wide array of relations from diverse
types of nodes and edges. Heterogeneous graph neural networks (HGNNs) stand out
as a promising neural model class designed for heterogeneous graphs. Built on
traditional GNNs, existing HGNNs employ different parameter spaces to model the
varied relationships. However, the practical effectiveness of existing HGNNs is
often limited to simple heterogeneous graphs with few relation types. This
paper first highlights and demonstrates that the standard approach employed by
existing HGNNs inevitably leads to parameter explosion and relation collapse,
making HGNNs less effective or impractical for complex heterogeneous graphs
with numerous relation types. To overcome this issue, we introduce a novel
framework, Blend Grind-HGNN (BG-HGNN), which effectively tackles the challenges
by carefully integrating different relations into a unified feature space
manageable by a single set of parameters. This results in a refined HGNN method
that is more efficient and effective in learning from heterogeneous graphs,
especially when the number of relations grows. Our empirical studies illustrate
that BG-HGNN significantly surpasses existing HGNNs in terms of parameter
efficiency (up to 28.96 ×), training throughput (up to 8.12 ×),
and accuracy (up to 1.07 ×).
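To make the parameter-explosion argument concrete, here is a minimal sketch (not the authors' implementation; all dimension names are illustrative assumptions) comparing the parameter count of a per-relation layer, as in standard HGNNs, against a single shared layer that blends small relation encodings into a unified feature space:

```python
def per_relation_params(num_relations, d_in, d_out):
    # Standard HGNN style: one weight matrix per relation type,
    # so parameter count grows linearly with the number of relations.
    return num_relations * d_in * d_out

def unified_params(num_relations, d_in, d_out, d_rel=16):
    # BG-HGNN-style idea (sketched, not the paper's exact design):
    # a small learned encoding per relation is blended into node
    # features, and one shared weight matrix covers all relations.
    return num_relations * d_rel + (d_in + d_rel) * d_out

R, d = 100, 64  # hypothetical: 100 relation types, 64-dim features
print(per_relation_params(R, d, d))  # 409600
print(unified_params(R, d, d))       # 6720
```

Under these assumed dimensions, the per-relation scheme needs roughly 60× more parameters, and the gap widens as the number of relation types grows.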