A Radial Basis Function-Based Graph Attention Network With Squeeze Loss Optimization for Link Prediction

Jiusheng Chen, Chengyuan Fang, Xiaoyu Zhang, Jun Wu, Runxia Guo

IEEE Trans. Artif. Intell. (2024)

Abstract
Graph attention networks are a popular approach to link prediction, but the weight assigned to each sample does not reflect that sample's own performance during training. Moreover, since the number of links in a graph is much larger than the number of nodes, mapping functions are usually used to map the learned node features to link features, and how well such a function expresses node similarity determines the quality of link feature learning. To tackle these issues, a new model, a graph attention network based on the Radial Basis Function (RBF) with squeeze loss, is proposed, comprising two improvements. First, an RBF function with extended parameters transforms the node features output by the attention layer into link features; link feature embedding is improved by shortening the distance between linked nodes and enlarging the distance between unlinked nodes in vector space. Second, a squeeze loss is designed to adjust each sample's loss according to its performance during training, changing the proportion of that sample's loss in the overall loss function so that training resources are allocated reasonably. Link prediction experiments on benchmark datasets show that the proposed method outperforms the baselines.
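The two ideas in the abstract can be illustrated with a minimal sketch. Both functional forms below are assumptions for illustration only: the RBF mapping is shown as a standard Gaussian kernel on node embeddings, and the sample-performance weighting is written in a focal-loss style, which matches the abstract's description in spirit but need not match the paper's exact squeeze loss.

```python
import numpy as np

def rbf_link_feature(h_u, h_v, gamma=1.0):
    """Map two node embeddings to a scalar link score via an RBF kernel.

    Nodes close in embedding space (likely linked) score near 1; distant
    nodes score near 0. `gamma` is a hypothetical width parameter, standing
    in for the paper's "extended parameters".
    """
    sq_dist = np.sum((np.asarray(h_u) - np.asarray(h_v)) ** 2)
    return float(np.exp(-gamma * sq_dist))

def performance_weighted_bce(p, y, gamma=2.0):
    """Binary cross-entropy modulated by per-sample performance.

    A hypothetical sketch of a squeeze-style loss: well-classified samples
    (p_t near 1) are down-weighted by (1 - p_t)**gamma, so hard samples take
    a larger share of the total loss.
    """
    p = np.clip(p, 1e-7, 1 - 1e-7)
    p_t = np.where(y == 1, p, 1 - p)          # probability of the true class
    return float(np.mean(-((1 - p_t) ** gamma) * np.log(p_t)))

# Identical embeddings give the maximal link score.
print(rbf_link_feature([1.0, 2.0], [1.0, 2.0]))  # 1.0
# A confidently correct sample contributes less loss than an uncertain one.
print(performance_weighted_bce(np.array([0.9]), np.array([1.0])))
print(performance_weighted_bce(np.array([0.6]), np.array([1.0])))
```

The down-weighting of easy samples is the same design motive the abstract gives for squeeze loss: changing each sample's share of the loss so training capacity concentrates on poorly fit links.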
Keywords
Graph attention network, link prediction, radial basis function, squeeze loss