Stochastic graph recurrent neural network

Neurocomputing (2022)

Cited by 4 | Views 18
Abstract
Representation learning over dynamic graphs has attracted much attention because of its wide applications. Recently, sequential probabilistic generative models have achieved impressive results because they can model data distributions. However, modeling the distribution of dynamic graphs remains extremely challenging. Existing methods usually ignore the mutual interference between stochastic states and deterministic states. Moreover, the common assumption that latent variables follow Gaussian distributions is often inappropriate. To address these problems, we propose the stochastic graph recurrent neural network (SGRNN), a sequential generative model for representation learning over dynamic graphs. It separates stochastic states from deterministic states in the iterative process. To improve the flexibility of the latent variables, we model the prior and posterior as semi-implicit distributions, yielding DSI-SGRNN. In addition, to alleviate the KL-vanishing problem in SGRNN, we propose a simple and interpretable structure based on a lower bound of the KL divergence. The proposed structure introduces only a few extra parameters and can be implemented with a few lines of code. Extensive experiments on real-world datasets demonstrate the effectiveness of the proposed model.
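The abstract describes the KL-vanishing fix only at a high level. As a generic illustration (not the paper's actual structure, which is not detailed here), one common way to keep the KL term from collapsing in a few lines of code is a "free bits" floor on the per-dimension KL between the Gaussian posterior and prior:

```python
import math

def kl_diag_gaussian(mu_q, sigma_q, mu_p, sigma_p):
    """Per-dimension KL(q || p) for diagonal Gaussians."""
    return [
        math.log(sp / sq) + (sq**2 + (mq - mp)**2) / (2 * sp**2) - 0.5
        for mq, sq, mp, sp in zip(mu_q, sigma_q, mu_p, sigma_p)
    ]

def free_bits_kl(kl_terms, floor=0.1):
    """Clamp each KL term from below ("free bits") so the regularizer
    cannot vanish to zero -- a generic anti-KL-vanishing device, used
    here only as an assumed stand-in for the paper's proposed bound."""
    return sum(max(k, floor) for k in kl_terms)

# KL(N(0,1) || N(1,1)) = 0.5 per dimension
kl = kl_diag_gaussian([0.0], [1.0], [1.0], [1.0])
loss_kl = free_bits_kl(kl, floor=0.1)
```

The `floor` hyperparameter is an assumption for illustration; in practice it is tuned so the latent variables retain enough information to be useful.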
Keywords
Dynamic graph, Representation learning, Variational inference, Posterior collapse