Unsupervised Representation Learning on Attributed Multiplex Network

Conference on Information and Knowledge Management (2022)

Abstract
Embedding learning in multiplex networks has drawn increasing attention in recent years and achieved outstanding performance in many downstream tasks. However, most existing network embedding methods either focus only on the structural information of graphs, rely on human-annotated data, or depend mainly on multi-layer GCNs to encode graphs at the risk of learning ill-posed spectral filters. Moreover, it is also challenging in multiplex network embedding to learn consensus node embeddings across multiple views that exploit the inter-relationships among graphs. In this study, we propose a novel and flexible unsupervised network embedding method for attributed multiplex networks that generates more precise node embeddings through simplified Bernstein encoders and alternating local and global contrastive learning. Specifically, we design a graph encoder based on simplified Bernstein polynomials to learn node embeddings for a specific graph view. During the learning of each view, local and global contrastive learning are applied alternately to update the view-specific embedding and the consensus embedding simultaneously. Furthermore, the proposed model can easily be extended into a semi-supervised model by adding a semi-supervised cost, or into an attention-based model that attentively integrates embeddings from multiple graphs. Experiments on three publicly available real-world datasets show that the proposed method achieves significant improvements on downstream tasks over state-of-the-art baselines, while being faster than or competitive with previous studies in terms of runtime.
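The abstract describes a graph encoder built from simplified Bernstein polynomials, used per graph view before alternating local and global contrastive learning. The paper's exact formulation is not reproduced here; the following is a minimal sketch of a BernNet-style Bernstein-polynomial spectral filter, assuming a symmetrically normalized Laplacian with eigenvalues in [0, 2]. All names (BernsteinGraphEncoder, theta, K) are illustrative assumptions, not taken from the paper.

```python
import torch
import torch.nn as nn
from math import comb


class BernsteinGraphEncoder(nn.Module):
    """Hypothetical sketch of a Bernstein-polynomial graph encoder
    (BernNet-style spectral filter); not the authors' exact model."""

    def __init__(self, in_dim, out_dim, K=4):
        super().__init__()
        self.K = K
        self.lin = nn.Linear(in_dim, out_dim)
        # Learnable Bernstein coefficients theta_0 .. theta_K
        self.theta = nn.Parameter(torch.ones(K + 1))

    def forward(self, x, L):
        # x: node features (n x in_dim)
        # L: symmetrically normalized graph Laplacian (dense, n x n),
        #    eigenvalues assumed to lie in [0, 2]
        h = self.lin(x)
        n = L.size(0)
        two_I_minus_L = 2 * torch.eye(n, device=L.device) - L
        out = torch.zeros_like(h)
        for k in range(self.K + 1):
            # Bernstein basis term: C(K,k)/2^K * (2I - L)^(K-k) * L^k * h
            term = h
            for _ in range(k):
                term = L @ term
            for _ in range(self.K - k):
                term = two_I_minus_L @ term
            coeff = comb(self.K, k) / (2 ** self.K)
            out = out + self.theta[k] * coeff * term
        return out
```

In the multiplex setting sketched above, one such encoder would be instantiated per graph view, and the resulting view-specific embeddings would then be contrasted alternately against a shared consensus embedding (the local and global contrastive objectives mentioned in the abstract).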
Keywords
attributed multiplex network embedding, graph neural network, graph representation learning, contrastive learning