Self-Supervised Graph Attention Networks for Deep Weighted Multi-View Clustering

AAAI (2023)

Abstract
As one of the most important research topics in unsupervised learning, Multi-View Clustering (MVC) has been widely studied over the past decade, and numerous MVC methods have been developed. Among these methods, the recently emerged graph neural networks (GNNs) shed light on modeling both topological structure and node attributes in the form of graphs to guide unified embedding learning and clustering. However, existing GNN-based MVC methods generally do not give sufficient consideration to self-supervised information during training, which prevents them from achieving better results. To this end, in this paper we propose Self-Supervised Graph Attention Networks for Deep Weighted Multi-View Clustering (SGDMC), which exploits self-supervised information to enhance the effectiveness of the graph-based deep MVC model in two respects. First, a novel attention-allocation approach that considers both the similarity of node attributes and the self-supervised information is developed to comprehensively evaluate the relevance among different nodes. Second, to alleviate the negative impact of noisy samples and the discrepancy of cluster structures across views, we further design a sample-weighting strategy based on the attention graphs as well as the discrepancy between the global pseudo-labels and the local cluster assignment of each single view. Experimental results on multiple real-world datasets demonstrate the effectiveness of our method over existing approaches.
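The abstract only outlines the two mechanisms; the sketch below is a rough, hypothetical illustration (not the authors' actual SGDMC formulation) of how attention could blend node-attribute similarity with agreement between self-supervised pseudo-labels, and how per-sample weights could be derived from the discrepancy between global pseudo-labels and per-view cluster assignments. All function names, the mixing coefficient `alpha`, and the agreement-based weighting rule are assumptions introduced here for illustration only.

```python
import numpy as np

def attention_scores(x, pseudo_labels, alpha=0.5):
    """Hypothetical attention allocation: blend cosine similarity of node
    attributes with agreement of self-supervised pseudo-labels (assumed form)."""
    x_norm = x / (np.linalg.norm(x, axis=1, keepdims=True) + 1e-12)
    attr_sim = x_norm @ x_norm.T                                    # attribute similarity
    label_agree = (pseudo_labels[:, None] == pseudo_labels[None, :]).astype(float)
    scores = alpha * attr_sim + (1.0 - alpha) * label_agree
    # row-wise softmax to turn scores into attention weights over neighbors
    scores = np.exp(scores - scores.max(axis=1, keepdims=True))
    return scores / scores.sum(axis=1, keepdims=True)

def sample_weights(global_labels, per_view_labels):
    """Hypothetical sample weighting: down-weight samples whose per-view
    cluster assignments disagree with the global pseudo-labels."""
    per_view_labels = np.asarray(per_view_labels)                   # shape (V, N)
    agreement = (per_view_labels == global_labels[None, :]).mean(axis=0)
    return agreement / (agreement.sum() + 1e-12)

# Toy usage with random data (one view of node attributes, two views of assignments).
rng = np.random.default_rng(0)
x = rng.normal(size=(6, 4))                  # node attributes of a single view
pseudo = np.array([0, 0, 1, 1, 2, 2])        # global pseudo-labels
views = [np.array([0, 0, 1, 1, 2, 2]),       # per-view local cluster assignments
         np.array([0, 1, 1, 1, 2, 2])]
A = attention_scores(x, pseudo)
w = sample_weights(pseudo, views)
print(A.shape, w)
```

In this toy version, samples whose per-view assignments all agree with the global pseudo-labels receive the largest weights; how SGDMC actually combines the attention graphs with this discrepancy is specified only in the full paper.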
Keywords
graph attention networks, self-supervised, multi-view