Self-Attention-Enhanced Fine-Grained Information Fusion for Multi-View Clustering

2023 38th Youth Academic Annual Conference of Chinese Association of Automation (YAC), 2023

Abstract
Traditional multi-view methods often employ neural networks to extract features for clustering tasks. However, the obtained features are typically coarse-grained descriptions of multi-view data, and their discriminative power is limited. Fine-grained information can provide more comprehensive data descriptions and richer detail, so exploiting fine-grained features of multi-view data is of great significance for multi-view clustering. In this paper, we propose a self-attention-enhanced fine-grained information fusion method for multi-view clustering. Specifically, a linear layer maps raw multi-view data from different dimensions to a common dimension. A fine-grained information extraction layer consisting of two convolution layers then extracts fine-grained information, allowing detailed information to be represented sufficiently. A self-attention learning module determines which information the model should focus on and fuses the important information into a new feature representation. We adopt deep divergence-based clustering to encourage compactness within clusters and separation between clusters, and leverage contrastive learning to obtain consistent clustering results across views. We conduct experiments on multiple datasets, and the results demonstrate the effectiveness of the proposed self-attention-enhanced fine-grained information fusion method for multi-view clustering.
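
For illustration only, the pipeline described in the abstract can be sketched in PyTorch as follows: per-view linear projection to a common dimension, a two-convolution fine-grained extraction block, and multi-head self-attention fusion over the views. All layer sizes, module names, and the simple softmax clustering head below are assumptions for the sketch; the authors' deep divergence-based clustering and contrastive losses are not reproduced here.

    # Minimal sketch (not the authors' code) of the described pipeline:
    # per-view linear projection -> two-conv fine-grained block -> self-attention fusion.
    import torch
    import torch.nn as nn

    class FineGrainedFusion(nn.Module):
        def __init__(self, view_dims, common_dim=128, n_clusters=10):
            super().__init__()
            # Map raw views of different dimensions to a common dimension.
            self.projections = nn.ModuleList(
                [nn.Linear(d, common_dim) for d in view_dims]
            )
            # Fine-grained extraction: two 1-D convolutions over the projected
            # feature vector, treated as a length-common_dim, single-channel signal.
            self.fine_grained = nn.Sequential(
                nn.Conv1d(1, 16, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.Conv1d(16, 1, kernel_size=3, padding=1),
                nn.ReLU(),
            )
            # Self-attention over the set of views; each view is one token.
            self.attention = nn.MultiheadAttention(
                embed_dim=common_dim, num_heads=4, batch_first=True
            )
            # Assumed stand-in clustering head (the paper uses DDC + contrastive losses).
            self.cluster_head = nn.Sequential(
                nn.Linear(common_dim, n_clusters), nn.Softmax(dim=-1)
            )

        def forward(self, views):
            # views: list of tensors, one per view, each of shape (batch, view_dim_v)
            tokens = []
            for proj, x in zip(self.projections, views):
                h = proj(x)                                       # (batch, common_dim)
                h = self.fine_grained(h.unsqueeze(1)).squeeze(1)  # fine-grained features
                tokens.append(h)
            tokens = torch.stack(tokens, dim=1)                   # (batch, n_views, common_dim)
            # Self-attention decides which information to focus on and fuses it.
            fused, _ = self.attention(tokens, tokens, tokens)
            fused = fused.mean(dim=1)                             # (batch, common_dim)
            return self.cluster_head(fused)                       # soft cluster assignments

    # Toy usage: two views with different dimensionalities.
    model = FineGrainedFusion(view_dims=[784, 256], common_dim=128, n_clusters=10)
    x1, x2 = torch.randn(32, 784), torch.randn(32, 256)
    assignments = model([x1, x2])                                 # (32, 10)
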
Keywords
multi-view clustering, fine-grained feature, self-attention, contrastive learning