Unsupervised Clustering through Gaussian Mixture Variational AutoEncoder with Non-Reparameterized Variational Inference and Std Annealing

2020 International Joint Conference on Neural Networks (IJCNN)

Cited by 8
Abstract
Clustering has long been an important research topic in machine learning and is highly valuable in many application tasks. In recent years, many methods have achieved high clustering performance by applying deep generative models. In this paper, we point out that directly using q(z|y, x), instead of resorting to the mean-field approximation adopted in previous works, in the Gaussian Mixture Variational Auto-Encoder can benefit the unsupervised clustering task. We improve the performance of the Gaussian Mixture VAE by optimizing it with a Monte Carlo objective (including the q(z|y, x) term) using the non-reparameterized Variational Inference for Monte Carlo Objectives (VIMCO) method. In addition, we propose std annealing to stabilize the training process and empirically show its effects on forming well-separated embeddings with different variational inference methods. Experimental results on five benchmark datasets show that our proposed algorithm NVISA outperforms several baseline algorithms as well as previous clustering methods based on the Gaussian Mixture VAE.
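The VIMCO method referenced in the abstract optimizes a multi-sample Monte Carlo bound without reparameterization by giving each sample a leave-one-out control variate (Mnih & Rezende, 2016). As a rough illustration of that idea (not the paper's own implementation), the per-sample learning signal can be computed by replacing each sample's log-weight with the geometric mean of the others and re-evaluating the bound; the function name and interface below are hypothetical:

```python
import numpy as np

def _logsumexp(x):
    # Numerically stable log-sum-exp.
    m = np.max(x)
    return m + np.log(np.sum(np.exp(x - m)))

def vimco_learning_signal(log_w):
    """Per-sample VIMCO learning signal L(w_1:K) - L(w_-k, w_hat_k),
    where log w_hat_k is the arithmetic mean of the other samples'
    log-weights (i.e. the geometric mean of the weights themselves)."""
    K = len(log_w)
    L = _logsumexp(log_w) - np.log(K)  # multi-sample bound
    signal = np.empty(K)
    for k in range(K):
        others = np.delete(log_w, k)
        replaced = np.append(others, others.mean())
        signal[k] = L - (_logsumexp(replaced) - np.log(K))
    return signal
```

With equal log-weights the leave-one-out baseline matches the bound exactly, so the signal is zero; a sample with an unusually high weight receives a positive signal, which is what makes the estimator low-variance without reparameterized gradients.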
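The abstract does not specify the exact form of the proposed std annealing; one plausible reading is a schedule that shrinks the latent standard deviation over training, e.g. a geometric interpolation from a large starting value to a small final one. The function below is a minimal sketch under that assumption; its name and parameters are illustrative, not taken from the paper:

```python
def annealed_std(step, total_steps, std_start=1.0, std_end=0.1):
    # Hypothetical std-annealing schedule: geometrically interpolate
    # the latent standard deviation from std_start down to std_end
    # as training progresses (progress clamped to [0, 1]).
    t = min(max(step / total_steps, 0.0), 1.0)
    return std_start * (std_end / std_start) ** t
```

A schedule like this would start training with a broad posterior (encouraging exploration) and gradually tighten it, which is consistent with the abstract's claim that annealing stabilizes training and helps form well-separated embeddings.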
Keywords
Training, Annealing, Task analysis, Monte Carlo methods, Inference algorithms, Clustering methods, Neural networks