Neural Network Based Explicit Mixture Models and Expectation-Maximization Based Learning

2020 International Joint Conference on Neural Networks (IJCNN)

Abstract
We propose two neural-network-based mixture models in this work. The proposed mixture models are explicit: they have analytical forms, which brings the advantages of tractable likelihood computation and efficient sample generation. Expectation-maximization (EM) based algorithms are developed for learning the parameters of the proposed models. We provide sufficient conditions under which EM-based learning can be realized; the main requirements are invertibility of the neural networks used as generators and computability of the Jacobian of their functional form. These requirements are met in practice by using flow-based neural networks. In our first mixture model, we use multiple flow-based neural networks as generators, which naturally makes the model complex; a single latent variable serves as the common input to all the networks. The second mixture model reduces complexity by using a single flow-based neural network as the generator, whose latent input follows a Gaussian mixture distribution. The proposed models are verified by training them with the EM-based algorithms on practical datasets. We demonstrate the efficiency of the proposed mixture models through extensive experiments on sample generation and maximum-likelihood-based classification.
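The abstract does not state the likelihoods explicitly; the following LaTeX sketch shows the forms the two models imply under the change-of-variables formula for invertible flows. The notation is ours, not necessarily the paper's: g_k (or g) denotes an invertible flow generator, pi_k the mixture weights, and p_Z the common latent density.

Model 1 (K flow generators g_1, ..., g_K sharing one latent z ~ p_Z; each component density is explicit because g_k is invertible with a computable Jacobian):

    p(x) = \sum_{k=1}^{K} \pi_k \, p_Z\!\bigl(g_k^{-1}(x)\bigr) \left| \det \frac{\partial g_k^{-1}(x)}{\partial x} \right|

Model 2 (a single flow g whose latent input follows a Gaussian mixture, so the Jacobian factor is shared across components):

    p(x) = \left| \det \frac{\partial g^{-1}(x)}{\partial x} \right| \sum_{k=1}^{K} \pi_k \, \mathcal{N}\!\bigl(g^{-1}(x);\, \mu_k, \Sigma_k\bigr)

Because both likelihoods are explicit finite mixtures, the EM E-step can use the standard responsibilities \gamma_k(x) = \pi_k p_k(x) \big/ \sum_{j=1}^{K} \pi_j p_j(x), where p_k denotes the k-th component density above.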
Keywords
Generative model, mixture models, expectation maximization, neural network, classification