A Prior Of A Googol Gaussians: A Tensor Ring Induced Prior For Generative Models

Advances in Neural Information Processing Systems 32 (NeurIPS 2019)

Abstract
Generative models produce realistic objects in many domains, including text, image, video, and audio synthesis. The most popular models, Generative Adversarial Networks (GANs) and Variational Autoencoders (VAEs), usually employ a standard Gaussian distribution as a prior. Previous works show that a richer family of prior distributions may help to avoid the mode collapse problem in GANs and to improve the evidence lower bound in VAEs. We propose a new family of prior distributions, the Tensor Ring Induced Prior (TRIP), that packs an exponential number of Gaussians into a high-dimensional lattice with a relatively small number of parameters. We show that these priors improve the Fréchet Inception Distance for GANs and the Evidence Lower Bound for VAEs. We also study generative models with TRIP in the conditional generation setup with missing conditions. Altogether, we propose a novel plug-and-play framework for generative models that can be utilized in any GAN- or VAE-like architecture.
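The abstract's central idea, a mixture whose exponentially many Gaussian components sit on a lattice and whose mixture weights are parameterized by a tensor ring, can be illustrated with a small sketch. The snippet below is a minimal NumPy illustration, not the authors' implementation: the class name `TRIPPrior`, the non-negative cores, the fixed per-dimension grid of means, and the shared scale `sigma` are simplifying assumptions made for the example.

```python
import numpy as np

class TRIPPrior:
    """Toy tensor-ring induced prior: a mixture of axis-aligned Gaussians whose
    N**d mixture weights come from d tensor-ring cores of rank r (illustrative only)."""

    def __init__(self, latent_dim=4, num_grid=8, core_rank=3, sigma=0.3, seed=0):
        rng = np.random.default_rng(seed)
        self.d, self.N, self.r, self.sigma = latent_dim, num_grid, core_rank, sigma
        # Per-dimension grid of candidate component means (a 1-D lattice per axis).
        self.means = np.linspace(-2.0, 2.0, num_grid)
        # Non-negative tensor-ring cores Q_k[j] in R^{r x r}; the weight of lattice
        # cell (j_1, ..., j_d) is Tr(Q_1[j_1] @ ... @ Q_d[j_d]).
        self.cores = [np.abs(rng.normal(size=(num_grid, core_rank, core_rank)))
                      for _ in range(latent_dim)]

    def log_norm(self):
        # Normalizer over all N**d cells in O(d * N * r^2): sum each core over its
        # grid index first, then chain the d resulting r x r matrices.
        M = np.eye(self.r)
        for Q in self.cores:
            M = M @ Q.sum(axis=0)
        return np.log(np.trace(M))

    def cell_log_weight(self, idx):
        # Normalized log-weight of one lattice cell idx = (j_1, ..., j_d).
        M = np.eye(self.r)
        for Q, j in zip(self.cores, idx):
            M = M @ Q[j]
        return np.log(np.trace(M)) - self.log_norm()

    def sample(self, n_samples=5, seed=1):
        # Ancestral sampling of lattice indices dimension by dimension; suffix
        # products marginalize the dimensions that have not been sampled yet.
        rng = np.random.default_rng(seed)
        suffix = [np.eye(self.r)]
        for Q in reversed(self.cores[1:]):
            suffix.append(Q.sum(axis=0) @ suffix[-1])
        suffix = suffix[::-1]  # suffix[k] marginalizes dimensions k+1 .. d-1
        samples = np.empty((n_samples, self.d))
        for s in range(n_samples):
            prefix = np.eye(self.r)
            for k, Q in enumerate(self.cores):
                probs = np.array([np.trace(prefix @ Q[j] @ suffix[k])
                                  for j in range(self.N)])
                j = rng.choice(self.N, p=probs / probs.sum())
                prefix = prefix @ Q[j]
                # Draw from the Gaussian component centered at the chosen lattice point.
                samples[s, k] = rng.normal(self.means[j], self.sigma)
        return samples

if __name__ == "__main__":
    prior = TRIPPrior()
    print("log normalizer:", prior.log_norm())
    print("samples:\n", prior.sample(3))
```

The point of the tensor-ring parameterization is visible in `log_norm` and `sample`: the normalizer over all N**d lattice cells and the conditionals used for sampling reduce to chains of small r x r matrix products, so the parameter count grows linearly with the latent dimension even though the number of mixture components grows exponentially.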
Keywords
tensor ring, plug-and-play framework, audio synthesis, evidence lower bound