MoEVC: A Mixture-of-experts Voice Conversion System with Sparse Gating Mechanism for Accelerating Online Computation

arXiv (2019)

Abstract
With recent advances in deep learning technologies, the performance of voice conversion (VC) in terms of quality and similarity has improved significantly. However, deep-learning-based VC systems generally require heavy computation, which can cause notable latency and thus limit their deployment in real-world applications. Increasing online computation efficiency has therefore become an important task. In this study, we propose a novel mixture-of-experts (MoE) based VC system. The MoE model uses a gating mechanism to assign optimal weights to feature maps, improving VC performance. In addition, imposing sparsity constraints on the gating mechanism accelerates online computation: redundant feature maps are zeroed out, so the corresponding convolution operations can be skipped. Experimental results show that with suitable sparsity constraints, we can effectively increase online computation efficiency, achieving a notable 70% reduction in FLOPs (floating-point operations) while improving VC performance in both objective evaluations and human listening tests.
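The abstract describes the gating mechanism only at a high level. The following PyTorch sketch illustrates the general idea of sparse channel gating, assuming per-channel gate weights with a hard-zero threshold; the names (SparseGate, GatedConvBlock), the gate network, the threshold value, and the block layout are illustrative assumptions, not the paper's actual architecture.

```python
import torch
import torch.nn as nn


class SparseGate(nn.Module):
    """Per-channel gate: weights below a threshold are hard-zeroed so the
    corresponding feature maps (and the convolutions that produce them)
    can be skipped at inference time. Hypothetical design, not the paper's."""

    def __init__(self, channels: int, threshold: float = 0.05):
        super().__init__()
        self.fc = nn.Linear(channels, channels)
        self.threshold = threshold

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time) -- summarize each channel over time
        summary = x.mean(dim=-1)                   # (batch, channels)
        weights = torch.sigmoid(self.fc(summary))  # gate values in (0, 1)
        # Hard-zero small gates; during training, an L1 penalty on the gate
        # values (e.g., loss += lam * weights.abs().mean()) encourages sparsity.
        return torch.where(weights < self.threshold,
                           torch.zeros_like(weights), weights)


class GatedConvBlock(nn.Module):
    """Convolution whose output feature maps are weighted by the gate."""

    def __init__(self, channels: int):
        super().__init__()
        self.conv = nn.Conv1d(channels, channels, kernel_size=3, padding=1)
        self.gate = SparseGate(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = self.gate(x)                           # (batch, channels)
        return self.conv(x) * w.unsqueeze(-1)      # weight the feature maps


if __name__ == "__main__":
    block = GatedConvBlock(channels=16)
    mel = torch.randn(2, 16, 100)                  # e.g., a mel-spectrogram batch
    out = block(mel)
    print(out.shape)                               # torch.Size([2, 16, 100])
```

Note that this dense sketch still computes the convolution for every channel; the FLOPs reduction the paper reports would come from actually skipping the convolution work for channels whose gates are exactly zero at inference.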