Research on Knowledge Distillation of Generative Adversarial Networks

2021 Data Compression Conference (DCC)

Abstract
The compression of Generative Adversarial Networks (GANs) has been an emerging study in recent years. However, conventional compression methods can hardly be applied to GANs, because the training process and optimization target of GANs differ from those of traditional classification and detection networks. Inspired by the recent success of knowledge distillation, we condense the recent research on th...
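The abstract concerns distilling a large teacher generator into a compact student. As a rough illustration only, and not necessarily any of the specific methods surveyed in the paper, the sketch below shows one common formulation in PyTorch: a small student generator imitates the outputs of a frozen, pretrained teacher generator on shared noise inputs via a pixel-level L1 loss. All architectures, channel widths, and hyperparameters here are hypothetical assumptions.

```python
# Minimal sketch of output-level knowledge distillation for a GAN generator.
# Teacher/student architectures, loss weights, and the training loop are
# illustrative assumptions, not the paper's method.
import torch
import torch.nn as nn


def make_generator(z_dim: int, base_channels: int) -> nn.Sequential:
    """DCGAN-style generator; `base_channels` controls model capacity."""
    return nn.Sequential(
        nn.ConvTranspose2d(z_dim, base_channels * 4, 4, 1, 0),
        nn.BatchNorm2d(base_channels * 4), nn.ReLU(True),
        nn.ConvTranspose2d(base_channels * 4, base_channels * 2, 4, 2, 1),
        nn.BatchNorm2d(base_channels * 2), nn.ReLU(True),
        nn.ConvTranspose2d(base_channels * 2, base_channels, 4, 2, 1),
        nn.BatchNorm2d(base_channels), nn.ReLU(True),
        nn.ConvTranspose2d(base_channels, 3, 4, 2, 1), nn.Tanh(),
    )


z_dim = 100
teacher = make_generator(z_dim, base_channels=128)  # large, pretrained generator (frozen)
student = make_generator(z_dim, base_channels=32)   # compact generator to be distilled
teacher.eval()
for p in teacher.parameters():
    p.requires_grad_(False)

distill_loss = nn.L1Loss()  # student imitates teacher images pixel by pixel
optimizer = torch.optim.Adam(student.parameters(), lr=2e-4, betas=(0.5, 0.999))

for step in range(100):  # toy loop; real training adds adversarial/feature losses
    z = torch.randn(16, z_dim, 1, 1)
    with torch.no_grad():
        target = teacher(z)              # teacher outputs act as soft targets
    loss = distill_loss(student(z), target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

In practice, surveyed GAN distillation methods typically combine such an imitation term with an adversarial loss and intermediate-feature matching rather than using pixel-level L1 alone.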
Keywords
Training,Knowledge engineering,Data compression,Generative adversarial networks,Generators,Task analysis,Optimization