Dynamic PDGAN: discriminator-boosted knowledge distillation for StyleGANs

Journal of Electronic Imaging (2024)

Abstract
Generative adversarial networks have shown remarkable success in image synthesis, especially StyleGANs. Equipped with delicate and specific designs, StyleGANs are capable of synthesizing high-resolution and high-fidelity images. Previous works aiming at improving StyleGANs mainly focus on modifying their architecture or transferring knowledge from other domains. However, the knowledge contained in StyleGANs trained in the same domain remains unexplored. We aim to further boost the performance of StyleGANs from the perspective of knowledge distillation, i.e., improving uncompressed StyleGANs with the aid of teacher StyleGANs trained in the same domain. Motivated by the implicit distribution captured by the pretrained teacher discriminator, we propose to exploit the teacher discriminator to additionally supervise the student generator of StyleGANs, thereby leveraging the knowledge in the teacher discriminator. With the proposed distillation scheme, our method outperforms original StyleGANs on several large-scale datasets, achieving state-of-the-art results on AFHQv2.
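The core idea — supervising the student generator with an extra realism signal from a frozen teacher discriminator — can be sketched as a combined generator objective. The sketch below is illustrative only: the paper's exact loss formulation and weighting are not given in the abstract, so the non-saturating GAN loss, the additive combination, and the weight `lam` are all assumptions.

```python
import numpy as np

def nonsaturating_g_loss(d_scores):
    """Non-saturating GAN generator loss: mean of -log(sigmoid(D(G(z)))).

    Computed via softplus(-x) = log(1 + exp(-x)) for numerical stability.
    `d_scores` are the discriminator's raw logits on generated images.
    """
    return float(np.mean(np.log1p(np.exp(-np.asarray(d_scores, dtype=float)))))

def distilled_g_loss(student_d_scores, teacher_d_scores, lam=0.5):
    """Hypothetical distillation objective for the student generator.

    The student generator is trained against its own discriminator's
    logits (`student_d_scores`) plus an additional supervision term from
    the frozen, pretrained teacher discriminator (`teacher_d_scores`).
    The weight `lam` balancing the two terms is illustrative, not from
    the paper.
    """
    return (nonsaturating_g_loss(student_d_scores)
            + lam * nonsaturating_g_loss(teacher_d_scores))
```

In practice both score vectors would come from forward passes of the respective discriminators on the same batch of student-generated images, with only the student generator (and its discriminator) receiving gradient updates.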
Keywords
generative adversarial networks, generative models, knowledge distillation