Integrating Pretrained Encoders for Generalized Face Frontalization

IEEE Access (2024)

Abstract
In face frontalization, a model trained on a particular dataset often underperforms on other datasets. This paper presents the Pre-trained Feature Transformation GAN (PFT-GAN), designed to fully exploit the diverse facial feature information available from pre-trained face recognition networks. To that end, we propose a feature attention transformation (FAT) module that effectively transfers low-level facial features to the facial generator. Furthermore, to reduce dependency on any single pre-trained encoder, we propose a new FAT module organization that accommodates the features from all employed pre-trained face recognition networks. We evaluate the proposed method with both an "independent critic" and a "dependent critic", which enables objective judgments. Experimental results show that the proposed method significantly improves face frontalization performance and helps overcome the bias associated with each employed pre-trained face recognition network.
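The abstract describes the FAT module only at a high level: it transfers low-level features from pre-trained recognition encoders into the generator, with one organization accommodating several encoders at once. Below is a minimal PyTorch sketch of what such an attention-gated feature transfer could look like; the class name `FATBlock`, the 1x1 projection, the squeeze-and-excitation-style gate, and all channel counts are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FATBlock(nn.Module):
    """Hypothetical feature attention transformation (FAT) block.

    Projects a feature map from a pre-trained face recognition encoder
    into the generator's channel space, gates it with channel attention,
    and fuses it into the generator features by residual addition. All
    layer choices are assumptions made for illustration.
    """

    def __init__(self, enc_channels: int, gen_channels: int):
        super().__init__()
        # 1x1 conv to align encoder channels with the generator's.
        self.project = nn.Conv2d(enc_channels, gen_channels, kernel_size=1)
        # Squeeze-and-excitation-style channel attention (assumed design).
        self.attention = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(gen_channels, gen_channels // 4, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(gen_channels // 4, gen_channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, enc_feat: torch.Tensor, gen_feat: torch.Tensor) -> torch.Tensor:
        proj = self.project(enc_feat)
        # Match the generator's spatial resolution before fusing.
        if proj.shape[-2:] != gen_feat.shape[-2:]:
            proj = F.interpolate(proj, size=gen_feat.shape[-2:],
                                 mode="bilinear", align_corners=False)
        # Attention-gated residual fusion of the transferred features.
        return gen_feat + proj * self.attention(proj)


if __name__ == "__main__":
    gen_feat = torch.randn(1, 64, 32, 32)   # one generator stage's features
    enc_a = torch.randn(1, 128, 16, 16)     # features from encoder A
    enc_b = torch.randn(1, 256, 16, 16)     # features from encoder B
    # Chaining one FAT block per encoder accommodates multiple pre-trained
    # networks, in the spirit of the multi-encoder organization the abstract describes.
    fused = FATBlock(256, 64)(enc_b, FATBlock(128, 64)(enc_a, gen_feat))
    print(fused.shape)  # torch.Size([1, 64, 32, 32])
```

One block is instantiated per encoder so that no single pre-trained network dominates the transferred features, which is the dependency-reduction idea the abstract attributes to the multi-encoder FAT organization.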
Keywords
Face Frontalization, Face Pose Normalization, Face Recognition, Generative Modeling