Multi-Source Style Transfer via Style Disentanglement Network

IEEE TRANSACTIONS ON MULTIMEDIA (2024)

Abstract
Despite the great success of deep neural networks on style transfer tasks, the entanglement of content and style in images prevents much of the style information from being captured. To tackle this problem, we propose a novel style disentanglement network that transfers multi-source style elements. Specifically, we design a learnable content-style separation module that efficiently extracts content and style components from images in the latent space, in contrast to previous approaches that predefine content and style layers in the network. Building on this separation, we further propose a multi-style swap module, which allows the content image to match a richer set of style elements. Additionally, by introducing an alternate training strategy for the main and auxiliary decoders together with a style disentanglement loss, the stylized results closely resemble original artworks. Experimental results demonstrate the superiority of the proposed method over existing schemes.
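The multi-style swap described in the abstract can be pictured as a nearest-neighbour patch match in feature space: each content feature patch is replaced by the best-matching patch drawn from a bank pooled over several style images. The sketch below is an illustrative assumption, not the paper's exact formulation; the function names, the cosine-similarity criterion, and the flat `(N, C)` patch layout are all choices made here for clarity.

```python
import numpy as np

def style_swap(content_feats, style_feats):
    """Replace each content feature vector with the most similar style
    feature vector under cosine similarity.

    content_feats: (Nc, C) array of flattened content patch descriptors
    style_feats:   (Ns, C) array of flattened style patch descriptors
    Returns an (Nc, C) array of swapped-in style descriptors.
    """
    c = content_feats / (np.linalg.norm(content_feats, axis=1, keepdims=True) + 1e-8)
    s = style_feats / (np.linalg.norm(style_feats, axis=1, keepdims=True) + 1e-8)
    sim = c @ s.T                # (Nc, Ns) cosine similarities
    idx = sim.argmax(axis=1)     # best-matching style patch per content patch
    return style_feats[idx]

def multi_style_swap(content_feats, style_feat_list):
    """Multi-source variant: pool patches from several style images into
    one bank, so a content patch may match elements from any style."""
    bank = np.vstack(style_feat_list)
    return style_swap(content_feats, bank)
```

In this toy form, pooling the patch banks is what lets a single content image draw on multiple style sources at once, which is the intuition behind the multi-style swap module.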
Keywords
Style transfer, style disentanglement, style swap, content component, style component