Abstract
To achieve low-cost virtual try-on and address the difficulty of applying conventional methods in real-world scenarios, a 2D virtual try-on network based on clothing transfer is proposed. Given a user image and a model image as input, the network generates a synthesized image of the user wearing the model's clothes. First, the clothing texture is extracted via texture remapping. Second, the user's texture is extracted through human parsing transfer. Third, the clothing texture and the user texture are combined into a coarse try-on result. Finally, the missing texture regions in the coarse result are recovered by a texture inpainting network trained with semi-supervised learning. Qualitative and quantitative experiments demonstrate that the proposed method preserves garment texture details, keeps the user's identity unchanged, and produces plausible try-on results.
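The paper does not publish source code; as a minimal sketch of the four-stage pipeline the abstract describes, the following PyTorch-style pseudocode shows how the stages could be composed. All class and parameter names (TryOnPipeline, remapper, parser_transfer, inpainter) and the mask-compositing convention are illustrative assumptions, not the authors' implementation.

```python
import torch.nn as nn


class TryOnPipeline(nn.Module):
    """Hypothetical sketch of the four-stage clothing-transfer pipeline."""

    def __init__(self, remapper, parser_transfer, inpainter):
        super().__init__()
        self.remapper = remapper                # stage 1: texture remapping
        self.parser_transfer = parser_transfer  # stage 2: human parsing transfer
        self.inpainter = inpainter              # stage 4: texture inpainting

    def forward(self, user_img, model_img):
        # 1) Extract the garment texture from the model image and warp it
        #    onto the user's pose (texture remapping); the remapper is
        #    assumed to also return a binary garment mask.
        cloth_tex, cloth_mask = self.remapper(model_img, user_img)

        # 2) Transfer the human parsing map so the model's garment regions
        #    align with the user's body layout.
        parsing = self.parser_transfer(user_img, model_img)

        # 3) Keep the user's own texture (face, skin, hair) outside the
        #    garment region and paste the warped clothing inside it,
        #    yielding the coarse try-on result.
        coarse = cloth_tex * cloth_mask + user_img * (1.0 - cloth_mask)

        # 4) Recover the texture holes left by masking and warping with the
        #    inpainting network trained via semi-supervised learning.
        return self.inpainter(coarse, parsing)
```

In this reading, the coarse composite of stage 3 is what the abstract calls merging the clothing texture with the user texture, and stage 4's inpainter repairs whatever regions the warp and masking leave uncovered.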
Authors
YU Li (禹立)
ZHONG Yueqi (钟跃崎)
College of Textiles, Donghua University, Shanghai 201620, China; Key Laboratory of Textile Science & Technology, Ministry of Education, Donghua University, Shanghai 201620, China
Source
Wool Textile Journal (《毛纺科技》)
Indexed in: CAS; Peking University Core Journal (北大核心)
2021, No. 4, pp. 7-12 (6 pages)
Funding
National Natural Science Foundation of China (61572124).
Keywords
virtual try-on
human parsing transfer
semi-supervised learning
texture inpainting