Paper
27 January 2021
Makeup transfer based on multi-scale feature loss
Qiuxia Yang, Yuanyuan Pu, Xu Dan, Wenhua Qian, Jun Xu
Proceedings Volume 11720, Twelfth International Conference on Graphics and Image Processing (ICGIP 2020); 117200S (2021) https://doi.org/10.1117/12.2589402
Event: Twelfth International Conference on Graphics and Image Processing, 2020, Xi'an, China
Abstract
Facial makeup transfer is an active research area in computer vision that aims to transfer the makeup style of a reference face onto a non-makeup face. Existing work uses an adversarial loss to keep the facial identity consistent before and after the transfer, but large head-pose deflections and expression changes in the input face can severely degrade the transfer result. This paper proposes a facial makeup transfer framework based on a multi-scale feature loss. The model consists of a generator, a discriminator, and a multi-scale discriminator. The reference makeup face and the non-makeup face are fed into the generator simultaneously, and the generator outputs a made-up face that preserves the identity of the non-makeup input while carrying the reference makeup style. To improve the robustness and quality of the makeup result, the generated makeup face and the input non-makeup face are passed to the multi-scale discriminator to compute a feature loss: the pixel-wise product of the generated makeup face and its semantic segmentation map forms input 1, the pixel-wise product of the non-makeup face and its semantic segmentation map forms input 2, and the feature loss is computed between the features of the two inputs at each scale of the discriminator. Because this loss constrains pixel differences within each semantic region, it suppresses the shadows and makeup overflow caused by pose deflection and facial expression during transfer. Experimental results show that the proposed method achieves better makeup transfer than existing methods.
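The masked multi-scale feature loss described above can be sketched in a few lines. This is a minimal NumPy illustration, not the paper's implementation: average pooling stands in for the multi-scale discriminator's feature maps, the function names are hypothetical, and the segmentation maps are assumed to be soft masks in [0, 1].

```python
import numpy as np

def downsample(img, factor):
    """Average-pool an HxWxC image by an integer factor (a stand-in for a
    discriminator feature map at a coarser scale)."""
    h, w, c = img.shape
    return img[:h - h % factor, :w - w % factor].reshape(
        h // factor, factor, w // factor, factor, c).mean(axis=(1, 3))

def multiscale_feature_loss(made_up, no_makeup, seg_made_up, seg_no_makeup,
                            scales=(1, 2, 4)):
    """Mean L1 difference, averaged over scales, between the two masked faces.

    Faces are HxWx3 float arrays; seg_* are HxWx1 masks in [0, 1].
    Pixel-wise multiplication by the segmentation map restricts the loss
    to the corresponding semantic regions, as in the abstract.
    """
    x1 = made_up * seg_made_up        # "input 1": generated face x its segmentation
    x2 = no_makeup * seg_no_makeup    # "input 2": non-makeup face x its segmentation
    total = 0.0
    for s in scales:
        f1, f2 = downsample(x1, s), downsample(x2, s)
        total += np.abs(f1 - f2).mean()
    return total / len(scales)
```

In the paper's setting the two inputs would instead pass through the trained multi-scale discriminator, and the loss would compare its intermediate activations; the masking and per-scale averaging shown here are the parts the abstract makes explicit.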
© (2021) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Qiuxia Yang, Yuanyuan Pu, Xu Dan, Wenhua Qian, and Jun Xu "Makeup transfer based on multi-scale feature loss", Proc. SPIE 11720, Twelfth International Conference on Graphics and Image Processing (ICGIP 2020), 117200S (27 January 2021); https://doi.org/10.1117/12.2589402