IEEE Transactions on Neural Networks and Learning Systems | 2019
A Deep Collaborative Framework for Face Photo–Sketch Synthesis
Abstract
Great breakthroughs have been made in the accuracy and speed of face photo–sketch synthesis in recent years. Regression-based methods, which benefit from deeper and faster end-to-end convolutional neural networks, have gained increasing attention. However, most of these models formulate the mapping from the photo domain $X$ to the sketch domain $Y$ as a unidirectional feedforward mapping, $G: X \to Y$, and vice versa, $F: Y \to X$; thus, the mutual interaction between the two opposite mappings is left unexploited. We therefore propose a collaborative framework for face photo–sketch synthesis. The idea behind our model is that a middle latent domain $\widetilde{Z}$ between the photo domain $X$ and the sketch domain $Y$ can be learned during the training of $G: X \to Y$ and $F: Y \to X$ by introducing a collaborative loss that makes full use of the two opposite mappings. This strategy constrains the two opposite mappings and makes them more symmetrical, which makes the network better suited to the photo–sketch synthesis task and yields higher-quality generated images. Qualitative and quantitative experiments demonstrate the superior performance of our model in comparison with existing state-of-the-art solutions.
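The core idea, a collaborative loss that pulls the intermediate representations of the two opposite mappings toward a shared latent domain $\widetilde{Z}$, can be sketched as follows. This is a minimal toy illustration only: the encoder functions, the linear-map weights, the feature shapes, and the L1 form of the loss are all assumptions for exposition, not the paper's actual formulation.

```python
import numpy as np

def encode_g(photo, w_g):
    """Hypothetical encoder half of G: X -> Y, mapping a photo to a latent code."""
    return photo @ w_g

def encode_f(sketch, w_f):
    """Hypothetical encoder half of F: Y -> X, mapping a sketch to a latent code."""
    return sketch @ w_f

def collaborative_loss(photo, sketch, w_g, w_f):
    """L1 distance between the two latent codes.

    Minimizing this term encourages G's and F's intermediate
    representations of a paired photo and sketch to coincide,
    i.e., to lie in a common middle latent domain Z~.
    """
    z_from_photo = encode_g(photo, w_g)
    z_from_sketch = encode_f(sketch, w_f)
    return float(np.mean(np.abs(z_from_photo - z_from_sketch)))

# Toy paired batch: 4 photo/sketch feature vectors of dimension 32.
rng = np.random.default_rng(0)
photo = rng.standard_normal((4, 32))
sketch = rng.standard_normal((4, 32))
w_g = rng.standard_normal((32, 16))
w_f = rng.standard_normal((32, 16))

loss = collaborative_loss(photo, sketch, w_g, w_f)
print(loss)  # a non-negative scalar
```

In a full model this term would be added to the usual adversarial and reconstruction losses of $G$ and $F$, so that each mapping regularizes the other through the shared latent domain.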