Style-Controlled Synthesis of Clothing Segments for Fashion Image Manipulation

IEEE Transactions on Multimedia


By: Bo-Kyeong Kim; Geonmin Kim; Soo-Young Lee

We propose an approach for digitally altering people's outfits in images. Given images of a person and a desired clothing style, our method generates a new clothing item image. The new item displays the color and pattern of the desired style while geometrically mimicking the person's original item. Through superimposition, the altered image is made to look as if the person is wearing the new item. Unlike recent works with full-image synthesis, our work relies on segment synthesis, yielding benefits in virtual try-on. For the synthesis process, we assume two underlying factors characterizing clothing segments: geometry and style. These two factors are disentangled via preprocessing and combined using a neural network. We explore several networks and introduce important aspects of the architecture and learning process. Our experimental results are three-fold: 1) on images from fashion-parsing datasets, we demonstrate the generation of high-quality clothing segments with fine-level style control; 2) on a virtual try-on benchmark, our method shows superiority over prior synthesis methods; and 3) in transferring clothing styles, we visualize the differences between our method and neural style transfer.
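The abstract's superimposition step — making the altered image look as if the person is wearing the generated item — amounts to a mask-based composite of the synthesized segment over the original photo. The following is a minimal illustrative sketch of that composite, not the authors' implementation; the function name, array shapes, and toy data are assumptions for demonstration.

```python
import numpy as np

def superimpose(person, new_segment, mask):
    """Composite a synthesized clothing segment onto the original image.

    person:      (H, W, 3) float array, the original photo
    new_segment: (H, W, 3) float array, the generated clothing item
    mask:        (H, W) float array in [0, 1], 1 where clothing pixels lie

    Hypothetical helper illustrating mask-based superimposition only.
    """
    m = mask[..., None]  # add a channel axis so the mask broadcasts over RGB
    return m * new_segment + (1.0 - m) * person

# Toy example: a 4x4 black "photo" whose top half is the clothing region.
person = np.zeros((4, 4, 3))
new_segment = np.ones((4, 4, 3))           # desired style: plain white
mask = np.zeros((4, 4))
mask[:2] = 1.0                             # clothing occupies the top two rows

result = superimpose(person, new_segment, mask)
```

Inside the mask the output takes the synthesized segment's pixels; outside it the original photo is untouched, so only the clothing region changes.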
