
Learning-Based Animation of Clothing for Virtual Try-On

2019-03-17
Igor Santesteban, Miguel A. Otaduy, Dan Casas

Abstract

This paper presents a learning-based clothing animation method for highly efficient virtual try-on simulation. Given a garment, we preprocess a rich database of physically-based dressed character simulations, for multiple body shapes and animations. Then, using this database, we train a learning-based model of cloth drape and wrinkles, as a function of body shape and dynamics. We propose a model that separates global garment fit, due to body shape, from local garment wrinkles, due to both pose dynamics and body shape. We use a recurrent neural network to regress garment wrinkles, and we achieve highly plausible nonlinear effects, in contrast to the blending artifacts suffered by previous methods. At runtime, dynamic virtual try-on animations are produced in just a few milliseconds for garments with thousands of triangles. We show qualitative and quantitative analysis of results.
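To illustrate the two-level idea described in the abstract (a static global-fit regressor conditioned on body shape, plus a recurrent network that regresses dynamic wrinkles from pose and shape), here is a minimal PyTorch sketch. This is an assumed illustration, not the authors' implementation: the layer sizes, the 10-D shape code, the 72-D pose vector, and the vertex count are placeholders.

```python
# Hedged sketch of the global-fit + recurrent-wrinkle decomposition.
# All dimensions and module names are assumptions for demonstration only.
import torch
import torch.nn as nn


class GlobalFit(nn.Module):
    """Regress static garment-fit displacements from body shape (assumed MLP)."""
    def __init__(self, shape_dim=10, num_verts=4000, hidden=256):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(shape_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, num_verts * 3),
        )
        self.num_verts = num_verts

    def forward(self, shape):                        # shape: (B, shape_dim)
        return self.mlp(shape).view(-1, self.num_verts, 3)


class WrinkleRNN(nn.Module):
    """Regress dynamic wrinkle displacements from pose + shape sequences."""
    def __init__(self, pose_dim=72, shape_dim=10, num_verts=4000, hidden=512):
        super().__init__()
        self.gru = nn.GRU(pose_dim + shape_dim, hidden, batch_first=True)
        self.out = nn.Linear(hidden, num_verts * 3)
        self.num_verts = num_verts

    def forward(self, pose_seq, shape, h0=None):
        # pose_seq: (B, T, pose_dim); shape: (B, shape_dim)
        T = pose_seq.size(1)
        x = torch.cat([pose_seq, shape.unsqueeze(1).expand(-1, T, -1)], dim=-1)
        feats, h = self.gru(x, h0)                   # recurrence captures dynamics
        return self.out(feats).view(-1, T, self.num_verts, 3), h


# Usage: add both displacement fields to a template garment mesh.
fit_net, wrinkle_net = GlobalFit(), WrinkleRNN()
template = torch.zeros(4000, 3)                      # placeholder garment template
shape = torch.randn(1, 10)                           # body shape parameters
poses = torch.randn(1, 30, 72)                       # 30-frame pose sequence
fit = fit_net(shape)                                 # (1, 4000, 3)
wrinkles, _ = wrinkle_net(poses, shape)              # (1, 30, 4000, 3)
garment = template + fit.unsqueeze(1) + wrinkles     # animated garment vertices
```

Keeping the shape-dependent fit and the pose-dependent wrinkles in separate regressors mirrors the paper's stated separation of global drape from local detail; only the wrinkle branch needs recurrence, since the global fit does not depend on motion history.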

URL

http://arxiv.org/abs/1903.07190

PDF

http://arxiv.org/pdf/1903.07190

