Sensitivity-optimized Rigging for Example-based Real-time Clothing Synthesis


We present a real-time solution for generating detailed clothing animations from pre-computed example clothing drapes. Given an input pose, our method synthesizes a clothing drape by blending clothing deformations predicted from nearby examples. We show that sensitivity analysis provides an optimal way to perform this prediction and blending. Sensitivity-optimized rigging computes each example’s plausible clothing deformation as a rigged mesh. The rigging’s weights are optimized so that its linear responses agree with an equilibrium simulation under small perturbations of the example pose. This compact rigging scheme models well the global influence of the underlying body motion on clothing deformation. We also develop a sensitivity-optimized blending scheme that measures the distance between poses according to their contribution to cloth deformation. For offline sampling, we propose a greedy scheme for sampling the pose space and computing example clothing drapes. Our solution is fast and compact, and it generates physically plausible clothing animations for various kinds of clothes in real time. We demonstrate the efficiency of our solution with results generated from different cloth types and body motions.
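The runtime stage described above — measuring pose distance in a sensitivity-weighted metric and blending nearby example drapes — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the sensitivity matrix `S`, the inverse-distance weighting, and the choice of `k` nearest examples are all assumptions for exposition.

```python
import numpy as np

def sensitivity_distance(pose_a, pose_b, S):
    # Distance between two pose vectors in a metric weighted by a
    # (hypothetical) sensitivity matrix S, so that pose parameters
    # with more influence on the cloth contribute more to distance.
    d = pose_a - pose_b
    return np.sqrt(d @ S @ d)

def blend_drapes(query_pose, example_poses, example_drapes, S, k=4):
    # Select the k examples nearest to the query pose and blend their
    # precomputed drapes with normalized inverse-distance weights.
    dists = np.array([sensitivity_distance(query_pose, p, S)
                      for p in example_poses])
    idx = np.argsort(dists)[:k]
    w = 1.0 / (dists[idx] + 1e-8)   # small epsilon avoids division by zero
    w /= w.sum()
    # Each drape is an (n_vertices, 3) array of positions; blend them.
    return np.tensordot(w, example_drapes[idx], axes=1)
```

By construction, querying exactly at an example pose returns (essentially) that example's drape, and queries between examples interpolate smoothly; the paper's actual scheme additionally rigs each example so its local linear response matches an equilibrium simulation.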
