Publication
Sensitivity-optimized Rigging for Example-based Real-time Clothing Synthesis
Abstract
We present a real-time solution for generating detailed clothing animations from pre-computed example clothing drapes. Given an input pose, our method synthesizes a clothing drape by blending clothing deformations predicted from nearby examples. We show that sensitivity analysis provides an optimal basis for this prediction-and-blending procedure. Sensitivity-optimized rigging computes each example's plausible clothing deformation as a rigged mesh; the rigging weights are optimized so that the rig's linear responses agree with an equilibrium simulation under small perturbations of the example pose. This compact rigging scheme captures the global influence of the underlying body motion on clothing deformation. We also develop a sensitivity-optimized blending scheme that measures the distance between poses according to their contribution to cloth deformation. For offline sampling, we propose a greedy scheme for sampling the pose space and computing example clothing drapes. Our solution is fast and compact, and it generates physically plausible clothing animations for various kinds of clothes in real time. We demonstrate its efficiency with results generated from different cloth types and body motions.
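To make the run-time synthesis step concrete, the following is a minimal sketch under stated assumptions: each example stores a first-order ("rigged") response around its pose, nearby examples each predict a drape for the query pose, and the predictions are blended with weights derived from a sensitivity-weighted pose distance. All names and shapes here (`Example`, `J`, `S`, `k`) are illustrative assumptions, not the paper's actual data structures.

```python
# Hypothetical sketch of sensitivity-weighted example blending (not the
# paper's implementation): linearized per-example prediction + blending.
import numpy as np
from dataclasses import dataclass

@dataclass
class Example:
    pose: np.ndarray   # (d,)   pose parameters at which the drape was simulated
    verts: np.ndarray  # (m, 3) equilibrium cloth vertices at that pose
    J: np.ndarray      # (m, 3, d) optimized linear response d(verts)/d(pose)

def predict(ex: Example, q: np.ndarray) -> np.ndarray:
    """First-order rigged prediction of the drape at query pose q."""
    return ex.verts + np.einsum('mjd,d->mj', ex.J, q - ex.pose)

def synthesize(q: np.ndarray, examples: list, S: np.ndarray, k: int = 4):
    """Blend the k nearest examples' predictions into one drape (m, 3).

    S is a (d,) vector of per-parameter sensitivities, so pose dimensions
    that barely influence the cloth barely affect the distance metric.
    """
    dist = np.array([np.sqrt(np.sum(S * (ex.pose - q) ** 2)) for ex in examples])
    idx = np.argsort(dist)[:k]                # k nearest examples
    w = 1.0 / (dist[idx] + 1e-8)              # inverse-distance blend weights
    w /= w.sum()
    return sum(wi * predict(examples[i], q) for wi, i in zip(w, idx))
```

In this sketch, the per-example responses `J` and the sensitivity weights `S` would come from the offline stage the abstract describes, where the rigging is fit to agree with equilibrium simulations under small pose perturbations.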