
Implicit Skinning: Real-Time Skin Deformation with Contact Modeling

Publication: ACM SIGGRAPH, 2013

Rodolphe Vaillant1,2, Loïc Barthe1, Gaël Guennebaud3, Marie-Paule Cani4, Damien Rohmer5,
Brian Wyvill6, Olivier Gourmel1 and Mathias Paulin1

1IRIT - Université de Toulouse, 2University of Victoria, 3Inria Bordeaux,
4LJK - Grenoble Universités - Inria, 5CPE Lyon - Inria, 6University of Bath


Abstract

Geometric skinning techniques, such as smooth blending or dual quaternions, are very popular in industry for their high performance, but fail to mimic realistic deformations. Other methods use physical simulation or control volumes to better capture skin behavior, yet they cannot deliver real-time feedback. In this paper, we present the first purely geometric method that handles skin contact effects and muscular bulges in real time. The insight is to exploit the advanced composition mechanisms of volumetric, implicit representations to correct the results of geometric skinning techniques. The mesh is first approximated by a set of implicit surfaces. At each animation step, these surfaces are combined in real time and used to adjust the positions of mesh vertices, starting from their smooth skinning positions. This deformation step causes no loss of detail and seamlessly handles contacts between skin parts. Since it acts as a post-process, our method fits well into the standard animation pipeline. Moreover, it requires no computationally intensive step such as collision detection, and therefore achieves real-time performance.
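The per-vertex correction step described above can be sketched in a few lines. This is only a minimal illustration under simplified assumptions, not the paper's implementation: each vertex stores the iso-value of the composed field at bind pose, and after smooth skinning it is marched back to that iso-value along the field gradient. The Gaussian blob field, the max-based union, and all names here (`field`, `grad`, `project_vertex`) are stand-ins; the actual method approximates the mesh with HRBF surfaces and combines them with gradient-based blending operators.

```python
import numpy as np

def field(p, centers, radii):
    """Hypothetical composed field: one Gaussian blob per bone,
    combined with max() as a crude stand-in for blending operators."""
    vals = np.exp(-np.sum((p - centers) ** 2, axis=1) / radii ** 2)
    return vals.max()

def grad(p, centers, radii, h=1e-4):
    """Central finite-difference gradient of the composed field."""
    g = np.zeros(3)
    for i in range(3):
        e = np.zeros(3)
        e[i] = h
        g[i] = (field(p + e, centers, radii) - field(p - e, centers, radii)) / (2 * h)
    return g

def project_vertex(p, iso, centers, radii, iters=30, eps=1e-6):
    """Newton-like march of a skinned vertex back to its bind-pose
    iso-value, i.e. back onto 'its' implicit surface."""
    for _ in range(iters):
        r = field(p, centers, radii) - iso
        if abs(r) < eps:
            break
        g = grad(p, centers, radii)
        p = p - r * g / (g @ g + 1e-12)  # step along the gradient
    return p
```

Running this correction as a post-process after smooth skinning is what lets the approach slot into a standard animation pipeline: the skinned positions are only the starting point of the projection.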

[ Paper 19.7MB ]

Video


[ Download video ] or [ Watch on YouTube ]

Presentation

[ Talk slides ] (PowerPoint 2013, with the script embedded in the notes)

Code

Here you will find the code used for the Implicit Skinning pipeline:

Figures

Need to use some figures from our paper? Go ahead:

Acknowledgments

This work was partially funded by the IM&M project (ANR-11-JS02-007) and the advanced grant EXPRESSIVE from the European Research Council. Partial funding also came from the Natural Sciences and Engineering Research Council of Canada, the GRAND NCE (Canada), and Intel Corp. Finally, this work received partial support from the Royal Society Wolfson Research Merit Award.

We thank the artists, companies, and universities who provided us with 3D models. The Juna model comes from Rogério Perdiz, and the Dana and Carl models from the company Mixamo. Finally, the famous armadillo model comes from the Stanford University 3D scan repository.

We also thank the Blender Foundation for providing everyone with the Blender software, which we used for some of the enhanced renderings in the video. Our rendering setup is a modified version of the Sintel Lite setup.