Data-driven Finger Motion Synthesis for Gesturing Characters
Capturing the body movements of actors to create animations for movies, games, and VR applications has become standard practice, but finger motions are usually added manually in a tedious post-processing step. In this paper, we present a surprisingly simple method to automate this step for gesturing and conversing characters. In a controlled environment, we carefully captured and post-processed finger and body motions from multiple actors. To augment the body motions of virtual characters with plausible and detailed finger movements, our method selects finger motion segments from the resulting database, taking into account the similarity of the arm motions and the smoothness of consecutive finger motions. Using leave-one-out cross-validation, we investigate which parts of the arm motion best discriminate between gestures, and we use the result as a metric to select appropriate finger motions. Our approach produces good results for a number of examples with different gesture types and is validated in a perceptual experiment.
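The selection idea described above can be sketched in code: for each arm-motion segment of a new body animation, pick the database finger-motion segment that minimizes a weighted sum of arm-motion dissimilarity and discontinuity with the previously chosen finger poses. The function names, feature representation, weights, and the greedy segment-by-segment strategy below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def arm_distance(query_arm, candidate_arm):
    """Mean squared difference between arm feature trajectories
    (e.g. wrist positions over a segment), both shaped (frames, dims).
    The feature choice is a placeholder for the paper's learned metric."""
    return float(np.mean((query_arm - candidate_arm) ** 2))

def transition_cost(prev_finger_end, candidate_finger_start):
    """Discontinuity between the last finger pose of the previously
    selected segment and the first pose of the candidate segment."""
    if prev_finger_end is None:  # first segment: no continuity term
        return 0.0
    return float(np.mean((prev_finger_end - candidate_finger_start) ** 2))

def select_finger_segments(query_arms, database, w_smooth=0.5):
    """Greedy selection. `database` is a list of (arm_features,
    finger_poses) pairs; `query_arms` lists one arm feature array per
    segment of the new animation. Returns the chosen finger segments."""
    chosen, prev_end = [], None
    for q in query_arms:
        best = min(
            database,
            key=lambda seg: arm_distance(q, seg[0])
                            + w_smooth * transition_cost(prev_end, seg[1][0]),
        )
        chosen.append(best[1])
        prev_end = best[1][-1]  # last finger pose of the chosen segment
    return chosen
```

A full system would likely optimize over the whole segment sequence (e.g. with dynamic programming) rather than greedily, so that a slightly worse match now can enable smoother transitions later.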

Paper and Video

SIGGRAPH Asia paper (preprint, PDF, 2 MB)
Video (187 MB)
SIGGRAPH Asia talk slides: PDF with embedded videos (101 MB) or without videos (7 MB); videos from slide 8 and slide 26.

Sophie Jörg, Jessica Hodgins, and Alla Safonova. Data-driven Finger Motion Synthesis for Gesturing Characters. ACM Transactions on Graphics (Proc. SIGGRAPH Asia), 31(6), November 2012. [bibtex]


Large Gesture Database
Conversations Database
Debates Database
Directions Database

Funding and Acknowledgments

This research was supported by:

Thanks to Justin Macey for post-processing motion capture data, Moshe Mahler for modeling the characters, Valeria Reznitskaya for video editing, and Brooke Kelly for labeling gestures.