Show simple item record

dc.contributor.author: Zordan, Victor B.
dc.contributor.author: Horst, Nicholas C. Van Der
dc.contributor.editor: D. Breen and M. Lin
dc.date.accessioned: 2014-01-29T06:32:27Z
dc.date.available: 2014-01-29T06:32:27Z
dc.date.issued: 2003
dc.identifier.isbn: 1-58113-659-5
dc.identifier.issn: 1727-5288
dc.identifier.uri: http://dx.doi.org/10.2312/SCA03/245-250
dc.description.abstract: Motion capture has become a premier technique for animating humanlike characters. To facilitate its use, researchers have focused on manipulating data for retargeting, editing, combining, and reusing motion capture libraries. Many of these efforts take joint angles plus root trajectories as input, although this format requires an inherent mapping from the raw data recorded by many popular motion capture set-ups. In this paper, we propose a novel solution to this mapping problem: from the 3D marker position data recorded by optical motion capture systems to joint trajectories for a fixed limb-length skeleton, using a forward dynamic model. To accomplish the mapping, we attach virtual springs between the marker positions and the corresponding landmarks on a physical simulation, and apply resistive torques to the skeleton's joints using a simple controller. For each motion capture sample, a joint-angle posture is resolved from the simulation's equilibrium state, based on the internal torques and external forces. Additional constraints, such as foot plants and hand holds, may also be treated as additional forces applied to the system and are a trivial and natural extension of the proposed technique. We present results for our approach as applied to several motion-captured behaviors.
dc.publisher: The Eurographics Association
dc.title: Mapping optical motion capture data to skeletal motion using a physical model
dc.description.seriesinformation: Symposium on Computer Animation
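
The abstract describes attaching virtual springs between recorded marker positions and landmarks on a simulated skeleton, applying resistive joint torques, and reading the joint angles off the simulation's equilibrium. A minimal sketch of that idea for a single revolute joint might look as follows; the spring stiffness, damping, inertia, and integration scheme are all illustrative assumptions, not values from the paper.

```python
import numpy as np

def map_marker_to_joint(marker, link_length=1.0, k=50.0, damping=5.0,
                        dt=0.01, steps=2000):
    """Spring-based marker-to-joint mapping for one revolute joint (sketch).

    A virtual spring pulls the end of a rigid link toward the recorded
    marker position; a resistive (damping) torque at the joint lets the
    forward-dynamics simulation settle into an equilibrium posture.
    """
    theta, omega = 0.0, 0.0          # joint angle and angular velocity
    inertia = 1.0                    # link inertia (assumed)
    for _ in range(steps):
        # Landmark on the simulated skeleton: the end of the link.
        landmark = link_length * np.array([np.cos(theta), np.sin(theta)])
        # Virtual spring force pulling the landmark toward the marker.
        force = k * (marker - landmark)
        # Jacobian-transpose maps the Cartesian force to a joint torque.
        jac = link_length * np.array([-np.sin(theta), np.cos(theta)])
        torque = jac @ force - damping * omega   # resistive joint torque
        # Semi-implicit Euler step of the forward dynamics.
        omega += (torque / inertia) * dt
        theta += omega * dt
    return theta   # equilibrium joint angle for this capture sample

# The link settles pointing at a marker placed on the unit circle.
angle = map_marker_to_joint(np.array([0.0, 1.0]))   # approaches pi/2
```

A full skeleton would repeat this per joint with the body's actual mass distribution and multiple markers per limb, but the equilibrium-seeking structure is the same.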

