Interval-Based Motion Blending for Hand Grasping
Abstract
For motion to appear realistic and believable, proper motion blending methods must be used with respect to the goal or task at hand. We present a method that extends the theory of move trees [MBC01] by tagging (attaching) information to each clip within a database at intervals and finding the shortest distance per tag while pruning the tree using a convergence priority. Our goal is to retain the physical characteristics of motion capture data while using non-destructive blending in a goal-based scenario. Given the intrinsically high dimensionality of the human hand, our method is also concerned with intelligent pruning of the move tree. By constructing a move tree for hand grasping scenarios that is sampled per interval within clips and adheres to a convergence priority, we plan to develop a method that will autonomously conform a hand to the object being grasped.
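The abstract only outlines the selection step, but the general idea of sampling clips at intervals, attaching pose tags, and pruning candidates in a convergence-priority order can be illustrated with a short sketch. The code below is not the authors' implementation: the interval length, the joint groupings used as the convergence priority, and all names (tag_clip, select_clip, CONVERGENCE_PRIORITY) are illustrative assumptions.

# Minimal sketch, assuming tags are raw joint-angle vectors sampled every
# few frames and that "convergence priority" means proximal joints are
# matched before distal ones. All names and constants are hypothetical.
import numpy as np

INTERVAL = 5  # sample every 5 frames (assumed value)

# Assumed priority order: palm/wrist first, then proximal, then distal joints.
CONVERGENCE_PRIORITY = [
    slice(0, 3),    # palm / wrist joint angles
    slice(3, 12),   # proximal finger joints
    slice(12, 21),  # distal finger joints
]

def tag_clip(frames, interval=INTERVAL):
    """Attach a tag (here, the joint-angle vector) to every Nth frame."""
    return [(i, frames[i]) for i in range(0, len(frames), interval)]

def select_clip(clips, target_pose, keep=0.5):
    """Pick the clip and interval closest to the target grasp pose.

    Candidates are pruned group by group in convergence-priority order:
    after each joint group, only the best `keep` fraction survives.
    """
    candidates = [(c, i, tag) for c in clips for i, tag in tag_clip(c["frames"])]
    for group in CONVERGENCE_PRIORITY:
        candidates.sort(key=lambda x: np.linalg.norm(x[2][group] - target_pose[group]))
        candidates = candidates[:max(1, int(len(candidates) * keep))]
    clip, frame, _ = candidates[0]
    return clip, frame

# Usage with synthetic data: 21 joint angles per frame.
clips = [{"name": f"grasp_{k}", "frames": np.random.rand(60, 21)} for k in range(10)]
target = np.random.rand(21)
best_clip, best_frame = select_clip(clips, target)
print(best_clip["name"], best_frame)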
BibTeX
@inproceedings {10.2312:LocalChapterEvents:TPCG:TPCG07:201-205,
booktitle = {Theory and Practice of Computer Graphics},
editor = {Ik Soo Lim and David Duce},
title = {{Interval-Based Motion Blending for Hand Grasping}},
author = {Brisbin, Matt and Benes, Bedrich},
year = {2007},
publisher = {The Eurographics Association},
ISBN = {978-3-905673-63-0},
DOI = {10.2312/LocalChapterEvents/TPCG/TPCG07/201-205}
}