A Novel Approach for Cooperative Motion Capture (COMOCAP)
Abstract
Conventional motion capture (MOCAP) systems, e.g., optical systems, typically perform well for one person, but less so for multiple people in close proximity. Measurement quality can decline with distance, and measurements can drop out entirely as source/sensor components are occluded by nearby people. Furthermore, conventional optical MOCAP systems estimate body posture using a global estimation approach employing cameras fixed in the environment, typically at a distance such that one person or object can easily occlude another, and the relative error between tracked objects in the scene can increase as they move farther from the cameras and/or closer to each other. Body-relative tracking approaches instead use body-worn sensors and/or sources to track limbs with respect to the head or torso, for example, taking advantage of the proximity of the limbs to the body. We present a novel approach to MOCAP that combines and extends conventional global and body-relative approaches by distributing both sensing and active signaling over each person's body, facilitating body-relative (intra-user) MOCAP for one person and body-body (inter-user) MOCAP for multiple people, in an approach we call cooperative motion capture (COMOCAP). We support the validity of the approach with simulation results from a system composed of acoustic transceivers (receiver-transmitter units) that provide inter-transceiver range measurements; optical, magnetic, and other types of transceivers could also be used. Our simulations demonstrate that this approach effectively improves accuracy and robustness to occlusions when multiple people are in close proximity.
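The abstract describes estimating body posture from inter-transceiver range measurements. As a rough illustration of how a position can be recovered from ranges to known points, the sketch below implements a generic 2-D Gauss-Newton least-squares multilateration step in pure Python. This is a hypothetical toy, not the paper's estimator: the function name, the anchor/range inputs, and the 2-D simplification are all assumptions for illustration only.

```python
import math

def trilaterate(anchors, ranges, iters=50):
    """Toy 2-D position estimate from ranges to known anchor points,
    via Gauss-Newton least squares. Illustrative only; NOT the
    COMOCAP estimator from the paper."""
    # Start from the centroid of the anchors.
    x = sum(a[0] for a in anchors) / len(anchors)
    y = sum(a[1] for a in anchors) / len(anchors)
    for _ in range(iters):
        # Accumulate normal equations (J^T J) d = J^T r, where each
        # residual is r_i = measured_i - ||p - anchor_i||.
        jtj = [[0.0, 0.0], [0.0, 0.0]]
        jtr = [0.0, 0.0]
        for (ax, ay), d in zip(anchors, ranges):
            dx, dy = x - ax, y - ay
            dist = math.hypot(dx, dy) or 1e-9
            jx, jy = dx / dist, dy / dist  # Jacobian row of the range
            r = d - dist
            jtj[0][0] += jx * jx; jtj[0][1] += jx * jy
            jtj[1][0] += jy * jx; jtj[1][1] += jy * jy
            jtr[0] += jx * r;     jtr[1] += jy * r
        det = jtj[0][0] * jtj[1][1] - jtj[0][1] * jtj[1][0]
        if abs(det) < 1e-12:
            break  # Degenerate geometry (e.g., collinear anchors).
        # Solve the 2x2 system by Cramer's rule and update the estimate.
        step_x = (jtr[0] * jtj[1][1] - jtr[1] * jtj[0][1]) / det
        step_y = (jtj[0][0] * jtr[1] - jtj[1][0] * jtr[0]) / det
        x += step_x; y += step_y
        if math.hypot(step_x, step_y) < 1e-10:
            break
    return x, y
```

In a COMOCAP-like setting the "anchors" would themselves be body-worn transceivers on the same or another person, so the full problem is a joint estimate over many moving nodes rather than this fixed-anchor simplification.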
BibTeX
@inproceedings{10.2312:egve.20181317,
booktitle = {ICAT-EGVE 2018 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments},
editor = {Bruder, Gerd and Yoshimoto, Shunsuke and Cobb, Sue},
title = {{A Novel Approach for Cooperative Motion Capture (COMOCAP)}},
author = {Welch, Gregory and Wang, Tianren and Bishop, Gary and Bruder, Gerd},
year = {2018},
publisher = {The Eurographics Association},
ISSN = {1727-530X},
ISBN = {978-3-03868-058-1},
DOI = {10.2312/egve.20181317}
}