
dc.contributor.author: Schubert, Ryan
dc.contributor.author: Bruder, Gerd
dc.contributor.author: Welch, Gregory
dc.contributor.editor: Bruder, Gerd; Yoshimoto, Shunsuke; Cobb, Sue
dc.date.accessioned: 2018-11-06T16:07:31Z
dc.date.available: 2018-11-06T16:07:31Z
dc.date.issued: 2018
dc.identifier.isbn: 978-3-03868-058-1
dc.identifier.issn: 1727-530X
dc.identifier.uri: https://doi.org/10.2312/egve.20181316
dc.identifier.uri: https://diglib.eg.org:443/handle/10.2312/egve20181316
dc.description.abstract: Spatial Augmented Reality (SAR), e.g., based on monoscopic projected imagery on physical three-dimensional (3D) surfaces, can be particularly well-suited for ad hoc group or multi-user augmented reality experiences since it does not encumber users with head-worn or carried devices. However, conveying a notion of realistic 3D shapes and movements on SAR surfaces using monoscopic imagery is a difficult challenge. While previous work focused on physical actuation of such surfaces to achieve geometrically dynamic content, we introduce a different concept, which we call "Synthetic Animatronics," i.e., conveying geometric movement or deformation purely through manipulation of the imagery shown on a static display surface. We present a model for the distribution of the viewpoint-dependent distortion that occurs when there are discrepancies between the physical display surface and the virtual object being represented, and describe a real-time implementation of a method for adaptively filtering the imagery based on an approximation of the expected potential error. Finally, we describe an existing physical SAR setup well-suited for synthetic animatronics and a corresponding Unity-based SAR simulator that allows flexible exploration and validation of the technique and its parameters.
dc.publisher: The Eurographics Association
dc.subject: Computing methodologies
dc.subject: Rendering
dc.subject: Mixed / augmented reality
dc.subject: Perception
dc.subject: Simulation support systems
dc.title: Adaptive Filtering of Physical-Virtual Artifacts for Synthetic Animatronics
dc.description.seriesinformation: ICAT-EGVE 2018 - International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments
dc.description.sectionheaders: Sensing and Rendering
dc.identifier.doi: 10.2312/egve.20181316
dc.identifier.pages: 65-72
