dc.contributor.author | Abate, Andrea F. | en_US |
dc.contributor.author | Nappi, Michele | en_US |
dc.contributor.author | Ricciardi, Stefano | en_US |
dc.contributor.author | Tortora, Genoveffa | en_US |
dc.contributor.editor | Enrico Puppo and Andrea Brogni and Leila De Floriani | en_US |
dc.date.accessioned | 2014-01-27T16:34:08Z | |
dc.date.available | 2014-01-27T16:34:08Z | |
dc.date.issued | 2010 | en_US |
dc.identifier.isbn | 978-3-905673-80-7 | en_US |
dc.identifier.uri | http://dx.doi.org/10.2312/LocalChapterEvents/ItalChap/ItalianChapConf2010/033-040 | en_US |
dc.description.abstract | This paper presents a framework providing a collection of techniques to enhance the reliability, accuracy and overall effectiveness of gesture-based interaction applied to the manipulation of virtual objects within a Mixed Reality context. We propose an approach characterized by a floating interface, operated by two-hand gestures, for an enhanced manipulation of 3D objects. Our interaction paradigm exploits one-hand, two-hand and time-dependent gesture patterns to allow the user to perform inherently 3D tasks, like arbitrary object rotation or measurement of relevant features, in a more intuitive yet accurate way. A real-time adaptation to the user's needs is performed by monitoring hand and finger motions, in order to allow both direct manipulation of virtual objects and conventional keyboard-like operations. The interface layout, whose details depend on the particular application at hand, is visualized via a stereoscopic see-through Head Mounted Display (HMD). It projects virtual interface elements, as well as application-related virtual objects, in the central region of the user's field of view, floating in a close-at-hand volume. The application presented here is targeted at interactive 3D visualization of human anatomy resulting from diagnostic imaging or from virtual models aimed at training activities. The testing conducted so far shows a measurable and user-perceptible improvement in performing 3D interactive tasks, like the selection of a particular spot on a complex 3D surface or the distance measurement between two 3D landmarks. This study includes both qualitative and quantitative reports on the system usability. | en_US |
dc.publisher | The Eurographics Association | en_US |
dc.subject | Categories and Subject Descriptors (according to ACM CCS): H.5.1 [Multimedia Information Systems]: Artificial, augmented, and virtual realities | en_US |
dc.title | Mixed Reality and Gesture Based Interaction for Medical Imaging Applications | en_US |
dc.description.seriesinformation | Eurographics Italian Chapter Conference 2010 | en_US |