Gesture-Based Communication in Human-Computer Interaction [electronic resource] : International Gesture Workshop, GW’99, Gif-sur-Yvette, France, March 17-19, 1999, Proceedings

Human Perception and Production of Gesture -- Seeing Biological Motion - Is There a Role for Cognitive Strategies? -- The Expressive Power of Gestures: Capturing Scent in a Spatial Shape -- Non-obvious Performer Gestures in Instrumental Music -- The Ecological Approach to Multimodal System Design -- Analysis of Trunk and Upper Limb Articular Synergies
Localisation and Segmentation -- GREFIT: Visual Recognition of Hand Postures -- Towards Imitation Learning of Grasping Movements by an Autonomous Robot -- A Line-Scan Computer Vision Algorithm for Identifying Human Body Features -- Hand Posture Recognition in a Body-Face Centered Space
Recognition -- Vision-Based Gesture Recognition: A Review -- Person Localization and Posture Recognition for Human-Robot Interaction -- Statistical Gesture Recognition Through Modelling of Parameter Trajectories -- Gesture Recognition for Visually Mediated Interaction -- Interpretation of Pointing Gestures: The PoG System -- Control of In-vehicle Systems by Gestures
Sign Language -- French Sign Language: Proposition of a Structural Explanation by Iconicity -- HMM-Based Continuous Sign Language Recognition Using Stochastic Grammars -- A Method for Analyzing Spatial Relationships Between Words in Sign Language Recognition -- Toward Scalability in ASL Recognition: Breaking Down Signs into Phonemes
Gesture Synthesis and Animation -- A Complete System for the Specification and the Generation of Sign Language Gestures -- Sign Specification and Synthesis -- Active Character: Dynamic Reaction to the User -- Reactiva’Motion Project: Motion Synthesis Based on a Reactive Representation -- The Emotional Avatar: Non-verbal Communication Between Inhabitants of Collaborative Virtual Environments
Multimodality -- Communicative Rhythm in Gesture and Speech -- Temporal Symbolic Integration Applied to a Multimodal System Using Gestures and Speech -- A Multimodal Interface Framework for Using Hand Gestures and Speech in Virtual Environment Applications
Round Table -- Stimulating Research into Gestural Human Machine Interaction.

Bibliographic Details
Main Authors: Braffort, Annelies (editor); Gherbi, Rachid (editor); Gibet, Sylvie (editor); Teil, Daniel (editor); Richardson, James (editor); SpringerLink (Online service)
Format: Text (library material)
Language: English
Published: Berlin, Heidelberg : Springer Berlin Heidelberg, 1999
Subjects: Computer science; User interfaces (Computer systems); Artificial intelligence; Computational linguistics; Computer graphics; Image processing; Pattern recognition; Computer Science; Language Translation and Linguistics; User Interfaces and Human Computer Interaction; Image Processing and Computer Vision; Artificial Intelligence (incl. Robotics); Pattern Recognition; Computer Graphics
Online Access: http://dx.doi.org/10.1007/3-540-46616-9