Project Title
Sign Language and Mouth Gesture Recognition
Start Date
October 1, 2016
End Date
December 31, 2020
In sign languages there are two different kinds of mouth patterns. Mouthing is derived from a spoken language, and its detection is therefore very similar to lip reading. Mouth gestures, on the other hand, have formed within the sign languages themselves and bear no relation to spoken languages [1].
In this project we aim at the automatic detection and classification of mouth gestures in German Sign Language (DGS), both to facilitate research on the usage of mouth gestures and to make progress on the open problem of reliable automatic sign language recognition.
To this end we are developing a deep neural network that can be trained to learn the spatiotemporal features of mouth gestures.
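The project description does not specify the network architecture, so the following is only a minimal sketch of one common way to learn spatiotemporal features from short mouth-region video clips: a small 3D CNN in PyTorch. The clip dimensions, layer sizes, and number of gesture classes are illustrative assumptions, not the project's actual model.

```python
# Minimal illustrative sketch (assumed architecture, not the project's model):
# a small 3D CNN that classifies short mouth-region clips. 3D convolutions
# operate over time and space jointly, so the learned features are
# spatiotemporal by construction.
import torch
import torch.nn as nn

class MouthGesture3DCNN(nn.Module):
    def __init__(self, num_classes: int = 10):  # number of classes is an assumption
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool3d(kernel_size=2),      # halves frames, height, and width
            nn.Conv3d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),          # global spatiotemporal pooling
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, clips: torch.Tensor) -> torch.Tensor:
        # clips: (batch, channels, frames, height, width)
        x = self.features(clips)
        return self.classifier(x.flatten(1))

# Example: a batch of 2 clips, each 16 frames of 64x64 RGB mouth crops.
model = MouthGesture3DCNN(num_classes=10)
logits = model(torch.randn(2, 3, 16, 64, 64))  # -> shape (2, 10)
```

Trained with a standard cross-entropy loss on labeled clips, such a network would learn motion-and-appearance features of the mouth region; recurrent or transformer-based variants are equally plausible choices for the temporal modeling.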
[1] Keller, J. (2001). Multimodal Representation and the Linguistic Status of Mouthings in German Sign Language (DGS). In The Hands are the Head of the Mouth: The Mouth as Articulator in Sign Languages (pp. 191-230). Hamburg: Signum Verlag.