TUHH Open Research
Optimised preprocessing for automatic mouth gesture classification

Citation Link: https://doi.org/10.15480/882.3614
Publication Type
Conference Paper
Date Issued
2020
Language
English
Author(s)
Brumm, Maren  
Grigat, Rolf-Rainer  
Institute
Bildverarbeitungssysteme E-2  
TORE-DOI
10.15480/882.3614
TORE-URI
http://hdl.handle.net/11420/9735
Start Page
27
End Page
32
Citation
Proceedings of the 9th Workshop on the Representation and Processing of Sign Languages: 27–32 (2020)
Contribution to Conference
12th International Conference on Language Resources and Evaluation, LREC  
Publisher
European Language Resources Association (ELRA)
Abstract
Mouth gestures are facial expressions in sign language that do not refer to the lip patterns of a spoken language. Research on this topic has been limited so far. The aim of this work is to automatically classify mouth gestures from video material by training a neural network. This could render time-consuming manual annotation unnecessary and help advance the field of automatic sign language translation. However, it is a challenging task due to the small amount of data available as training material and the similarity of different mouth gesture classes. In this paper we focus on the preprocessing of the data, such as finding the area of the face that is important for mouth gesture recognition. Furthermore, we analyse the duration of mouth gestures and determine the optimal length of video clips for classification. Our experiments show that this can improve the classification results significantly and helps to reach near-human accuracy.
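
The abstract describes two preprocessing steps: cropping the part of the face that matters for mouth gestures and cutting the video into clips of a suitable length. The following is a minimal illustrative sketch of that kind of preprocessing; the use of OpenCV's Haar-cascade face detector, the lower-half crop, and the clip length of 32 frames are assumptions made here for illustration, not the authors' actual pipeline.

import cv2
import numpy as np

# Haar-cascade face detector shipped with OpenCV (an illustrative choice,
# not necessarily the detector used in the paper).
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def crop_mouth_region(frame):
    """Detect the largest face in a BGR frame and return its lower half,
    where mouth gestures are visible."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # keep the largest detection
    return frame[y + h // 2 : y + h, x : x + w]

def fixed_length_clips(frames, clip_len=32):
    """Split a sequence of preprocessed frames into non-overlapping clips
    of clip_len frames, ready to be fed to a video classifier."""
    return [np.stack(frames[i:i + clip_len])
            for i in range(0, len(frames) - clip_len + 1, clip_len)]

Clips produced this way would then be passed to a neural-network classifier; the paper's point is that choosing the cropped region and the clip length carefully has a significant effect on classification accuracy.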
Subjects
Sign Language Recognition/Generation
Machine Translation
SpeechToSpeech Translation
Statistical and Machine Learning Methods
DDC Class
004: Computer Science
More Funding Information
This publication has been produced in the context of the joint research funding of the German Federal Government and Federal States in the Academies’ Programme, with funding from the Federal Ministry of Education and Research and the Free and Hanseatic City of Hamburg. The Academies’ Programme is coordinated by the Union of the German Academies of Sciences and Humanities.
Publication version
publishedVersion
License
https://creativecommons.org/licenses/by-nc/4.0/
Name
2020.signlanglrec-1.5.pdf
Size
637.79 KB
Format
Adobe PDF