TUHH Open Research
Sensor fusion for the robust detection of facial regions of neonates using neural networks

Citation Link: https://doi.org/10.15480/882.5152
Publication Type
Journal Article
Date Issued
2023-05-19
Language
English
Author(s)
Gleichauf, Johanna  
Hennemann, Lukas  
Fahlbusch, Fabian  
Hofmann, Oliver  
Niebler, Christine  
Kölpin, Alexander
Institute
Hochfrequenztechnik E-3  
TORE-DOI
10.15480/882.5152
TORE-URI
http://hdl.handle.net/11420/15355
Journal
Sensors  
Volume
23
Issue
10
Article Number
4910
Citation
Sensors 23 (10): 4910 (2023)
Publisher DOI
10.3390/s23104910
Scopus ID
2-s2.0-85160396718
Publisher
Multidisciplinary Digital Publishing Institute
Abstract
The monitoring of vital signs and increasing patient comfort are cornerstones of modern neonatal intensive care. Commonly used monitoring methods are based on skin contact which can cause irritations and discomfort in preterm neonates. Therefore, non-contact approaches are the subject of current research aiming to resolve this dichotomy. Robust neonatal face detection is essential for the reliable detection of heart rate, respiratory rate and body temperature. While solutions for adult face detection are established, the unique neonatal proportions require a tailored approach. Additionally, sufficient open-source data of neonates in the NICU is lacking. We set out to train neural networks with the thermal-RGB-fusion data of neonates. We propose a novel indirect fusion approach including the sensor fusion of a thermal and RGB camera based on a 3D time-of-flight (ToF) camera. Unlike other approaches, this method is tailored for close distances encountered in neonatal incubators. Two neural networks were used with the fusion data and compared to RGB and thermal networks. For the class “head” we reached average precision values of 0.9958 (RetinaNet) and 0.9455 (YOLOv3) for the fusion data. Compared with the literature, similar precision was achieved, but we are the first to train a neural network with fusion data of neonates. The advantage of this approach is in calculating the detection area directly from the fusion image for the RGB and thermal modality. This increases data efficiency by 66%. Our results will facilitate the future development of non-contact monitoring to further improve the standard of care for preterm neonates.
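The abstract reports average precision (AP) for the class “head”. As a point of reference for that metric, the following is a minimal illustrative sketch of how AP is typically computed as the area under the precision–recall curve from ranked detections; the data and function name are hypothetical and not taken from the paper.

```python
def average_precision(scores, labels):
    """AP as area under the precision-recall curve.

    scores: confidence score per detection (hypothetical values).
    labels: 1 if the detection matched a ground-truth object (true
            positive), 0 otherwise (false positive).
    Assumes every ground-truth object appears as a true positive in
    `labels`, so the total positives equal sum(labels).
    """
    # Rank detections by descending confidence.
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    n_pos = sum(labels)
    tp = fp = 0
    ap = 0.0
    prev_recall = 0.0
    for i in order:
        if labels[i]:
            tp += 1
        else:
            fp += 1
        recall = tp / n_pos
        precision = tp / (tp + fp)
        # Accumulate precision weighted by the recall increment.
        ap += precision * (recall - prev_recall)
        prev_recall = recall
    return ap

# Hypothetical ranked detections: three true positives, one false positive.
print(average_precision([0.9, 0.8, 0.6, 0.4], [1, 1, 0, 1]))  # ≈ 0.9167
```

Object-detection benchmarks refine this with an IoU threshold for matching detections to ground truth and often interpolate the precision curve, but the ranking-and-accumulation idea above is the core of the reported AP values.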
Subjects
non-contact monitoring
neonates
sensor fusion
neural network
face detection
MLE@TUHH
DDC Class
530: Physics
600: Technology
610: Medicine
620: Engineering
Funding Organisations
Bundesministerium für Bildung und Forschung (BMBF)  
Publication version
publishedVersion
License
https://creativecommons.org/licenses/by/4.0/
File
Name: sensors-23-04910-v2.pdf
Size: 18.92 MB
Format: Adobe PDF