Please use this identifier to cite or link to this item:
Publisher DOI: 10.1016/j.procir.2022.02.184
Title: A methods-time-measurement based approach to enable action recognition for multi-variant assembly in Human-Robot Collaboration
Language: English
Authors: Koch, Julian; Büsch, Lukas; Gomse, Martin; Schüppstuhl, Thorsten
Keywords: Artificial Neural Network;Assembly;Assembly Step Recognition;Azure Kinect;Human Action Recognition;Human-Robot Collaboration;Industry 4.0;Methods-Time-Measurement;Particle Swarm Optimization;Skeleton Based Action Recognition
Issue Date: 10-Mar-2022
Publisher: Elsevier
Source: Procedia CIRP 106: 233-238 (2022)
Journal: Procedia CIRP 
Abstract (English): 
Action Recognition (AR) has become a popular approach to ensuring efficient and safe Human-Robot Collaboration. Current research approaches are mostly optimized for specific assembly processes and settings. This paper introduces a novel approach that extends the field of AR to multi-variant assembly processes. The approach is based on generalized action primitives derived from Methods-Time-Measurement (MTM) analysis, which are detected by an AR system using skeletal data. Subsequently, a search algorithm combines the information from AR and MTM to provide an estimate of the assembly progress. One possible implementation is shown in a proof of concept, and results as well as future work are discussed.
Conference: 9th CIRP Conference on Assembly Technology and Systems, CATS 2022 
DOI: 10.15480/882.4323
ISSN: 2212-8271
Institute: Flugzeug-Produktionstechnik M-23 
Document Type: Chapter/Article (Proceedings)
License: CC BY-NC-ND 4.0 (Attribution-NonCommercial-NoDerivatives)
Appears in Collections: Publications with fulltext

Files in This Item:
File: 1-s2.0-S2212827122001858-main.pdf
Description: Verlags-PDF (publisher's PDF)
Size: 709.62 kB
Format: Adobe PDF

