Please use this identifier to cite or link to this item:
https://doi.org/10.15480/882.4323
Publisher DOI: 10.1016/j.procir.2022.02.184
Title: A methods-time-measurement based approach to enable action recognition for multi-variant assembly in Human-Robot Collaboration
Language: English
Authors: Koch, Julian; Büsch, Lukas; Gomse, Martin; Schüppstuhl, Thorsten
Keywords: Artificial Neural Network; Assembly; Assembly Step Recognition; Azure Kinect; Human Action Recognition; Human-Robot Collaboration; Industry 4.0; Methods-Time-Measurement; Particle Swarm Optimization; Skeleton Based Action Recognition
Issue Date: 10-Mar-2022
Publisher: Elsevier
Source: Procedia CIRP 106: 233-238 (2022)
Journal: Procedia CIRP
Abstract (English): Action Recognition (AR) has become a popular approach to ensuring efficient and safe Human-Robot Collaboration. Current research approaches are mostly optimized for specific assembly processes and settings. This paper introduces a novel approach that extends the field of AR to multi-variant assembly processes. The approach is based on generalized action primitives derived from Methods-Time-Measurement (MTM) analysis, which are detected by an AR system using skeletal data. Subsequently, a search algorithm combines the information from AR and MTM to provide an estimate of the assembly progress. One possible implementation is shown in a proof of concept, and results as well as future work are discussed.
Conference: 9th CIRP Conference on Assembly Technology and Systems, CATS 2022
URI: http://hdl.handle.net/11420/12418
DOI: 10.15480/882.4323
ISSN: 2212-8271
Institute: Flugzeug-Produktionstechnik M-23
Document Type: Chapter/Article (Proceedings)
Appears in Collections: | Publications with fulltext |
Files in This Item:
File | Description | Size | Format
---|---|---|---
1-s2.0-S2212827122001858-main.pdf | Publisher's PDF | 709.62 kB | Adobe PDF
This item is licensed under a Creative Commons License