Sensor-based human–robot collaboration for industrial tasks
Angleraud, Alexandre; Ekrekli, Akif; Samarawickrama, Kulunu; Sharma, Gaurang; Pieters, Roel (2024-04)
The permanent address of the publication is
https://urn.fi/URN:NBN:fi:tuni-202311069411
Description
Peer reviewed
Abstract
Collaboration between human and robot requires interaction modalities that suit the context of the shared tasks and the environment in which they take place. While an industrial environment can be tailored to favor certain conditions (e.g., lighting), some limitations cannot be addressed as easily (e.g., noise, dirt). In addition, operators are typically continuously active and cannot spare long periods away from their tasks to engage with physical user interfaces. Sensor-based approaches that recognize humans and their actions in order to interact with a robot therefore have great potential. This work demonstrates how human–robot collaboration can be supported by visual perception models for the detection of objects, targets, humans, and their actions. For each model, we present details on the required data, the training of the model, and its inference on real images. Moreover, we provide all developments for the integration of the models into an industrially relevant use case, in terms of software for training data generation and human–robot collaboration experiments. These are available open-source in the OpenDR toolkit at https://github.com/opendr-eu/opendr. Results are discussed in terms of the performance and robustness of the models, and their limitations. Although the results are promising, learning-based models are not trivial to apply to new situations or tasks. Therefore, we discuss the challenges identified when integrating them into an industrially relevant environment.
Collections
- TUNICRIS-julkaisut [19236]