MIA

Human-Robot Interaction at the Workplace

(Mensch-Roboter Interaktion im Arbeitsleben bewegungseingeschränkter Personen)


Motivation

Human-robot workplaces, where people and robots work together cooperatively, are part of the industry of tomorrow. This industry integrates new services under the so-called "Workplace as a Service" model, in which each service can be handled individually. In a workplace setting, people often already have their hands occupied with different tasks, which leaves room for exploring new communication and interaction technologies, including robotic systems. Moreover, people with disabilities may benefit from this type of technology, since it can improve their integration into the labor market.

Goals and Approach

In the MIA research project, innovative sensor technologies and interaction designs are being developed to make complex robot control manageable for people who are able to move their heads and eyes. We intend to use technologies such as inertial measurement units (IMUs), eye tracking, and electrooculography (EOG), and to provide feedback through augmented reality. In this context, our research is oriented towards testing and evaluating new concepts for robot control and interaction possibilities for humans.
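
To make this concept more concrete, the following minimal Python sketch shows how head orientation measured by an IMU could be mapped to Cartesian velocity commands for a robot arm, with a dead zone around the neutral pose to suppress unintended drift. It is an illustrative sketch only; the names (HeadPose, map_head_to_velocity) and the parameter values are assumptions and not taken from the MIA implementation.

from dataclasses import dataclass

@dataclass
class HeadPose:
    yaw: float    # head rotation left/right, in degrees (positive = right)
    pitch: float  # head rotation up/down, in degrees (positive = up)

def map_head_to_velocity(pose, dead_zone_deg=5.0, gain=0.004):
    """Convert a head pose into (vx, vy) end-effector velocities in m/s."""
    def axis(angle):
        # Inside the dead zone the robot does not move, which suppresses
        # unintended drift while the user holds the head roughly still.
        if abs(angle) < dead_zone_deg:
            return 0.0
        # Outside the dead zone, velocity grows linearly with the head angle.
        return gain * (abs(angle) - dead_zone_deg) * (1.0 if angle > 0 else -1.0)
    return axis(pose.yaw), axis(pose.pitch)

if __name__ == "__main__":
    # Head turned 15 degrees to the right and 2 degrees up:
    # sideways motion only, since the pitch stays inside the dead zone.
    print(map_head_to_velocity(HeadPose(yaw=15.0, pitch=2.0)))

A linear mapping with a dead zone is only one possible design; eye tracking or EOG input could be mapped in a similar way, or used instead to switch modes or select targets.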

Innovations and Perspectives

The research results will enable the design of a new collaborative human-robot workplace. Since the research is grounded in empirical studies, a library and a manufacturing company have been chosen as the scenarios for testing our hypotheses.

Current Work

Our current approaches focus on the following topics:

  • Ethnographic analysis of a sheltered workshop (Büngern Technik) for people with mobility impairments.
  • Understanding modality choices under changing environmental conditions as a potential approach for teleoperation.
  • Evaluating the use of augmented reality, specifically augmented visual cues, for robot teleoperation and assisted teleoperation (a minimal sketch of such a cue follows this list).
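
The sketch below illustrates one possible augmented visual cue of the kind mentioned in the last item: colour-coding the distance between the robot's gripper and a target object so the operator can judge depth alignment during teleoperation. All positions, the reach threshold, and the function names are hypothetical assumptions for illustration, not the project's actual implementation; in an AR headset the returned colour would drive a rendered overlay rather than console output.

import math

def distance(a, b):
    """Euclidean distance between two 3D points (x, y, z) in metres."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def cue_colour(dist_m, reach_m=0.30):
    """Fade an overlay colour from red (far) to green (aligned) as the gripper nears the target."""
    t = max(0.0, min(1.0, 1.0 - dist_m / reach_m))  # 0 = far away, 1 = on target
    return int(255 * (1.0 - t)), int(255 * t), 0    # simple red-to-green blend

if __name__ == "__main__":
    gripper = (0.42, 0.10, 0.25)  # hypothetical end-effector position
    target = (0.40, 0.12, 0.05)   # hypothetical object position
    d = distance(gripper, target)
    print(f"distance {d:.3f} m -> cue colour (R, G, B) = {cue_colour(d)}")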

Publications

Arévalo-Arboleda, Stephanie

Towards a Human-Robot Interaction Design for People with Motor Disabilities by Enhancing the Visual Space (PhD thesis, MIA)

2022.

Tags: assistive robotics, augmented reality

Arévalo-Arboleda, Stephanie; Becker, Marvin; Gerken, Jens

Does One Size Fit All? A Case Study to Discuss Findings of an Augmented Hands-Free Robot Teleoperation Concept for People with and without Motor Disabilities (Journal article, MIA)

In: Technologies, 10 (1), 2022, ISSN: 2227-7080.

Tags: augmented reality, case study, hands-free interaction, learning points, people with motor disabilities, robot teleoperation

Arévalo-Arboleda, Stephanie; Dierks, Tim; Ruecker, Franziska; Gerken, Jens

Exploring the Visual Space to Improve Depth Perception in Robot Teleoperation using Augmented Reality: The Role of Distance and Target’s Pose in Time, Success, and Certainty (Conference paper, MIA)

In: Carmelo Ardito, Rosa Lanzilotti, Alessio Malizia (Eds.): Human-Computer Interaction – INTERACT 2021, Springer, Cham, 2021.

Tags: augmented reality, depth perception, human-robot interaction, user study

Arévalo-Arboleda, Stephanie; Pascher, Max; Baumeister, Annalies; Klein, Barbara; Gerken, Jens

Reflecting upon Participatory Design in Human-Robot Collaboration for People with Motor Disabilities: Challenges and Lessons Learned from Three Multiyear Projects (Conference paper, MIA)

In: The 14th PErvasive Technologies Related to Assistive Environments Conference - PETRA 2021, ACM, 2021, ISBN: 978-1-4503-8792-7/21/06.

Tags: accessibility design, human-robot collaboration, lessons learned, participatory design

Arévalo-Arboleda, Stephanie; Ruecker, Franziska; Dierks, Tim; Gerken, Jens

Assisting Manipulation and Grasping in Robot Teleoperation with Augmented Reality Visual Cues (Conference paper, MIA)

In: CHI Conference on Human Factors in Computing Systems (CHI '21), ACM, 2021, ISBN: 978-1-4503-8096-6/21/05.

Tags: augmented reality, depth perception, hands-free interaction, human-robot interaction, teleoperation, visual cues

Arévalo-Arboleda, Stephanie; Pascher, Max; Lakhnati, Younes; Gerken, Jens

Understanding Human-Robot Collaboration for People with Mobility Impairments at the Workplace, a Thematic Analysis (Conference paper, MIA)

In: RO-MAN 2020 - IEEE International Conference on Robot and Human Interactive Communication, IEEE, 2020, ISBN: 978-1-7281-6075-7.

Tags: assistive robotics, creating human-robot relationships, hri and collaboration in manufacturing environments

Dierks, Tim

Visual Cues: Integration of object pose recognition with an augmented reality system as means to support visual perception in human-robot control (Thesis, MIA)

Westfälische Hochschule, Gelsenkirchen, 2020.

Tags: augmented reality, hands-free interaction, human-robot interaction, pose recognition

Ruecker, Franziska

Visuelle Helfer: Ein Augmented Reality Prototyp zur Unterstützung der visuellen Wahrnehmung für die Steuerung eines Roboterarms [Visual helpers: An augmented reality prototype to support visual perception in controlling a robot arm] (Thesis, MIA)

Westfälische Hochschule, Gelsenkirchen, 2020.

Tags: augmented reality, evaluation, hands-free interaction, human-robot interaction

Arévalo-Arboleda, Stephanie; Dierks, Tim; Ruecker, Franziska; Gerken, Jens

There’s More than Meets the Eye: Enhancing Robot Control through Augmented Visual Cues (Conference paper, MIA)

In: HRI 2020 - ACM/IEEE International Conference on Human-Robot Interaction, 2020, ISBN: 978-1-4503-7057.

Tags: augmented reality, human-robot interaction, visualization

Arévalo-Arboleda, Stephanie; Miller, Stanislaw; Janka, Martha; Gerken, Jens

What's behind a choice? Understanding Modality Choices under Changing Environmental Conditions (Conference paper, MIA)

In: ICMI '19: 2019 International Conference on Multimodal Interaction, pp. 291-301, 2019, ISBN: 978-1-4503-6860-5.

Tags: hands-free interaction, modality choices, multimodality, point and select

Wöhle, Lukas; Miller, Stanislaw; Gerken, Jens; Gebhard, Marion

A Robust Interface for Head Motion based Control of a Robot Arm using MARG and Visual Sensors (Conference paper, MIA)

In: 2018 IEEE International Symposium on Medical Measurements and Applications (MeMeA), Rome, Italy, 2018.

Tags: hybrid sensor system, kalman filter, magnetic immune, orientation, sensor fusion, state machine

Arévalo-Arboleda, Stephanie; Pascher, Max; Gerken, Jens

Opportunities and Challenges in Mixed-Reality for an Inclusive Human-Robot Collaboration Environment (Conference paper, MIA)

In: Proceedings of the 2018 International Workshop on Virtual, Augmented, and Mixed Reality for Human-Robot Interactions (VAM-HRI) as part of the ACM/IEEE Conference on Human-Robot Interaction, pp. 83–86, Chicago, USA, 2018.

Tags: human-robot collaboration, mixed-reality, robot control, severe motor impaired