Human-Robot Interaction at the Workplace

(Mensch-Roboter Interaktion im Arbeitsleben bewegungseingeschränkter Personen — human-robot interaction in the working life of people with restricted mobility)


Human-robot workplaces, where people and robots work together cooperatively, are part of the industry of tomorrow. This industry integrates new services under the so-called "Workplace as a Service" model, in which each service can be handled individually. In a workplace setting, people's hands are already occupied with different tasks, leaving room to explore new communication and interaction technologies, which may include a robotic system. Moreover, people with disabilities may benefit from these types of technologies, since they can increase their integration into the labor market.

Goals and Approach

In the MIA research project, innovative sensor technologies and interaction designs are being developed to make complex robot control manageable for people who are able to move their heads and eyes. We intend to use technologies such as inertial measurement units (IMUs), eye tracking, and electrooculography (EOG), as well as to provide feedback through augmented reality. In this context, our research is oriented towards testing and evaluating new concepts for robot control and human-robot interaction.

Innovations and Perspectives

The research results will enable the design of a new collaborative human-robot workplace. Since this research is grounded in empirical studies, a library and a manufacturing company have been chosen as the scenarios for testing our hypotheses.

Current Work

Our current approaches focus on the following topics:

  • Ethnographic analysis of a sheltered workshop (Büngern Technik) for people with mobility impairments.
  • Understanding modality choices under changing environmental conditions, as a potential approach for teleoperation.
  • Evaluating the use of augmented reality, specifically augmented visual cues, for robot teleoperation and assisted teleoperation.


Arévalo-Arboleda, Stephanie; Dierks, Tim; Ruecker, Franziska; Gerken, Jens

There’s More than Meets the Eye: Enhancing Robot Control through Augmented Visual Cues (Inproceedings, Forthcoming)

HRI 2020 - ACM/IEEE International Conference on Human-Robot Interaction, Forthcoming, ISBN: 978-1-4503-7057.


Arévalo-Arboleda, Stephanie; Miller, Stanislaw; Janka, Martha; Gerken, Jens

What's behind a choice? Understanding Modality Choices under Changing Environmental Conditions (Inproceedings)

ICMI '19: 2019 International Conference on Multimodal Interaction, pp. 291–301, 2019, ISBN: 978-1-4503-6860-5.


Wöhle, Lukas; Miller, Stanislaw; Gerken, Jens; Gebhard, Marion

A Robust Interface for Head Motion based Control of a Robot Arm using MARG and Visual Sensors (Inproceedings)

2018 IEEE International Symposium on Medical Measurements and Applications (MeMeA), Rome, Italy, 2018.


Arévalo-Arboleda, Stephanie; Pascher, Max; Gerken, Jens

Opportunities and Challenges in Mixed-Reality for an Inclusive Human-Robot Collaboration Environment (Inproceedings)

Proceedings of the 2018 International Workshop on Virtual, Augmented, and Mixed Reality for Human-Robot Interactions (VAM-HRI), held as part of the ACM/IEEE Conference on Human-Robot Interaction, pp. 83–86, Chicago, USA, 2018.
