Human-Robot Interaction for a Self-determined Life

(Physische Mensch-Roboter Interaktion für ein selbstbestimmtes Leben)


Persons with multiple severe physical limitations, e.g., people with high-level paraplegia, require help in everyday life, for example with eating and drinking. Support robots with basic interactive skills (such as support and handover tasks between human and robot) can open up new opportunities.

Goals and Approach

The aim of the MobILe project is the research and realization of basic robot skills with and without direct physical contact between robot and human. For robot control in three-dimensional space, we investigate head and eye movements as possible input modalities; these are recorded via a headset with motion sensors or via glasses with an eye tracker and electrooculography. For the robot's interaction with humans, we explore Augmented Reality (e.g., visual representations of the robot's intended actions). A user-centered interaction design minimizes attention loss, and a safety system with redundancies ensures functional safety. The basic body-contact skills use new control strategies that ensure the interaction continues until a clear command is issued.
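To illustrate the head-movement input modality described above, the following sketch maps head orientation from a headset's motion sensors to slow Cartesian velocity commands for a robot arm. This is a minimal illustration, not the project's actual controller; the dead zone, gain, and speed cap are assumed values chosen for the example.

```python
import math

# Assumed parameters for illustration (not project values):
DEAD_ZONE_DEG = 5.0   # ignore small head movements to suppress jitter
MAX_SPEED = 0.05      # cap on commanded end-effector speed, in m/s
GAIN = 0.002          # m/s per degree of head deflection beyond the dead zone

def head_to_velocity(yaw_deg, pitch_deg):
    """Convert head yaw/pitch (degrees from neutral) to a (vx, vy) command."""
    def axis(angle):
        if abs(angle) < DEAD_ZONE_DEG:
            return 0.0
        # scale only the deflection beyond the dead zone, keeping the sign
        return GAIN * (abs(angle) - DEAD_ZONE_DEG) * math.copysign(1.0, angle)

    vx, vy = axis(pitch_deg), axis(yaw_deg)
    # clamp the overall speed for safety
    speed = math.hypot(vx, vy)
    if speed > MAX_SPEED:
        vx, vy = vx * MAX_SPEED / speed, vy * MAX_SPEED / speed
    return vx, vy
```

A dead zone around the neutral head pose and a hard speed cap are common safeguards in assistive teleoperation; in a redundant safety architecture such as the one described here, checks like these would be duplicated in an independent monitoring layer.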

Innovations and Perspectives

In the future, MobILe will also enable support services involving direct physical contact. The research results will be transferred to further basic robotic skills and to cooperative industrial robotics.

Current Work

Our current approaches focus on the following topics:

  • Mixed Reality as visual feedback for:
    • the intention of the robot
    • the perception of the robot
  • Interaction design and intervention strategies for people with mobility impairments (limited input modalities).
  • Ethnographic analyses, such as participatory observation of activities of daily living (e.g., eating and drinking).


Arévalo-Arboleda, Stephanie; Pascher, Max; Baumeister, Annalies; Klein, Barbara; Gerken, Jens

Reflecting upon Participatory Design in Human-Robot Collaboration for People with Motor Disabilities: Challenges and Lessons Learned from Three Multiyear Projects (Inproceedings, Forthcoming)

The 14th PErvasive Technologies Related to Assistive Environments Conference - PETRA 2021, ACM, Forthcoming, ISBN: 978-1-4503-8792-7/21/06.


Pascher, Max; Schneegass, Stefan; Gerken, Jens

SwipeBuddy: A Teleoperated Tablet and Ebook-Reader Holder for a Hands-Free Interaction (Inproceedings)

Lamas, David; Loizides, Fernando; Nacke, Lennart; Petrie, Helen; Winckler, Marco; Zaphiris, Panayiotis (Eds.): Human-Computer Interaction – INTERACT 2019, pp. 568-571, Springer, Cham, 2019, ISBN: 978-3-030-29390-1.


Pascher, Max; Baumeister, Annalies; Klein, Barbara; Schneegass, Stefan; Gerken, Jens

Little Helper: A Multi-Robot System in Home Health Care Environments (Inproceedings)

Proceedings of the 2019 International Workshop on Human-Drone Interaction (iHDI), as part of the ACM Conference on Human Factors in Computing Systems, ACM, 2019.


Pascher, Max; Wöhle, Lukas

SwipeTank - A Teleoperated Tablet and Ebook-Reader Holder for a Hands-Free Interaction (Talk)



Baumeister, Annalies; Pascher, Max; Thietje, Roland; Gerken, Jens; Klein, Barbara

Anforderungen an die Interaktion eines Roboterarms zur Nahrungsaufnahme bei Tetraplegie – Eine ethnografische Analyse [Requirements for the Interaction of a Robotic Arm for Food Intake in Tetraplegia – An Ethnographic Analysis] (Inproceedings)

Kongress und Ausstellung zu Alltagsunterstützenden Assistenzlösungen / Active Assisted Living (AAL) - Tagungsband, pp. 100-101, Karlsruher Messe- und Kongress GmbH, Karlsruhe, 2018.


Arévalo-Arboleda, Stephanie; Pascher, Max; Gerken, Jens

Opportunities and Challenges in Mixed-Reality for an Inclusive Human-Robot Collaboration Environment (Inproceedings)

Proceedings of the 2018 International Workshop on Virtual, Augmented, and Mixed Reality for Human-Robot Interactions (VAM-HRI) as part of the ACM/IEEE Conference on Human-Robot Interaction, pp. 83–86, Chicago, USA, 2018.
