Human-robot interaction with pointing gestures: intuitive interaction between co-located humans and robots
104 p.
Doctoral thesis: Università della Svizzera italiana, 2020
English
Human-robot interaction (HRI) is an active area of research and an essential component for the effective integration of mobile robots in everyday environments. In this PhD work, we studied, designed, implemented, and experimentally validated new, efficient interaction modalities between humans and robots that share the same workspace. The core of the work revolves around deictic (pointing) gestures, a skill that humans develop at an early age and use throughout their lives to reference other people, animals, objects, and locations in the surrounding space. To use pointing to control robots, gestures have to be correctly perceived and interpreted by the system. This requires modeling human kinematics and perception, estimating pointed directions and locations using external or wearable sensors, localizing robots and humans with respect to each other, and providing timely feedback that lets users correct for any inaccuracies in the interaction process. Our main contributions to the state of the art lie at the intersection of related topics in psychology, human-robot interaction, and human-computer interaction research. In particular, we designed, implemented, and experimentally validated in real-world user studies:
- a pointing-based relative localization method and its application to robot identification and engagement;
- an approach for pointing-based control of robots moving on a 2D plane and of robots moving freely in 3D space;
- efficient interaction feedback modalities based on robot motion, lights, and sounds.
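To illustrate the pipeline described in the abstract (estimating pointed directions and locations from sensed body keypoints), the following minimal sketch assumes the commonly used eye-fingertip ray model and intersects the pointing ray with a flat ground plane. It is not the thesis's actual model; the function names, the NumPy-based geometry, and the ground-plane assumption are illustrative choices only.

```python
import numpy as np


def pointing_ray(eye: np.ndarray, fingertip: np.ndarray):
    """Return (origin, unit direction) of the eye-fingertip pointing ray.

    Assumes both keypoints are expressed in the same world frame,
    e.g. as estimated by an external or wearable sensing system.
    """
    direction = fingertip - eye
    norm = np.linalg.norm(direction)
    if norm < 1e-9:
        raise ValueError("eye and fingertip coincide; the ray is undefined")
    return eye, direction / norm


def intersect_ground_plane(origin: np.ndarray, direction: np.ndarray, plane_z: float = 0.0):
    """Intersect the pointing ray with the horizontal plane z = plane_z.

    Returns the 3D intersection point, or None if the ray is parallel
    to the plane or points away from it.
    """
    if abs(direction[2]) < 1e-9:
        return None
    t = (plane_z - origin[2]) / direction[2]
    if t <= 0:
        return None  # the plane lies behind the user
    return origin + t * direction


# Hypothetical example: user at the origin, eye at 1.7 m height,
# fingertip slightly ahead of and below the eye.
eye = np.array([0.0, 0.0, 1.7])
fingertip = np.array([0.4, 0.1, 1.5])
origin, direction = pointing_ray(eye, fingertip)
target = intersect_ground_plane(origin, direction)
print("Pointed ground location:", target)
```

For robots moving freely in 3D space, the same ray would instead be matched against the robot's estimated position (e.g. by minimizing point-to-ray distance) rather than intersected with a plane; the sketch above covers only the 2D ground-plane case.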
Language
- English
Classification
- Computer science and technology
License
- License undefined
Persistent URL
- https://n2t.net/ark:/12658/srd1319137