
Characterizing Input Methods for Human-to-robot Demonstrations

2019-01-31
Pragathi Praveena, Guru Subramani, Bilge Mutlu, Michael Gleicher

Abstract

Human demonstrations are important in a range of robotics applications, and are created with a variety of input methods. However, the design space for these input methods has not been extensively studied. In this paper, focusing on demonstrations of hand-scale object manipulation tasks to robot arms with two-finger grippers, we identify distinct usage paradigms in robotics that utilize human-to-robot demonstrations, extract abstract features that form a design space for input methods, and characterize existing input methods as well as a novel input method that we introduce, the instrumented tongs. We detail the design specifications for our method and present a user study that compares it against three common input methods: free-hand manipulation, kinesthetic guidance, and teleoperation. Study results show that instrumented tongs provide high quality demonstrations and a positive experience for the demonstrator while offering good correspondence to the target robot.

URL

http://arxiv.org/abs/1902.00084

PDF

http://arxiv.org/pdf/1902.00084
