Abstract
We present a method for EMG-driven teleoperation of non-anthropomorphic robot hands. EMG sensors are appealing as a wearable, inexpensive, and unobtrusive way to gather information about the teleoperator’s hand pose. However, mapping from EMG signals to the pose space of a non-anthropomorphic hand presents multiple challenges. Our method first projects from forearm EMG into a subspace relevant to teleoperation. To increase robustness, we use a model that combines continuous and discrete predictors along different dimensions of this subspace. We then project from the teleoperation subspace into the pose space of the robot hand. Our method is effective and intuitive: it enables novice users to teleoperate pick-and-place tasks faster and more robustly than state-of-the-art EMG teleoperation methods when applied to a non-anthropomorphic, multi-DOF robot hand.
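To make the two-stage pipeline concrete, here is a minimal Python/NumPy sketch of the mapping described above: EMG features are projected into a low-dimensional teleoperation subspace, a discrete predictor replaces the continuous estimate along one subspace dimension for robustness, and the result is projected into the robot hand's pose space. All names, dimensions, the linear projections, and the threshold classifier are illustrative assumptions, not the paper's actual models.

```python
import numpy as np

# Hypothetical learned parameters (stand-ins, not from the paper): a linear
# map from forearm EMG features into a low-dimensional teleoperation
# subspace, and a second map from that subspace into the robot hand's
# joint (pose) space.
N_EMG, N_SUB, N_JOINTS = 8, 3, 4  # e.g. 8 EMG channels, 3-D subspace, 4-DOF hand
W_emg_to_sub = np.random.randn(N_SUB, N_EMG)      # placeholder trained projection
W_sub_to_pose = np.random.randn(N_JOINTS, N_SUB)  # placeholder trained projection

def classify_discrete(z):
    """Discrete predictor along one subspace dimension (e.g. open vs. closed).
    A real system would use a trained classifier; thresholding is a placeholder."""
    return 1.0 if z > 0.0 else 0.0

def emg_to_robot_pose(emg_features):
    """Two-stage mapping: EMG -> teleoperation subspace -> robot pose space,
    combining continuous and discrete predictors along different subspace
    dimensions, as described in the abstract."""
    z = W_emg_to_sub @ emg_features        # project EMG into the subspace
    z_robust = z.copy()
    z_robust[0] = classify_discrete(z[0])  # discrete prediction on one dimension;
                                           # the others keep continuous values
    return W_sub_to_pose @ z_robust        # project subspace coords to joint angles

pose = emg_to_robot_pose(np.random.randn(N_EMG))
print(pose)  # commanded joint angles for the non-anthropomorphic hand
```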
URL
http://arxiv.org/abs/1809.09730