papers AI Learner

Talking with Your Hands: Scaling Hand Gestures and Recognition with CNNs

2019-05-10
Okan Köpüklü, Yao Rong, Gerhard Rigoll

Abstract

The use of hand gestures provides a natural alternative to cumbersome interface devices for Human-Computer Interaction (HCI) systems. As technology advances and communication between humans and machines becomes more complex, HCI systems should also be scaled accordingly in order to accommodate the introduced complexities. In this paper, we propose a methodology to scale hand gestures by forming them from predefined gesture-phonemes, and a convolutional neural network (CNN) based framework to recognize hand gestures by learning only their constituent gesture-phonemes. The total number of possible hand gestures can be increased exponentially by increasing the number of used gesture-phonemes. For this objective, we introduce a new benchmark dataset named Scaled Hand Gestures Dataset (SHGD) with only gesture-phonemes in its training set and 3-tuple gestures in the test set. In our experimental analysis, we recognize hand gestures containing one and three gesture-phonemes with accuracies of 98.47% (over 15 classes) and 94.69% (over 810 classes), respectively. Our dataset, code and pretrained models are publicly available.
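The exponential scaling claimed in the abstract can be illustrated with a short counting sketch. Note the constraint below (no phoneme repeated in consecutive positions) and the count of 10 usable phonemes are assumptions chosen for illustration because they are consistent with the reported 810 three-tuple classes; they are not stated in the abstract itself.

```python
def num_tuple_gestures(num_phonemes: int, tuple_len: int,
                       allow_repeat: bool = True) -> int:
    """Count distinct gestures formed as ordered tuples of gesture-phonemes.

    With repetition allowed the count is k**n, so it grows exponentially
    in the tuple length n. Forbidding immediate repetition of a phoneme
    (an assumption for illustration) gives k * (k - 1)**(n - 1).
    """
    if allow_repeat:
        return num_phonemes ** tuple_len
    return num_phonemes * (num_phonemes - 1) ** (tuple_len - 1)

# 10 assumed phonemes, 3-tuples, no consecutive repeats: 10 * 9 * 9 = 810,
# matching the 810 classes reported in the abstract.
print(num_tuple_gestures(10, 3, allow_repeat=False))  # 810
```

Either way, adding a single gesture-phoneme multiplies the number of expressible n-tuple gestures, which is the scaling argument the paper relies on.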

URL

http://arxiv.org/abs/1905.04225

PDF

http://arxiv.org/pdf/1905.04225

