papers AI Learner

Back to the Future: Knowledge Distillation for Human Action Anticipation

2019-04-09
Vinh Tran, Yang Wang, Minh Hoai

Abstract

We consider the task of training a neural network to anticipate human actions in video. This task is challenging given the complexity of video data, the stochastic nature of the future, and the limited amount of annotated training data. In this paper, we propose a novel knowledge distillation framework that uses an action recognition network to supervise the training of an action anticipation network, guiding the latter to attend to the relevant information needed for correctly anticipating future actions. This framework is possible thanks to a novel loss function that accounts for positional shifts of semantic concepts in a dynamic video. The knowledge distillation framework is a form of self-supervised learning, and it takes advantage of unlabeled data. Experimental results on the JHMDB and EPIC-KITCHENS datasets show the effectiveness of our approach.
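To make the teacher-student setup concrete, here is a minimal sketch of a standard knowledge distillation loss in the spirit the abstract describes: a recognition network (teacher) that has seen the full video supervises an anticipation network (student) that has seen only the beginning. This is the classic Hinton-style distillation objective, not the paper's novel loss; the paper's handling of positional shifts of semantic concepts is omitted here, and all function names and hyperparameters are illustrative assumptions.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax, computed stably."""
    z = np.exp((logits - logits.max(axis=-1, keepdims=True)) / temperature)
    return z / z.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Illustrative distillation objective (NOT the paper's exact loss).

    The teacher (action recognition net, sees the full video) provides
    softened class probabilities; the student (action anticipation net,
    sees only early frames) is trained to match them while also fitting
    the ground-truth labels where available.
    """
    # KL divergence between teacher and student soft distributions,
    # scaled by T^2 as is conventional in distillation.
    p_t = softmax(teacher_logits, temperature)
    p_s = softmax(student_logits, temperature)
    kd = np.mean(np.sum(p_t * (np.log(p_t) - np.log(p_s)), axis=-1))
    kd *= temperature ** 2
    # Ordinary cross-entropy on the labeled examples.
    probs = softmax(student_logits)
    ce = -np.mean(np.log(probs[np.arange(len(labels)), labels]))
    return alpha * kd + (1 - alpha) * ce
```

On unlabeled clips the cross-entropy term would simply be dropped (alpha set to 1), which is how a distillation setup like this can exploit unlabeled data as a form of self-supervision.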

URL

http://arxiv.org/abs/1904.04868

PDF

http://arxiv.org/pdf/1904.04868

