
Atari-HEAD: Atari Human Eye-Tracking and Demonstration Dataset

2019-03-15
Ruohan Zhang, Zhuode Liu, Lin Guan, Luxin Zhang, Mary M Hayhoe, Dana H Ballard

Abstract

We introduce a large-scale dataset of human actions and eye movements recorded while playing Atari video games. The dataset currently contains 44 hours of gameplay data from 16 games and a total of 2.97 million demonstrated actions. Human subjects played the games in a frame-by-frame manner, allowing enough decision time to obtain near-optimal decisions. This dataset could potentially be used for research in imitation learning, reinforcement learning, and visual saliency.
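As a rough illustration of how demonstration data of this kind (frames, gaze positions, actions) might feed an imitation-learning pipeline, here is a minimal gaze-informed behavioral cloning sketch in PyTorch. The array layout, field names, and shapes below are assumptions for illustration only, not the actual Atari-HEAD file format.

```python
# Sketch: behavioral cloning from (frame, gaze, action) triples, with the gaze
# point rendered as an extra heatmap channel. Data layout here is hypothetical.
import numpy as np
import torch
import torch.nn as nn
from torch.utils.data import Dataset, DataLoader


def gaze_heatmap(gaze_xy, height=84, width=84, sigma=3.0):
    """Render a single (x, y) gaze point as a Gaussian heatmap channel."""
    ys, xs = np.mgrid[0:height, 0:width]
    gx, gy = gaze_xy
    heat = np.exp(-((xs - gx) ** 2 + (ys - gy) ** 2) / (2.0 * sigma ** 2))
    return heat.astype(np.float32)


class AtariDemoDataset(Dataset):
    """Wraps pre-extracted arrays: frames (N, 84, 84), gaze (N, 2), actions (N,)."""

    def __init__(self, frames, gaze, actions):
        self.frames, self.gaze, self.actions = frames, gaze, actions

    def __len__(self):
        return len(self.actions)

    def __getitem__(self, i):
        frame = self.frames[i].astype(np.float32) / 255.0
        heat = gaze_heatmap(self.gaze[i])
        x = np.stack([frame, heat])  # 2 channels: grayscale pixels + gaze heatmap
        return torch.from_numpy(x), int(self.actions[i])


class PolicyNet(nn.Module):
    """Small CNN mapping the frame + gaze-heatmap stack to action logits."""

    def __init__(self, n_actions=18):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 32, 8, stride=4), nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 9 * 9, 256), nn.ReLU(),
            nn.Linear(256, n_actions),
        )

    def forward(self, x):
        return self.net(x)


if __name__ == "__main__":
    # Random stand-in data; replace with arrays parsed from the real dataset.
    frames = np.random.randint(0, 256, (512, 84, 84), dtype=np.uint8)
    gaze = np.random.uniform(0, 84, (512, 2)).astype(np.float32)
    actions = np.random.randint(0, 18, 512)

    loader = DataLoader(AtariDemoDataset(frames, gaze, actions),
                        batch_size=64, shuffle=True)
    model = PolicyNet()
    opt = torch.optim.Adam(model.parameters(), lr=1e-4)
    loss_fn = nn.CrossEntropyLoss()
    for x, a in loader:
        opt.zero_grad()
        loss = loss_fn(model(x), a)  # supervised loss on demonstrated actions
        loss.backward()
        opt.step()
```

Feeding the gaze signal in as an extra input channel is just one simple design choice; the gaze data could equally serve as a saliency-prediction target or an attention prior.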

URL

http://arxiv.org/abs/1903.06754

PDF

http://arxiv.org/pdf/1903.06754

