
Do Autonomous Agents Benefit from Hearing?

2019-05-10
Abraham Woubie, Anssi Kanervisto, Janne Karttunen, Ville Hautamaki

Abstract

Mapping states to actions in deep reinforcement learning is mainly based on visual information. The commonly used approach for dealing with visual information is to extract pixels from images and use them as the state representation for the reinforcement learning agent. However, any vision-only agent is handicapped by its inability to sense audible cues. Using hearing, animals are able to sense targets that are outside of their visual range. In this work, we propose the use of audio as complementary information to vision in the state representation. We assess the impact of such a multi-modal setup on reach-the-goal tasks in the ViZDoom environment. Results show that the agent improves its behavior when visual information is accompanied by audio features.
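The abstract does not specify how the two modalities are combined, but a minimal sketch of the general idea is to encode the image frame and an audio feature vector separately and concatenate them into a single state for the policy. The sketch below is an illustrative PyTorch-style example, not the paper's architecture; the class name `MultiModalEncoder`, the input shapes, and the dimensions `audio_dim` and `state_dim` are assumptions.

```python
# Hypothetical sketch (not the paper's actual architecture): fusing visual and
# audio features into a single state representation for an RL policy.
import torch
import torch.nn as nn

class MultiModalEncoder(nn.Module):
    """Encodes an image frame and an audio feature vector into one state."""
    def __init__(self, audio_dim=40, state_dim=256):
        super().__init__()
        # Small convolutional stack for raw pixels (84x84 grayscale assumed).
        self.visual = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=8, stride=4), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=4, stride=2), nn.ReLU(),
            nn.Flatten(),
        )
        # Simple MLP for audio features (e.g., one frame of spectral features).
        self.audio = nn.Sequential(nn.Linear(audio_dim, 64), nn.ReLU())
        # Fuse both modalities by concatenation, then project to the state size.
        self.fuse = nn.Linear(64 * 9 * 9 + 64, state_dim)

    def forward(self, image, audio):
        v = self.visual(image)   # (batch, 64*9*9)
        a = self.audio(audio)    # (batch, 64)
        return torch.relu(self.fuse(torch.cat([v, a], dim=1)))

# Example usage with dummy tensors matching the assumed shapes.
enc = MultiModalEncoder()
state = enc(torch.zeros(1, 1, 84, 84), torch.zeros(1, 40))
print(state.shape)  # torch.Size([1, 256])
```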

URL

http://arxiv.org/abs/1905.04192

PDF

http://arxiv.org/pdf/1905.04192

