papers AI Learner

Asynchronous 'Events' are Better For Motion Estimation

2019-04-24
Yuhu Guo, Han Xiao, Yidong Chen, Xiaodong Shi

Abstract

An event-based camera is a bio-inspired vision sensor that asynchronously records intensity changes (called events) at each pixel. The Dynamic and Active-pixel Vision Sensor (DAVIS) is one instance of an event-based camera, combining a standard camera with an event-based sensor. However, traditional models cannot process the event stream asynchronously: most existing approaches accumulate events within a certain time interval and treat the accumulated events as a synchronous frame, which discards intensity-change information and weakens the advantages of DAVIS. Therefore, in this paper, we present the first neural asynchronous approach to processing the event stream of an event-based camera. Our method asynchronously extracts dynamic information from events by leveraging previous motion and critical features of gray-scale frames. To the best of our knowledge, this is the first neural asynchronous method to analyze an event stream with a novel deep neural network. Extensive experiments demonstrate that our proposed model achieves remarkable improvements over state-of-the-art baselines.
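To make the baseline the abstract criticizes concrete, here is a minimal sketch (not from the paper) of fixed-interval event accumulation: events arriving within a time window are summed into a single synchronous frame, discarding the precise per-event timing that asynchronous methods exploit. The event tuple format `(t, x, y, polarity)` and the image dimensions are assumptions for illustration.

```python
import numpy as np

def accumulate_events(events, height, width, t_start, t_end):
    """Sum event polarities falling in [t_start, t_end) into one frame.

    Each event is a (timestamp, x, y, polarity) tuple; polarity > 0 means
    an intensity increase, otherwise a decrease. All timing information
    inside the window is lost once events are collapsed into the frame.
    """
    frame = np.zeros((height, width), dtype=np.int32)
    for t, x, y, p in events:
        if t_start <= t < t_end:
            frame[y, x] += 1 if p > 0 else -1
    return frame

# Toy stream: three events at pixel (x=2, y=3); the last event falls
# outside the accumulation window and is dropped entirely.
events = [(0.01, 2, 3, 1), (0.02, 2, 3, 1), (0.03, 2, 3, -1), (0.20, 0, 0, 1)]
frame = accumulate_events(events, height=4, width=4, t_start=0.0, t_end=0.1)
# frame[3, 2] == 1: two positive and one negative event collapse to +1,
# so the frame cannot distinguish their order or exact arrival times.
```

An asynchronous model, by contrast, would consume each `(t, x, y, polarity)` tuple as it arrives, which is what lets it retain the per-event timing this baseline throws away.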

URL

http://arxiv.org/abs/1904.11578

PDF

http://arxiv.org/pdf/1904.11578

