
Observability-aware Self-Calibration of Visual and Inertial Sensors for Ego-Motion Estimation

2019-01-22
Thomas Schneider, Mingyang Li, Cesar Cadena, Juan Nieto, Roland Siegwart

Abstract

External effects such as shocks and temperature variations affect the calibration of visual-inertial sensor systems, so these systems cannot fully rely on factory calibration. Re-calibration on short user-collected datasets may yield poor performance, since the observability of certain parameters depends strongly on the motion. Additionally, on resource-constrained systems (e.g., mobile phones), full-batch approaches over longer sessions quickly become prohibitively expensive. In this paper, we approach the self-calibration problem by introducing information-theoretic metrics that assess the information content of trajectory segments, allowing the most informative parts of a dataset to be selected for calibration. With this approach, we can build compact calibration datasets either (a) by selecting segments from a long session with limited exciting motion or (b) from multiple short sessions, where a single session does not necessarily excite all modes sufficiently. Real-world experiments in four different environments show that the proposed method achieves performance comparable to a batch calibration approach, yet at a constant computational cost that is independent of the session duration.
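
To make the segment-selection idea concrete, below is a minimal sketch of one plausible instantiation: greedily picking the trajectory segments that most increase the log-determinant of the accumulated Fisher information over the calibration parameters (a D-optimality-style score). This is not the authors' implementation; the function name `greedy_segment_selection`, the precomputed `segment_jacobians`, and the specific log-det metric are assumptions for illustration only.

```python
# Hedged sketch: greedy selection of informative segments for self-calibration.
# Assumes per-segment residual Jacobians w.r.t. the p calibration parameters
# have already been computed by some estimator (shapes are illustrative).
import numpy as np

def greedy_segment_selection(segment_jacobians, num_select, prior_weight=1e-6):
    """Return indices of `num_select` segments maximizing log det of the
    summed information matrix (D-optimality-style greedy selection)."""
    p = segment_jacobians[0].shape[1]
    # Per-segment information matrices H_i = J_i^T J_i.
    infos = [J.T @ J for J in segment_jacobians]
    accumulated = prior_weight * np.eye(p)   # weak prior keeps log det finite
    selected = []
    remaining = set(range(len(infos)))

    for _ in range(min(num_select, len(infos))):
        _, base_logdet = np.linalg.slogdet(accumulated)
        best_idx, best_gain = None, -np.inf
        for i in remaining:
            # Information gain of adding segment i to the current selection.
            _, cand_logdet = np.linalg.slogdet(accumulated + infos[i])
            gain = cand_logdet - base_logdet
            if gain > best_gain:
                best_idx, best_gain = i, gain
        selected.append(best_idx)
        accumulated += infos[best_idx]
        remaining.remove(best_idx)
    return selected
```

Because each greedy step only evaluates candidate segments against a fixed-size accumulated information matrix, the cost of maintaining the calibration dataset stays bounded by the number of retained segments rather than the session length, which is the property the abstract emphasizes.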

URL

http://arxiv.org/abs/1901.07242

PDF

http://arxiv.org/pdf/1901.07242
