Authors: Antoine Billy (1); Sébastien Pouteau (2); Pascal Desbarats (2); Serge Chaumette (2) and Jean-Philippe Domenger (2)
Affiliations: (1) Laboratoire Bordelais de Recherches en Informatique, Université de Bordeaux, France; Innovative Imaging Solutions, Pessac, France; (2) Laboratoire Bordelais de Recherches en Informatique, Université de Bordeaux, France
Keyword(s):
SLAM, Stereo Vision, Synthetic Dataset, Alastor, Adaptive Frame Rate Selection.
Related Ontology Subjects/Areas/Topics: Active and Robot Vision; Applications; Computer Vision, Visualization and Computer Graphics; Geometry and Modeling; Image-Based Modeling; Motion, Tracking and Stereo Vision; Pattern Recognition; Software Engineering; Stereo Vision and Structure from Motion
Abstract:
In robotic mapping and navigation, of prime importance today given the trend toward autonomous cars, simultaneous localization and mapping (SLAM) algorithms often use stereo vision to extract 3D information about the surrounding world. Whereas the number of creative methods for stereo-based SLAM is continuously increasing, the variety of available datasets remains relatively poor and their contents relatively small. This size issue has become increasingly problematic with the recent explosion of deep learning based approaches, since several of these methods require a large amount of data. These techniques have improved the precision of both localization and mapping estimation to a point where the accuracy of the sensors used to obtain the ground truth may be questioned. Finally, because most of these technologies are now embedded in on-board systems, power consumption and real-time constraints become key requirements. Our contribution is twofold: we propose an adaptive SLAM method that reduces the number of processed frames with minimal impact on error, and we make available a synthetic, flexible stereo dataset with absolute ground truth, which allows running new benchmarks for visual odometry challenges. This dataset is available online at http://alastor.labri.fr/.
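To give a flavor of what adaptive frame selection means in this context, the sketch below keeps a frame only when the estimated inter-frame motion since the last kept frame exceeds a threshold. This is a hypothetical illustration of the general idea, not the authors' algorithm; the function names, the mean-displacement motion measure, and the 2-pixel threshold are all assumptions for the example.

```python
# Hypothetical sketch of adaptive frame-rate selection for stereo SLAM:
# process a frame only when estimated inter-frame motion exceeds a
# threshold; otherwise skip it. Not the authors' method.

def mean_displacement(prev_pts, curr_pts):
    """Average 2D feature displacement between two frames, in pixels."""
    return sum(
        ((px - cx) ** 2 + (py - cy) ** 2) ** 0.5
        for (px, py), (cx, cy) in zip(prev_pts, curr_pts)
    ) / len(prev_pts)

def select_frames(feature_tracks, min_motion_px=2.0):
    """Return indices of frames whose motion since the last *kept*
    frame is at least min_motion_px; frame 0 is always kept."""
    kept = [0]
    for i in range(1, len(feature_tracks)):
        if mean_displacement(feature_tracks[kept[-1]], feature_tracks[i]) >= min_motion_px:
            kept.append(i)
    return kept

# Example: two tracked features; the camera barely moves for frames 1-2,
# then moves noticeably at frame 3.
tracks = [
    [(0.0, 0.0), (10.0, 0.0)],
    [(0.5, 0.0), (10.5, 0.0)],   # ~0.5 px since frame 0 -> skipped
    [(1.0, 0.0), (11.0, 0.0)],   # ~1.0 px since frame 0 -> skipped
    [(5.0, 0.0), (15.0, 0.0)],   # ~5.0 px since frame 0 -> kept
]
print(select_frames(tracks))  # -> [0, 3]
```

Skipping low-motion frames saves computation on embedded hardware while the pose barely changes, which is consistent with the power and real-time requirements the abstract highlights.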