A multi-sensorial Simultaneous Localization and Mapping (SLAM) system for low-cost micro aerial vehicles in GPS-denied environments
Authors
López Guillén, María Elena
Identifiers
Permanent link (URI): http://hdl.handle.net/10017/43211
DOI: 10.3390/s17040802
ISSN: 1424-8220
Publisher
MDPI
Date
2017-04-08
Funders
Comunidad de Madrid
Universidad de Alcalá
Bibliographic citation
López, E., García, S., Barea, R., Bergasa, L.M., Molinos, E.J., Arroyo, R., Romera, E. & Pardo, S. 2017, "A multi-sensorial Simultaneous Localization and Mapping (SLAM) system for low-cost micro aerial vehicles in GPS-denied environments", Sensors 2017, 17, 802
Keywords
Sensor fusion
SLAM
Aerial robots
Project
info:eu-repo/grantAgreement/CAM//S2013%2FMIT-2748/ES/ROBOTICA APLICADA A LA MEJORA DE LA CALIDAD DE VIDA DE LOS CIUDADANOS, FASE III/RoboCity2030-III-CM
info:eu-repo/grantAgreement/UAH//CCG2016%2FEXP-049
Document type
info:eu-repo/semantics/article
Version
info:eu-repo/semantics/publishedVersion
Publisher's version
https://doi.org/10.3390/s17040802
Rights
Attribution-NonCommercial-NoDerivatives 4.0 International
Access rights
info:eu-repo/semantics/openAccess
Abstract
One of the main challenges of aerial robot navigation in indoor or GPS-denied environments is position estimation using only the available onboard sensors. This paper presents a Simultaneous Localization and Mapping (SLAM) system that remotely calculates the pose and environment map of different low-cost commercial aerial platforms, whose onboard computing capacity is usually limited. The proposed system adapts to the sensory configuration of the aerial robot by integrating different state-of-the-art SLAM methods based on vision, laser and/or inertial measurements using an Extended Kalman Filter (EKF). To do this, a minimum onboard sensory configuration is assumed, consisting of a monocular camera, an Inertial Measurement Unit (IMU) and an altimeter. This configuration improves the results of well-known monocular visual SLAM methods (LSD-SLAM and ORB-SLAM are tested and compared in this work) by resolving scale ambiguity and providing additional information to the EKF. When payload and computational capabilities permit, a 2D laser sensor can easily be incorporated into the SLAM system, yielding a local 2.5D map and a footprint estimate of the robot position that improves the 6D pose estimation through the EKF. We present experimental results with two different commercial platforms and validate the system by applying it to their position control.
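The abstract notes that the altimeter is used to resolve the scale ambiguity inherent to monocular visual SLAM. As an illustration only (not the paper's actual filter, whose state and models are far richer), a minimal sketch of the idea: if the metric height relates to the visual estimate as z_metric ≈ λ · z_visual, a scalar Kalman filter can recursively estimate λ from altimeter readings. All names and noise values below are assumptions for the sketch.

```python
import numpy as np

def estimate_scale(z_vis, z_alt, var_lambda0=1.0, var_alt=0.05**2):
    """Recursively estimate the monocular scale factor lambda such that
    z_metric ~= lambda * z_visual, treating the altimeter as the metric
    reference. Measurement model: z_alt = lambda * z_vis + noise, which is
    linear in lambda, so a scalar Kalman filter suffices."""
    lam, P = 1.0, var_lambda0           # initial scale guess and its variance
    for zv, za in zip(z_vis, z_alt):
        H = zv                          # Jacobian d(z_alt)/d(lambda)
        S = H * P * H + var_alt         # innovation covariance
        K = P * H / S                   # Kalman gain
        lam += K * (za - H * lam)       # correct scale with the innovation
        P *= (1.0 - K * H)              # shrink uncertainty
    return lam
```

In the full system the recovered scale would feed the EKF alongside the IMU and (optionally) laser footprint estimates; here it is isolated purely to show why a single metric sensor suffices to fix the monocular scale.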
Files in this item
Files | Size | Format
---|---|---
Multi_sensorial_Sensors_2017.pdf | 6.944Mb | PDF
Collections
- ELECTRON - Artículos [152]
- ROBESAFE - Artículos [37]