A multi-sensorial Simultaneous Localization and Mapping (SLAM) system for low-cost micro aerial vehicles in GPS-denied environments
Authors
López Guillén, María Elena
Identifiers
Permanent link (URI): http://hdl.handle.net/10017/43211
DOI: 10.3390/s17040802
ISSN: 1424-8220
Publisher
MDPI
Publication date
2017-04-08
Sponsors
Comunidad de Madrid
Universidad de Alcalá
Bibliographic citation
López, E., García, S., Barea, R., Bergasa, L.M., Molinos, E.J., Arroyo, R., Romera, E. & Pardo, S. 2017, "A multi-sensorial Simultaneous Localization and Mapping (SLAM) system for low-cost micro aerial vehicles in GPS-denied environments", Sensors 2017, 17, 802
Keywords
Sensor fusion
SLAM
Aerial robots
Projects
info:eu-repo/grantAgreement/CAM//S2013%2FMIT-2748/ES/ROBOTICA APLICADA A LA MEJORA DE LA CALIDAD DE VIDA DE LOS CIUDADANOS, FASE III/RoboCity2030-III-CM
info:eu-repo/grantAgreement/UAH//CCG2016%2FEXP-049
Document type
info:eu-repo/semantics/article
Version
info:eu-repo/semantics/publishedVersion
Publisher's version
https://doi.org/10.3390/s17040802
Rights
Attribution-NonCommercial-NoDerivatives 4.0 International
Access rights
info:eu-repo/semantics/openAccess
Abstract
One of the main challenges of aerial robot navigation in indoor or GPS-denied environments is position estimation using only the available onboard sensors. This paper presents a Simultaneous Localization and Mapping (SLAM) system that remotely calculates the pose and environment map of different low-cost commercial aerial platforms, whose onboard computing capacity is usually limited. The proposed system adapts to the sensory configuration of the aerial robot by integrating different state-of-the-art SLAM methods based on vision, laser and/or inertial measurements using an Extended Kalman Filter (EKF). To do this, a minimum onboard sensory configuration is assumed, consisting of a monocular camera, an Inertial Measurement Unit (IMU) and an altimeter. This configuration makes it possible to improve the results of well-known monocular visual SLAM methods (LSD-SLAM and ORB-SLAM are tested and compared in this work) by resolving scale ambiguity and providing additional information to the EKF. When payload and computational capabilities permit, a 2D laser sensor can easily be incorporated into the SLAM system, yielding a local 2.5D map and a footprint estimate of the robot position that improves the 6D pose estimate through the EKF. We present experimental results with two different commercial platforms and validate the system by applying it to their position control.
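The core idea of the abstract — using an altimeter inside an EKF to resolve the scale ambiguity of monocular SLAM — can be illustrated with a toy example. The sketch below is not the paper's implementation; it is a minimal 1D EKF, under the assumption that the filter state is just [true altitude z, monocular scale s], where the altimeter observes z directly and the visual SLAM altitude is modeled as z/s (correct up to an unknown scale). Fusing both measurements makes the scale observable.

```python
import numpy as np

def ekf_update(x, P, z, h, H, R):
    """Standard EKF measurement update: x state, P covariance,
    z measurement, h measurement function, H its Jacobian, R noise."""
    y = z - h(x)                       # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# State: [altitude z (m), monocular scale factor s], both modeled as
# nearly constant; small process noise Q keeps the filter responsive.
x = np.array([1.0, 1.0])               # initial guess
P = np.diag([1.0, 1.0])
Q = np.diag([1e-4, 1e-4])

true_z, true_s = 2.0, 2.0              # simulated ground truth
for _ in range(100):
    P = P + Q                          # prediction step (constant state)

    # Altimeter: observes the metric altitude z directly.
    H_a = np.array([[1.0, 0.0]])
    x, P = ekf_update(x, P, np.array([true_z]),
                      lambda s_: s_[:1], H_a, np.array([[0.01]]))

    # Monocular SLAM: observes z / s (altitude in its arbitrary scale);
    # Jacobian of z/s w.r.t. [z, s] is [1/s, -z/s^2].
    H_v = np.array([[1.0 / x[1], -x[0] / x[1] ** 2]])
    x, P = ekf_update(x, P, np.array([true_z / true_s]),
                      lambda s_: np.array([s_[0] / s_[1]]),
                      H_v, np.array([[0.01]]))

print(f"estimated altitude {x[0]:.2f} m, scale {x[1]:.2f}")
```

After a few dozen fused updates the scale estimate approaches the true value, which is the mechanism the abstract describes; the full system additionally fuses IMU data and, when available, 2D laser measurements into a 6D pose estimate.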
Files in this item
| File | Size | Format |
|---|---|---|
| Multi_sensorial_Sensors_2017.pdf | 6.944 MB | PDF |
Collections
- ELECTRON - Articles [152]
- ROBESAFE - Articles [37]