
dc.contributor.author: López Guillén, María Elena
dc.contributor.author: García Gonzalo, Sergio
dc.contributor.author: Barea Navarro, Rafael
dc.contributor.author: Bergasa Pascual, Luis Miguel
dc.contributor.author: Molinos Vicente, Eduardo José
dc.contributor.author: Arroyo Contera, Roberto
dc.contributor.author: Romera Carmena, Eduardo
dc.contributor.author: Pardo Alia, Samuel
dc.date.accessioned: 2020-06-12T10:12:31Z
dc.date.available: 2020-06-12T10:12:31Z
dc.date.issued: 2017-04-08
dc.identifier.bibliographicCitation: López, E., García, S., Barea, R., Bergasa, L.M., Molinos, E.J., Arroyo, R., Romera, E. & Pardo, S. 2017, "A multi-sensorial Simultaneous Localization and Mapping (SLAM) system for low-cost micro aerial vehicles in GPS-denied environments", Sensors, 17, 802
dc.identifier.issn: 1424-8220
dc.identifier.uri: http://hdl.handle.net/10017/43211
dc.description.abstract: One of the main challenges of aerial robot navigation in indoor or GPS-denied environments is position estimation using only the available onboard sensors. This paper presents a Simultaneous Localization and Mapping (SLAM) system that remotely calculates the pose and environment map of different low-cost commercial aerial platforms, whose onboard computing capacity is usually limited. The proposed system adapts to the sensory configuration of the aerial robot by integrating different state-of-the-art SLAM methods based on vision, laser and/or inertial measurements using an Extended Kalman Filter (EKF). To do this, a minimum onboard sensory configuration is assumed, consisting of a monocular camera, an Inertial Measurement Unit (IMU) and an altimeter. This makes it possible to improve the results of well-known monocular visual SLAM methods (LSD-SLAM and ORB-SLAM are tested and compared in this work) by resolving scale ambiguity and providing additional information to the EKF. When payload and computational capabilities permit, a 2D laser sensor can easily be incorporated into the SLAM system, yielding a local 2.5D map and a footprint estimate of the robot position that improves the 6D pose estimation through the EKF. We present experimental results with two different commercial platforms, and validate the system by applying it to their position control.
dc.description.sponsorship: Comunidad de Madrid
dc.description.sponsorship: Universidad de Alcalá
dc.format.mimetype: application/pdf
dc.language.iso: eng
dc.publisher: MDPI
dc.rights: Attribution-NonCommercial-NoDerivatives 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject: Sensor fusion
dc.subject: SLAM
dc.subject: Aerial robots
dc.title: A multi-sensorial Simultaneous Localization and Mapping (SLAM) system for low-cost micro aerial vehicles in GPS-denied environments
dc.type: info:eu-repo/semantics/article
dc.subject.eciencia: Electrónica
dc.subject.eciencia: Electronics
dc.contributor.affiliation: Universidad de Alcalá. Departamento de Electrónica
dc.relation.publisherversion: https://doi.org/10.3390/s17040802
dc.type.version: info:eu-repo/semantics/publishedVersion
dc.identifier.doi: 10.3390/s17040802
dc.relation.projectID: info:eu-repo/grantAgreement/CAM//S2013%2FMIT-2748/ES/ROBOTICA APLICADA A LA MEJORA DE LA CALIDAD DE VIDA DE LOS CIUDADANOS, FASE III/RoboCity2030-III-CM
dc.relation.projectID: info:eu-repo/grantAgreement/UAH//CCG2016%2FEXP-049
dc.rights.accessRights: info:eu-repo/semantics/openAccess
dc.identifier.publicationtitle: Sensors
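The abstract notes that altimeter readings are used within the EKF to resolve the scale ambiguity of monocular visual SLAM. As a hedged, self-contained sketch of that general idea (this is not code from the paper; the function name, measurement model, and noise values are illustrative assumptions), a scalar Kalman filter can estimate the metric scale factor relating the up-to-scale visual height to metric altimeter readings:

```python
# Illustrative sketch only: estimate the metric scale factor lam such that
# altimeter_z ~= lam * visual_z, where visual_z comes from a scale-ambiguous
# monocular SLAM trajectory. Noise parameters q and r are hypothetical.
def estimate_scale(visual_z, altimeter_z, q=1e-4, r=0.05):
    lam, p = 1.0, 1.0                # initial scale estimate and its variance
    for zv, za in zip(visual_z, altimeter_z):
        p += q                       # predict: scale assumed nearly constant
        h = zv                       # measurement model: za = lam * zv + noise
        s = h * p * h + r            # innovation variance
        k = p * h / s                # Kalman gain
        lam += k * (za - h * lam)    # correct with altimeter measurement
        p *= (1 - k * h)             # update variance
    return lam
```

With a true scale of 2.0 (altimeter heights twice the visual ones), the estimate converges toward 2.0 after a handful of measurements. In the full system described by the abstract, this correction happens jointly with the 6D pose inside the EKF rather than as a separate filter.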



Attribution-NonCommercial-NoDerivatives 4.0 International
This item is subject to a Creative Commons license.