
dc.contributor.author: Bergasa Pascual, Luis Miguel
dc.contributor.author: Saez Contreras, Álvaro
dc.contributor.author: López Guillén, María Elena
dc.contributor.author: Romera Carmena, Eduardo
dc.contributor.author: Tradacete Ágreda, Miguel
dc.contributor.author: Gómez Huélamo, Carlos
dc.contributor.author: Egido Sierra, Javier del
dc.date.accessioned: 2020-06-05T10:07:32Z
dc.date.available: 2020-06-05T10:07:32Z
dc.date.issued: 2019-01-25
dc.identifier.bibliographicCitation: Sáez, Á., Bergasa, L. M., López-Guillén, E., Romera, E., Tradacete, M., Gómez-Huélamo, C., & Del Egido, J. (2019). "Real-Time Semantic Segmentation for Fisheye Urban Driving Images Based on ERFNet". Sensors (Basel, Switzerland), 19(3), 503. doi: 10.3390/s19030503
dc.identifier.issn: 1424-8220
dc.identifier.uri: http://hdl.handle.net/10017/43126
dc.description.abstract: The interest in fisheye cameras has recently risen in the autonomous vehicles field, as they are able to reduce the complexity of perception systems while improving the management of dangerous driving situations. However, the strong distortion inherent to these cameras makes the usage of conventional computer vision algorithms difficult and has prevented the development of these devices. This paper presents a methodology that provides real-time semantic segmentation on fisheye cameras leveraging only synthetic images. Furthermore, we propose some Convolutional Neural Network (CNN) architectures based on the Efficient Residual Factorized Network (ERFNet) that demonstrate notable skill in handling distortion, and a new training strategy that improves segmentation on the image borders. Our proposals are compared to similar state-of-the-art works, showing outstanding performance, and are tested in an unknown real-world scenario using a fisheye camera integrated in an open-source autonomous electric car, showing a high domain adaptation capability. [en]
dc.description.sponsorship: Ministerio de Economía y Competitividad [es_ES]
dc.description.sponsorship: Comunidad de Madrid [es_ES]
dc.description.sponsorship: Dirección General de Tráfico [es_ES]
dc.format.mimetype: application/pdf [en]
dc.language.iso: eng [en]
dc.publisher: MDPI
dc.rights: Attribution-NonCommercial-NoDerivatives 4.0 International
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0/
dc.subject: Fisheye [en]
dc.subject: Intelligent vehicles [en]
dc.subject: CNN (Convolutional Neural Network) [en]
dc.subject: Deep Learning [en]
dc.subject: Distortion [en]
dc.title: Real-time semantic segmentation for fisheye urban driving images based on ERFNet [en]
dc.type: info:eu-repo/semantics/article [en]
dc.subject.eciencia: Electrónica [es_ES]
dc.subject.eciencia: Electronics [en]
dc.contributor.affiliation: Universidad de Alcalá. Departamento de Electrónica [es_ES]
dc.relation.publisherversion: https://doi.org/10.3390/s19030503
dc.type.version: info:eu-repo/semantics/publishedVersion [en]
dc.identifier.doi: 10.3390/s19030503
dc.relation.projectID: info:eu-repo/grantAgreement/MINECO//TRA2015-70501-C2-1-R/ES/VEHICULO INTELIGENTE PARA PERSONAS MAYORES/
dc.relation.projectID: info:eu-repo/grantAgreement/DGT//SPIP2017-02305
dc.relation.projectID: info:eu-repo/grantAgreement/CAM//S2013%2FMIT-2748/ES/ROBOTICA APLICADA A LA MEJORA DE LA CALIDAD DE VIDA DE LOS CIUDADANOS, FASE III/RoboCity2030-III-CM
dc.rights.accessRights: info:eu-repo/semantics/openAccess [en]
dc.identifier.publicationtitle: Sensors (Basel)
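
The abstract notes that the proposed architectures are based on ERFNet (Efficient Residual Factorized Network), which is built around residual blocks whose 3x3 convolutions are factorized into 3x1 and 1x3 convolutions to keep inference real-time. As a purely illustrative aid, the sketch below shows one such block in PyTorch; the class name, parameter defaults, and overall framing are assumptions for this sketch, not code from the paper or its released implementation.

```python
# Minimal sketch of an ERFNet-style factorized residual ("non-bottleneck-1D") block.
# Illustrative only; hyperparameters and naming are assumed, not taken from the paper.
import torch
import torch.nn as nn

class NonBottleneck1D(nn.Module):
    def __init__(self, channels: int, dilation: int = 1, dropout: float = 0.3):
        super().__init__()
        # First factorized 3x3 convolution: 3x1 followed by 1x3.
        self.conv3x1_1 = nn.Conv2d(channels, channels, (3, 1), padding=(1, 0))
        self.conv1x3_1 = nn.Conv2d(channels, channels, (1, 3), padding=(0, 1))
        self.bn1 = nn.BatchNorm2d(channels)
        # Second factorized 3x3 convolution, optionally dilated to enlarge the receptive field.
        self.conv3x1_2 = nn.Conv2d(channels, channels, (3, 1),
                                   padding=(dilation, 0), dilation=(dilation, 1))
        self.conv1x3_2 = nn.Conv2d(channels, channels, (1, 3),
                                   padding=(0, dilation), dilation=(1, dilation))
        self.bn2 = nn.BatchNorm2d(channels)
        self.drop = nn.Dropout2d(dropout)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.relu(self.conv3x1_1(x))
        out = self.relu(self.bn1(self.conv1x3_1(out)))
        out = self.relu(self.conv3x1_2(out))
        out = self.drop(self.bn2(self.conv1x3_2(out)))
        return self.relu(out + x)  # residual connection keeps gradients flowing
```

Factorizing each 3x3 kernel into a 3x1/1x3 pair roughly halves the parameters and compute of the block, which is the trade-off that keeps ERFNet-style encoders fast enough for real-time driving scenes.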

