Real-time semantic segmentation for fisheye urban driving images based on ERFNet
Author
Bergasa Pascual, Luis M.
Identifiers
Permanent link (URI): http://hdl.handle.net/10017/43126
DOI: 10.3390/s19030503
ISSN: 1424-8220
Publisher
MDPI
Date
2019-01-25
Sponsors
Ministerio de Economía y Competitividad
Comunidad de Madrid
Dirección General de Tráfico
Bibliographic citation
Sáez, Á., Bergasa, L. M., López-Guillén, E., Romera, E., Tradacete, M., Gómez-Huélamo, C., & Del Egido, J. (2019). "Real-Time Semantic Segmentation for Fisheye Urban Driving Images Based on ERFNet". Sensors, 19(3), 503. doi: 10.3390/s19030503
Keywords
Fisheye
Intelligent vehicles
CNN
Deep learning
Distortion
Projects
info:eu-repo/grantAgreement/MINECO//TRA2015-70501-C2-1-R/ES/VEHICULO INTELIGENTE PARA PERSONAS MAYORES/
SPIP2017-02305 (Dirección General de Tráfico)
S2013/MIT-2748 (Comunidad de Madrid)
Document type
info:eu-repo/semantics/article
Version
info:eu-repo/semantics/publishedVersion
Publisher's version
https://doi.org/10.3390/s19030503
Rights
Attribution-NonCommercial-NoDerivatives 4.0 International
Access rights
info:eu-repo/semantics/openAccess
Abstract
Interest in fisheye cameras has recently risen in the autonomous vehicles field, as they can reduce the complexity of perception systems while improving the handling of dangerous driving situations. However, the strong distortion inherent to these cameras makes conventional computer vision algorithms difficult to apply and has hindered the adoption of these devices. This paper presents a methodology that provides real-time semantic segmentation on fisheye cameras while training only on synthetic images. Furthermore, we propose Convolutional Neural Network (CNN) architectures based on the Efficient Residual Factorized Network (ERFNet) that handle distortion notably well, together with a new training strategy that improves segmentation at the image borders. Our proposals are compared against similar state-of-the-art works, showing outstanding performance, and are tested in an unknown real-world scenario using a fisheye camera integrated in an open-source autonomous electric car, showing a high domain adaptation capability.
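The abstract states that the networks are trained only on synthetic fisheye images. As an illustration of how such data is commonly synthesized from a conventional urban driving dataset, the sketch below warps a rectilinear (pinhole) image into an equidistant fisheye projection (r = f·θ) with OpenCV. The function name, the focal-length parameter `f`, and the choice of projection model are illustrative assumptions and are not taken from the paper itself.

```python
import numpy as np
import cv2


def rectilinear_to_fisheye(img, f=300.0):
    """Warp a rectilinear (pinhole) image into an equidistant fisheye
    projection (r = f * theta). Illustrative sketch only: the distortion
    model and focal length used in the paper are not given in the abstract."""
    h, w = img.shape[:2]
    cx, cy = w / 2.0, h / 2.0

    # Pixel grid of the output (fisheye) image, centred on the principal point.
    ys, xs = np.indices((h, w), dtype=np.float32)
    dx, dy = xs - cx, ys - cy
    r_fish = np.sqrt(dx ** 2 + dy ** 2) + 1e-8  # avoid division by zero at the centre

    # Equidistant model: the fisheye radius is f * theta, so theta = r_fish / f.
    theta = np.clip(r_fish / f, 0.0, np.pi / 2 - 1e-3)
    # The same viewing ray lands at r_rect = f * tan(theta) in the pinhole source image.
    r_rect = f * np.tan(theta)

    # Inverse mapping: for every output pixel, sample the source along the same direction.
    map_x = (cx + dx * r_rect / r_fish).astype(np.float32)
    map_y = (cy + dy * r_rect / r_fish).astype(np.float32)
    return cv2.remap(img, map_x, map_y, cv2.INTER_LINEAR,
                     borderMode=cv2.BORDER_CONSTANT, borderValue=0)


# Example usage with a placeholder file name.
if __name__ == "__main__":
    src = cv2.imread("rectilinear_frame.png")
    fisheye = rectilinear_to_fisheye(src, f=300.0)
    cv2.imwrite("fisheye_frame.png", fisheye)
```

Applying the same warp with nearest-neighbour interpolation to the ground-truth label maps keeps pixels and annotations aligned, which is what allows an existing urban segmentation dataset to be reused as synthetic fisheye training data.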
Files in this item
Files | Size | Format
---|---|---
RealTime_Alvaro_Sensors_2019.pdf | 77.36Mb | PDF
Collections
- ROBESAFE - Articles [37]