Real-Time Bird's Eye View Multi-Object Tracking system based on Fast Encoders for object detection
Authors: Gómez Huélamo, Carlos; Egido Sierra, Javier del; Bergasa Pascual, Luis Miguel; Barea Navarro, Rafael; Ocaña Miguel, Manuel; [et al.]
Identifiers: Permanent link (URI): http://hdl.handle.net/10017/45389
Agencia Estatal de Investigación
Comunidad de Madrid
Gómez Huélamo, C., Egido, J. del, Bergasa, L. M., Barea, R., Ocaña, M., Arango, F. & Gutiérrez Moreno, R. 2020, "Real-Time Bird’s Eye View Multi-Object Tracking system based on Fast Encoders for object detection", in 2020 IEEE 23rd International Conference on Intelligent Transportation Systems (ITSC), Rhodes, Greece, 2020, pp. 1-6.
3D Multi-Object Tracking
Description / Notes
2020 IEEE 23rd International Conference on Intelligent Transportation Systems (ITSC), September 20-23, 2020, Rhodes, Greece. Virtual Conference.
info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2017-2020/RTI2018-099263-B-C21/ES/TECNOLOGIAS ROBUSTAS PARA UN CONCEPTO DE COCHE ELECTRICO AUTOMATIZADO PARA CONDUCTORES MAYORES/
info:eu-repo/grantAgreement/CAM//P2018%2FNMT-4331/ES/Madrid Robotics Digital Innovation Hub/RoboCity2030-DIH-CM
Attribution-NonCommercial-NoDerivatives 4.0 International
© 2020 IEEE
This paper presents a Real-Time Bird’s Eye View Multi-Object Tracking (MOT) system pipeline for an autonomous electric car, based on Fast Encoders for object detection and a combination of the Hungarian algorithm and a Bird’s Eye View (BEV) Kalman Filter, used for data association and state estimation respectively. The system analyzes 360 degrees around the ego-vehicle and estimates the future trajectories of surrounding objects, providing an essential input for other layers of a self-driving architecture, such as control or decision-making. First, our system pipeline is described, merging the concepts of online and real-time DATMO (Detection and Tracking of Multiple Objects), ROS (Robot Operating System) and Docker to ease the integration of the proposed MOT system into fully-autonomous driving architectures. Second, the pipeline is validated using the recently proposed KITTI-3DMOT evaluation tool, which demonstrates the full strength of 3D localization and tracking of a MOT system. Finally, our proposal is compared with other state-of-the-art approaches using both the mainstream metrics of MOT benchmarks and the recently proposed integral MOT metrics, which evaluate the performance of the tracking system over all detection thresholds.
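The data-association step described in the abstract (Hungarian algorithm matching predicted track positions to new detections in the BEV plane, before a Kalman Filter update) can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the Euclidean-distance cost matrix, 2-D centroid representation, and the `gate` threshold value are all assumptions for the example.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment


def associate_bev(tracks, detections, gate=2.0):
    """Hungarian-style association of track and detection centroids in BEV.

    tracks     : (N, 2) array of predicted track centroids (x, y) in metres.
    detections : (M, 2) array of detected centroids (x, y) in metres.
    gate       : maximum Euclidean distance for a valid match (assumed value).

    Returns (matches, unmatched_tracks, unmatched_detections), where matches
    is a list of (track_index, detection_index) pairs.
    """
    if len(tracks) == 0 or len(detections) == 0:
        return [], list(range(len(tracks))), list(range(len(detections)))

    # Pairwise Euclidean distances form the assignment cost matrix.
    cost = np.linalg.norm(tracks[:, None, :] - detections[None, :, :], axis=2)

    # linear_sum_assignment solves the minimum-cost matching (Hungarian algorithm).
    rows, cols = linear_sum_assignment(cost)

    # Gating: discard assigned pairs that are too far apart to be the same object.
    matches = [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= gate]
    matched_t = {r for r, _ in matches}
    matched_d = {c for _, c in matches}
    unmatched_tracks = [i for i in range(len(tracks)) if i not in matched_t]
    unmatched_dets = [j for j in range(len(detections)) if j not in matched_d]
    return matches, unmatched_tracks, unmatched_dets
```

In a full pipeline of this kind, each matched pair would feed the corresponding track's Kalman Filter update, unmatched detections would spawn new tracks, and unmatched tracks would age out after a number of missed frames.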