Real-Time Bird's Eye View Multi-Object Tracking system based on Fast Encoders for object detection
Authors
Gómez Huélamo, Carlos
Identifiers
Permanent link (URI): http://hdl.handle.net/10017/45389
DOI: 10.1109/ITSC45102.2020.9294737
ISBN: 978-1-7281-4149-7
Publisher
IEEE
Date
2020-12-24
Funders
Agencia Estatal de Investigación
Comunidad de Madrid
Bibliographic citation
Gómez Huélamo, C., Egido, J. del, Bergasa, L. M., Barea, R., Ocaña, M., Arango, F. & Gutiérrez Moreno, R. 2020, "Real-Time Bird’s Eye View Multi-Object Tracking system based on Fast Encoders for object detection", in 2020 IEEE 23rd International Conference on Intelligent Transportation Systems (ITSC), Rhodes, Greece, 2020, pp. 1-6.
Keywords
3D Multi-Object Tracking
ROS
Real-Time
Evaluation Metrics
Description / Notes
2020 IEEE 23rd International Conference on Intelligent Transportation Systems (ITSC), September 20-23, 2020, Rhodes, Greece. Virtual Conference.
Project
info:eu-repo/grantAgreement/AEI/Plan Estatal de Investigación Científica y Técnica y de Innovación 2017-2020/RTI2018-099263-B-C21/ES/TECNOLOGIAS ROBUSTAS PARA UN CONCEPTO DE COCHE ELECTRICO AUTOMATIZADO PARA CONDUCTORES MAYORES/
info:eu-repo/grantAgreement/CAM//P2018%2FNMT-4331/ES/Madrid Robotics Digital Innovation Hub/RoboCity2030-DIH-CM
Document type
info:eu-repo/semantics/conferenceObject
Version
info:eu-repo/semantics/acceptedVersion
Publisher's version
https://doi.org/10.1109/ITSC45102.2020.9294737
Rights
Attribution-NonCommercial-NoDerivatives 4.0 International
© 2020 IEEE
Access rights
info:eu-repo/semantics/openAccess
Abstract
This paper presents a Real-Time Bird’s Eye View Multi-Object Tracking (MOT) system pipeline for an autonomous electric car, based on Fast Encoders for object detection and a combination of the Hungarian algorithm and a Bird’s Eye View (BEV) Kalman Filter, used respectively for data association and state estimation. The system is able to analyze 360 degrees around the ego-vehicle as well as estimate the future trajectories of the surrounding objects, providing an essential input for other layers of a self-driving architecture, such as control or decision-making. First, our system pipeline is described, merging the concepts of online and real-time DATMO (Detection and Tracking of Multiple Objects), ROS (Robot Operating System) and Docker to ease the integration of the proposed MOT system in fully autonomous driving architectures. Second, the system pipeline is validated using the recently proposed KITTI-3DMOT evaluation tool, which demonstrates the full strength of 3D localization and tracking of a MOT system. Finally, a comparison of our proposal with other state-of-the-art approaches is carried out in terms of performance, using the mainstream metrics of MOT benchmarks as well as the recently proposed integral MOT metrics, which evaluate the performance of the tracking system over all detection thresholds.
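The abstract names the Hungarian algorithm for data association and a BEV Kalman Filter for state estimation. The following minimal Python sketch illustrates how such a pairing typically works; it is not the authors' implementation. The class and function names (BEVKalmanTrack, associate), the constant-velocity state layout and the Euclidean-distance association cost are assumptions made for illustration only.

```python
# Minimal sketch of Hungarian association + a constant-velocity BEV Kalman
# filter. Names, noise values and the distance-based cost are assumptions,
# not taken from the paper.
import numpy as np
from scipy.optimize import linear_sum_assignment


class BEVKalmanTrack:
    """Constant-velocity Kalman filter over an assumed BEV state [x, y, vx, vy]."""

    def __init__(self, xy, dt=0.1):
        self.x = np.array([xy[0], xy[1], 0.0, 0.0])   # state estimate
        self.P = np.eye(4) * 10.0                     # state covariance
        self.F = np.eye(4)                            # constant-velocity motion model
        self.F[0, 2] = self.F[1, 3] = dt
        self.H = np.eye(2, 4)                         # we observe [x, y] only
        self.Q = np.eye(4) * 0.01                     # process noise (assumed)
        self.R = np.eye(2) * 0.1                      # measurement noise (assumed)

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        y = np.asarray(z, dtype=float) - self.H @ self.x   # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)           # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P


def associate(tracks, detections, max_dist=3.0):
    """Hungarian association on Euclidean BEV distance (a BEV IoU cost could be used instead)."""
    if not tracks or not detections:
        return [], list(range(len(tracks))), list(range(len(detections)))
    preds = np.array([t.predict() for t in tracks])
    dets = np.asarray(detections, dtype=float)
    cost = np.linalg.norm(preds[:, None, :] - dets[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)
    matches = [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= max_dist]
    matched_t = {r for r, _ in matches}
    matched_d = {c for _, c in matches}
    unmatched_tracks = [i for i in range(len(tracks)) if i not in matched_t]
    unmatched_dets = [j for j in range(len(dets)) if j not in matched_d]
    return matches, unmatched_tracks, unmatched_dets


if __name__ == "__main__":
    # Toy frame: two existing tracks, three new BEV detections.
    tracks = [BEVKalmanTrack([0.0, 0.0]), BEVKalmanTrack([5.0, 5.0])]
    detections = [[0.2, 0.1], [5.1, 4.9], [10.0, 10.0]]
    matches, lost_tracks, new_dets = associate(tracks, detections)
    for t_idx, d_idx in matches:
        tracks[t_idx].update(detections[d_idx])   # correct matched tracks
    # Unmatched detections would spawn new tracks; unmatched tracks would age out.
```

In a full pipeline such as the one described above, the detector output (e.g. Fast Encoders detections projected to BEV) would feed this association and filtering step once per frame, and the predicted states would supply the future trajectories consumed by the control and decision-making layers.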
Files in this item
Files | Size | Format
---|---|---
RealTime_Gomez_ITSC_2020.pdf | 529.7Kb | PDF