Publisher link: http://core.informatik.haw-hamburg.de/bib/eigene/jk-otdee-14.pdf
DOI: 10.1109/ITSC.2014.6958060
Title: Object tracking and dynamic estimation on evidential grids
Language: English
Authors: Jungnickel, Ruben; Korf, Franz
Editor: Institute of Electrical and Electronics Engineers
Date of publication: 2014
Publisher: IEEE
Part of series: 2014 IEEE 17th International Conference on Intelligent Transportation Systems (ITSC 2014), Qingdao, China, 8-11 October 2014
First page: 2310
Last page: 2316
Conference: IEEE International Conference on Intelligent Transportation Systems 2014
Abstract: Autonomous driving is one of the most challenging tasks in the automotive industry. As a subtask, the estimation of drivable and non-drivable space is often solved by applying occupancy grids. The information about non-drivable space can be used to improve object tracking. This paper presents an approach for object tracking and modelling in an occupancy grid map. Tracking objects on grid cells yields the advantage of a consistent environmental model on the occupancy grid map. We introduce the occupancy grid map as the only information source for the object tracking module. Taking advantage of the Dempster-Shafer theory, a dynamic belief for conflicting cells can be estimated. This dynamic belief is then accumulated in a tracked object model: a grid-based, free-form object model that uses detached grid cells to model vehicles in urban environments. We reduce false positives and initialization time by maintaining a dynamic belief for each object.
URI: http://hdl.handle.net/20.500.12738/1097
ISBN: 978-1-4799-6078-1; 978-1-4799-6079-8; 1-4799-6079-9
ISSN: 2153-0017
Institution: Department Informatik, Fakultät Technik und Informatik
Document type: Conference paper
Appears in collections: Publications without full text
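The abstract above describes estimating a dynamic belief from conflicting cells via Dempster-Shafer theory. As an illustration only, and not code from the paper, the sketch below shows Dempster's rule of combination for a single evidential grid cell over the frame {free, occupied}: the conflict mass K that appears when a previously free cell is measured as occupied is the kind of evidence such an approach can accumulate per tracked object. All names and numbers in the sketch are hypothetical.

```python
# Minimal sketch (not from the paper): Dempster's rule of combination for a
# single evidential grid cell over the frame Theta = {F (free), O (occupied)}.
# Mass not assigned to F or O is assigned to Theta ("unknown").

from dataclasses import dataclass

@dataclass
class CellMass:
    free: float      # m({F})
    occupied: float  # m({O})

    @property
    def unknown(self) -> float:
        # Remaining mass belongs to the whole frame Theta = {F, O}.
        return 1.0 - self.free - self.occupied


def combine(prior: CellMass, measurement: CellMass) -> tuple[CellMass, float]:
    """Fuse two mass functions with Dempster's rule.

    Returns the combined cell mass and the conflict mass K, i.e. the total
    mass assigned to contradictory hypotheses (free vs. occupied). A high K
    suggests the cell changed between scans, which a tracker can accumulate
    as evidence for a dynamic object.
    """
    k = prior.free * measurement.occupied + prior.occupied * measurement.free
    if k >= 1.0:
        # Total conflict: fall back to complete ignorance.
        return CellMass(0.0, 0.0), k
    norm = 1.0 - k
    free = (prior.free * measurement.free
            + prior.free * measurement.unknown
            + prior.unknown * measurement.free) / norm
    occupied = (prior.occupied * measurement.occupied
                + prior.occupied * measurement.unknown
                + prior.unknown * measurement.occupied) / norm
    return CellMass(free, occupied), k


if __name__ == "__main__":
    # A cell that was believed to be free is now measured as occupied:
    # the conflict mass K is large, hinting at a moving object entering it.
    fused, conflict = combine(CellMass(free=0.8, occupied=0.1),
                              CellMass(free=0.1, occupied=0.8))
    print(f"fused: {fused}, conflict K = {conflict:.2f}")
```

Running the example fuses a cell that was believed free with an occupied measurement and reports a conflict mass of about 0.65, illustrating how disagreement between the stored grid and a new scan can serve as a dynamics cue.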