Please use this identifier to cite or link to this item: http://dx.doi.org/10.18419/opus-14662
Full metadata record
DC Field | Value | Language
dc.contributor.author | Bader, Christian | -
dc.contributor.author | Schwieger, Volker | -
dc.date.accessioned | 2024-07-18T09:12:20Z | -
dc.date.available | 2024-07-18T09:12:20Z | -
dc.date.issued | 2024 | de
dc.identifier.issn | 1424-8220 | -
dc.identifier.other | 1895747074 | -
dc.identifier.uri | http://nbn-resolving.de/urn:nbn:de:bsz:93-opus-ds-146813 | de
dc.identifier.uri | http://elib.uni-stuttgart.de/handle/11682/14681 | -
dc.identifier.uri | http://dx.doi.org/10.18419/opus-14662 | -
dc.description.abstract | Modern vehicles equipped with Advanced Driver Assistance Systems (ADAS) rely heavily on sensor fusion to achieve a comprehensive understanding of their surrounding environment. Traditionally, the Kalman Filter (KF) has been a popular choice for this purpose, necessitating complex data association and track management to ensure accurate results. To address errors introduced by these processes, the application of the Gaussian Mixture Probability Hypothesis Density (GM-PHD) filter is a good choice. This alternative filter implicitly handles the association and appearance/disappearance of tracks. The approach presented here allows for the replacement of KF frameworks in many applications while achieving runtimes below 1 ms on the test system. The key innovations lie in the utilization of sensor-based parameter models to implicitly handle varying Fields of View (FoV) and sensing capabilities. These models represent sensor-specific properties such as detection probability and clutter density across the state space. Additionally, we introduce a method for propagating additional track properties such as classification with the GM-PHD filter, further contributing to its versatility and applicability. The proposed GM-PHD filter approach surpasses a KF approach on the KITTI dataset and another custom dataset. The mean OSPA(2) error could be reduced from 1.56 (KF approach) to 1.40 (GM-PHD approach), showcasing its potential in ADAS perception. | en
dc.description.sponsorship | The publication of this article was funded by the Open Access fund of Universität Stuttgart. | de
dc.description.sponsorship | Open Access fund of Universität Stuttgart | de
dc.language.iso | en | de
dc.relation.uri | doi:10.3390/s24082436 | de
dc.rights | info:eu-repo/semantics/openAccess | de
dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | de
dc.subject.ddc | 620 | de
dc.title | Advancing ADAS perception : a sensor-parameterized implementation of the GM-PHD filter | en
dc.type | article | de
dc.date.updated | 2024-06-19T17:25:10Z | -
ubs.fakultaet | Luft- und Raumfahrttechnik und Geodäsie | de
ubs.fakultaet | Fakultätsübergreifend / Sonstige Einrichtung | de
ubs.institut | Geodätisches Institut | de
ubs.institut | Fakultätsübergreifend / Sonstige Einrichtung | de
ubs.publikation.seiten | 21 | de
ubs.publikation.source | Sensors 24 (2024), No. 2436 | de
ubs.publikation.typ | Zeitschriftenartikel | de
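The abstract above describes the central mechanism: a GM-PHD filter whose detection probability and clutter density are supplied by sensor-specific models over the state space, so that varying Fields of View are handled implicitly. The following Python sketch shows a single, standard GM-PHD measurement update (in the sense of the Gaussian-mixture PHD recursion by Vo and Ma) with such state-dependent models plugged in. It is a minimal illustration under a linear-Gaussian measurement model; the function names, the FoV-gated detection probability, and the constant clutter intensity are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of one GM-PHD measurement update with a sensor-parameterized
# detection probability p_D(x) and clutter density kappa(z).
# Illustrative only; names and models are assumptions, not the paper's code.
import numpy as np


def gaussian_likelihood(z, mean, cov):
    """Multivariate normal density N(z; mean, cov)."""
    d = z - mean
    k = len(z)
    norm = np.sqrt((2 * np.pi) ** k * np.linalg.det(cov))
    return float(np.exp(-0.5 * d @ np.linalg.inv(cov) @ d) / norm)


def gm_phd_update(weights, means, covs, measurements, H, R, p_D, kappa):
    """One GM-PHD update step.

    weights, means, covs : predicted Gaussian mixture intensity
    measurements         : measurement vectors from one sensor scan
    H, R                 : linear measurement model z = H x + v, v ~ N(0, R)
    p_D(x)               : state-dependent detection probability (encodes the FoV)
    kappa(z)             : clutter intensity at measurement z
    """
    new_w, new_m, new_P = [], [], []

    # Missed-detection components: scale each weight by (1 - p_D(m)).
    for w, m, P in zip(weights, means, covs):
        new_w.append((1.0 - p_D(m)) * w)
        new_m.append(m)
        new_P.append(P)

    # Detection components: one Kalman-updated copy per (measurement, component).
    for z in measurements:
        w_z, m_z, P_z = [], [], []
        for w, m, P in zip(weights, means, covs):
            S = H @ P @ H.T + R                    # innovation covariance
            K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
            q = gaussian_likelihood(z, H @ m, S)   # predicted measurement likelihood
            w_z.append(p_D(m) * w * q)
            m_z.append(m + K @ (z - H @ m))
            P_z.append((np.eye(len(m)) - K @ H) @ P)
        # Normalize against clutter plus all detection hypotheses for this z.
        denom = kappa(z) + sum(w_z)
        new_w.extend([wi / denom for wi in w_z])
        new_m.extend(m_z)
        new_P.extend(P_z)

    return new_w, new_m, new_P


# Tiny demo: one predicted track (2-D position) and one detection.
weights = [0.8]
means = [np.array([20.0, 1.0])]
covs = [np.eye(2) * 4.0]
H, R = np.eye(2), np.eye(2)
# Assumed sensor model: high detection probability inside a 60 m forward FoV,
# near zero outside; spatially uniform clutter intensity.
p_D = lambda x: 0.95 if 0.0 < x[0] < 60.0 else 0.05
kappa = lambda z: 1e-3
w, m, P = gm_phd_update(weights, means, covs, [np.array([21.0, 0.5])], H, R, p_D, kappa)
```

The FoV-gated p_D and the constant kappa stand in for the sensor-based parameter models the abstract refers to; in practice such models would be calibrated per sensor and evaluated across the state and measurement spaces.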
Appears in collections: 06 Fakultät Luft- und Raumfahrttechnik und Geodäsie

Files in this item:
File | Description | Size | Format
sensors-24-02436.pdf |  | 5.06 MB | Adobe PDF


This item is published under the following copyright terms: Creative Commons Attribution 4.0 (CC BY 4.0) license.