Please use this identifier to cite or link to this item: http://dx.doi.org/10.18419/opus-13755
Authors: Czech, Phillip
Braun, Markus
Kreßel, Ulrich
Yang, Bin
Title: Behavior-aware pedestrian trajectory prediction in ego-centric camera views with spatio-temporal ego-motion estimation
Issue Date: 2023
Type: Journal article (Zeitschriftenartikel)
Pages: 957-978
Source: Machine Learning and Knowledge Extraction 5 (2023), pp. 957-978
URI: http://nbn-resolving.de/urn:nbn:de:bsz:93-opus-ds-137744
http://elib.uni-stuttgart.de/handle/11682/13774
http://dx.doi.org/10.18419/opus-13755
ISSN: 2504-4990
Abstract: With the ongoing development of automated driving systems, the crucial task of predicting pedestrian behavior is attracting growing attention. Predicting future pedestrian trajectories from the ego-vehicle camera perspective is particularly challenging due to the dynamically changing scene. Therefore, we present Behavior-Aware Pedestrian Trajectory Prediction (BA-PTP), a novel approach to pedestrian trajectory prediction for ego-centric camera views. It incorporates behavioral features extracted from real-world traffic scene observations, such as the body and head orientation of pedestrians as well as their pose, in addition to positional information from body and head bounding boxes. For each input modality, we employ independent encoding streams that are combined through a modality attention mechanism. To account for the ego-motion of the camera in an ego-centric view, we introduce the Spatio-Temporal Ego-Motion Module (STEMM), a novel approach to ego-motion prediction. Compared to related work, it utilizes spatial goal points of the ego-vehicle that are sampled from its intended route. We experimentally validate the effectiveness of our approach using two datasets for pedestrian behavior prediction in urban traffic scenes. Based on ablation studies, we show the advantages of incorporating different behavioral features for pedestrian trajectory prediction in the image plane. Moreover, we demonstrate the benefit of integrating STEMM into our pedestrian trajectory prediction method, BA-PTP. BA-PTP achieves state-of-the-art performance on the PIE dataset, outperforming prior work by 7% in MSE-1.5 s and CMSE as well as 9% in CFMSE.
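
The abstract describes one encoding stream per input modality (e.g., body boxes, head boxes, orientation, pose) fused by a modality attention mechanism. The following is a minimal illustrative sketch of that idea only; it is not the authors' implementation, and all layer sizes, the GRU encoders, and the softmax attention form are assumptions made for illustration.

# Hedged sketch of per-modality encoders fused by modality attention.
# Not the BA-PTP code; architecture details here are assumed.
import torch
import torch.nn as nn

class ModalityAttentionFusion(nn.Module):
    """Encodes each modality with its own GRU and fuses the final hidden
    states with a learned softmax attention over modalities (assumed form)."""

    def __init__(self, modality_dims, hidden_dim=128):
        super().__init__()
        self.encoders = nn.ModuleList(
            [nn.GRU(d, hidden_dim, batch_first=True) for d in modality_dims]
        )
        self.score = nn.Linear(hidden_dim, 1)  # one attention score per modality

    def forward(self, modality_inputs):
        # modality_inputs: list of tensors, each (batch, obs_len, feat_dim)
        states = []
        for enc, x in zip(self.encoders, modality_inputs):
            _, h = enc(x)                      # h: (1, batch, hidden_dim)
            states.append(h.squeeze(0))
        stacked = torch.stack(states, dim=1)   # (batch, n_modalities, hidden_dim)
        weights = torch.softmax(self.score(stacked), dim=1)
        return (weights * stacked).sum(dim=1)  # fused embedding (batch, hidden_dim)

if __name__ == "__main__":
    # Toy example: body boxes (4-d), head boxes (4-d), pose keypoints (34-d);
    # the dimensions are hypothetical placeholders.
    fusion = ModalityAttentionFusion(modality_dims=[4, 4, 34])
    batch, obs_len = 2, 15
    inputs = [torch.randn(batch, obs_len, d) for d in (4, 4, 34)]
    print(fusion(inputs).shape)  # torch.Size([2, 128])

The fused embedding would then feed a trajectory decoder; in the paper this is additionally conditioned on STEMM's ego-motion prediction, which is not sketched here.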
Appears in Collections: 05 Fakultät Informatik, Elektrotechnik und Informationstechnik

Files in This Item:
File: make-05-00050-v2.pdf (30.35 MB, Adobe PDF)


This item is licensed under a Creative Commons License.