05 Fakultät Informatik, Elektrotechnik und Informationstechnik
Permanent URI for this collection: https://elib.uni-stuttgart.de/handle/11682/6
Search Results: 3 items
Item (Open Access): Deep open set recognition using dynamic intra-class splitting (2020)
Authors: Schlachter, Patrick; Liao, Yiwen; Yang, Bin
Abstract: This paper presents a generic deep learning method for open set recognition problems. In open set recognition, only samples from a limited number of known classes are available for training. During inference, an open set recognizer must not only correctly classify samples from known classes but also reject samples from unknown classes. Because of these requirements, conventional deep learning models that assume a closed set environment cannot be used. Dedicated open set approaches have therefore been developed, including variants of support vector machines and generation-based state-of-the-art methods that model unknown classes with generated samples. In contrast, our proposed method models unknown classes by atypical subsets of the training samples, obtained through intra-class splitting (ICS). Building on a recently proposed two-stage algorithm using ICS, we propose a one-stage method that alternates between ICS and the training of a deep neural network. Finally, several experiments were conducted to compare the proposed method with conventional and other state-of-the-art methods. The proposed method based on dynamic ICS achieved comparable or better balanced accuracy than all considered existing methods.

Item (Open Access): Behavior-aware pedestrian trajectory prediction in ego-centric camera views with spatio-temporal ego-motion estimation (2023)
Authors: Czech, Phillip; Braun, Markus; Kreßel, Ulrich; Yang, Bin
Abstract: With the ongoing development of automated driving systems, the crucial task of predicting pedestrian behavior is attracting growing attention. Predicting future pedestrian trajectories from the ego-vehicle camera perspective is particularly challenging because the scene changes dynamically.
Therefore, we present Behavior-Aware Pedestrian Trajectory Prediction (BA-PTP), a novel approach to pedestrian trajectory prediction for ego-centric camera views. In addition to positional information from body and head bounding boxes, it incorporates behavioral features extracted from real-world traffic scene observations, such as the body and head orientation of pedestrians as well as their pose. Each input modality is processed by an independent encoding stream, and the streams are combined through a modality attention mechanism. To account for the ego-motion of the camera in an ego-centric view, we introduce the Spatio-Temporal Ego-Motion Module (STEMM), a novel approach to ego-motion prediction. In contrast to related work, it utilizes spatial goal points of the ego-vehicle sampled from its intended route. We experimentally validate the effectiveness of our approach on two datasets for pedestrian behavior prediction in urban traffic scenes. Ablation studies show the advantages of incorporating different behavioral features for pedestrian trajectory prediction in the image plane, and we demonstrate the benefit of integrating STEMM into our pedestrian trajectory prediction method, BA-PTP. BA-PTP achieves state-of-the-art performance on the PIE dataset, outperforming prior work by 7% in MSE-1.5s and CMSE and by 9% in CFMSE.

Item (Open Access): Avoiding shortcut-learning by mutual information minimization in deep learning-based image processing (2023)
Authors: Fay, Louisa; Cobos, Erick; Yang, Bin; Gatidis, Sergios; Küstner, Thomas
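The one-stage scheme described in the first item above, alternating intra-class splitting with model training, can be sketched as follows. This is a minimal numpy illustration only: it substitutes a nearest-centroid classifier for the paper's deep neural network, and the split ratio and the distance-based atypicality score are illustrative assumptions, not the paper's exact choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two known classes in 2-D; the open set is modelled by an
# extra UNKNOWN label rather than by real unknown-class samples.
X0 = rng.normal([0.0, 0.0], 0.5, size=(100, 2))
X1 = rng.normal([4.0, 0.0], 0.5, size=(100, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 100 + [1] * 100)

K = 2              # number of known classes
UNKNOWN = K        # synthetic label for the open set
SPLIT_RATIO = 0.1  # fraction of each class relabelled as atypical (assumption)

# Initialize one centroid per class (plus one for UNKNOWN).
centroids = np.zeros((K + 1, 2))
for k in range(K):
    centroids[k] = X[y == k].mean(axis=0)

labels = y.copy()
for _ in range(5):
    # 1) Dynamic ICS: rescore every known-class sample under the current
    #    model; the farthest SPLIT_RATIO fraction per class is treated as
    #    atypical and relabelled UNKNOWN.
    labels = y.copy()
    for k in range(K):
        idx = np.flatnonzero(y == k)
        dist = np.linalg.norm(X[idx] - centroids[k], axis=1)
        n_atyp = max(1, int(SPLIT_RATIO * len(idx)))
        labels[idx[np.argsort(dist)[-n_atyp:]]] = UNKNOWN

    # 2) Training step: refit the (toy) model on the relabelled data;
    #    in the paper this step trains a deep neural network instead.
    for k in range(K + 1):
        if np.any(labels == k):
            centroids[k] = X[labels == k].mean(axis=0)

def predict(x):
    """Classify x as a known class or UNKNOWN (nearest centroid)."""
    return int(np.argmin(np.linalg.norm(centroids - x, axis=1)))
```

The key point the sketch captures is the alternation: the atypical subsets are recomputed under the current model in every round, so the synthetic unknown class adapts as training progresses, which is what distinguishes the one-stage dynamic ICS method from the earlier two-stage algorithm.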