08 Fakultät Mathematik und Physik
Permanent URI for this collection: https://elib.uni-stuttgart.de/handle/11682/9
Item (Open Access): Collective variables in data-centric neural network training (2023)
Nikolaou, Konstantin

Neural networks have become beneficial tools for physics research. While they provide a powerful means of data-driven modeling, their success is accompanied by a lack of interpretability. This thesis aims to add transparency to the opaque nature of neural networks (NNs) by means of collective variables, a concept well known in statistical physics. Three collective variables are introduced that emerge from the interactions between neurons and data. These observables capture the holistic behavior of the network and are used to conduct a data-focused analysis of neural network training. Throughout the investigations, the collective variables are applied to selections drawn from a novel sampling method: Random Network Distillation (RND). Besides the study of collective variables, the investigation of RND as a data selection method constitutes the second part of this thesis. The method is analyzed and optimized with respect to its components, with the aim of understanding and improving the data selection process. It is shown that RND can select data sets that are beneficial for neural network training, which motivates its application in fields such as active learning. The collective variables are then leveraged to further investigate the selection method and its effect on neural network training, revealing previously unknown properties of RND-selected data sets. The potential of the collective variables is demonstrated and discussed from a data-centric perspective: they are shown to discriminate between data of different information content and yield novel insights into the nature of neural network training.
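The RND-based data selection mentioned above can be illustrated in a few lines. The sketch below is my own minimal construction, not the thesis's implementation: all network sizes, learning rates, and function names are assumptions. The core idea is that a predictor network is trained to match a fixed random target network on data it has seen; points where the prediction error remains large are "novel" and get selected.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_mlp(in_dim, hidden, out_dim, rng):
    """Weights of a small random two-layer MLP."""
    w1 = rng.normal(0, 1 / np.sqrt(in_dim), (in_dim, hidden))
    w2 = rng.normal(0, 1 / np.sqrt(hidden), (hidden, out_dim))
    return w1, w2

def forward(params, x):
    w1, w2 = params
    return np.tanh(x @ w1) @ w2

# Fixed random target network: its weights are never trained.
target = make_mlp(2, 32, 8, rng)
# Predictor network: trained to imitate the target on selected data.
predictor = make_mlp(2, 32, 8, rng)

def novelty(params, x):
    """RND novelty score: predictor-target output distance per point."""
    return np.linalg.norm(forward(params, x) - forward(target, x), axis=1)

def select_batch(params, pool, k):
    """Pick the k most novel points from a candidate pool."""
    idx = np.argsort(novelty(params, pool))[-k:]
    return pool[idx]

def train_predictor(params, x, lr=0.05, steps=300):
    """Fit the predictor to the target's outputs on x by gradient descent."""
    w1, w2 = params[0].copy(), params[1].copy()
    y = forward(target, x)
    for _ in range(steps):
        h = np.tanh(x @ w1)
        err = h @ w2 - y                      # (n, out_dim) residual
        gw2 = h.T @ err / len(x)
        gw1 = x.T @ (err @ w2.T * (1 - h**2)) / len(x)  # backprop through tanh
        w1 -= lr * gw1
        w2 -= lr * gw2
    return w1, w2

pool = rng.uniform(-1, 1, (500, 2))
chosen = select_batch(predictor, pool, 32)
before = novelty(predictor, chosen).mean()
predictor = train_predictor(predictor, chosen)
after = novelty(predictor, chosen).mean()
# Training lowers novelty on the chosen points, so the next selection
# round favors regions of the data space the predictor has not yet seen.
```

After training, the mean novelty of the selected batch drops, which is what turns repeated selection into an exploration-style sampling loop of the kind usable in active learning.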
In addition to fundamental research on neural networks, the collective variables offer several potential applications, including identifying adversarial attacks and facilitating neural architecture search.

Item (Open Access): Quantum machine learning for time series prediction (2024)
Fellner, Tobias

Time series prediction is an essential task in fields such as meteorology, finance, and healthcare. Traditional approaches have relied primarily on regression and moving-average methods, but recent advances have brought growing interest in applying machine learning techniques. With the rise of quantum computing, it is natural to ask whether quantum machine learning can offer advantages over classical methods for time series forecasting. This thesis presents the first large-scale systematic benchmark comparing classical and quantum models for time series prediction. A variety of quantum models are evaluated against classical counterparts on different datasets, and a novel quantum reservoir computing architecture is proposed that shows promising results on nonlinear prediction tasks. The findings suggest that, for simpler prediction tasks, quantum models achieve accuracy comparable to classical methods, while for more complex tasks, such as long-term forecasting, certain quantum models show improved performance. Although current quantum machine learning models do not consistently outperform classical approaches, the results point to specific contexts in which quantum methods may be beneficial.
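Reservoir computing, the paradigm underlying the quantum architecture mentioned above, is easiest to see in its classical form. The sketch below is a minimal classical echo state network for one-step-ahead prediction, under my own assumed sizes and parameters; it does not reproduce the thesis's quantum model. Only the linear readout is trained, while the recurrent "reservoir" stays fixed.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy nonlinear series; the task is one-step-ahead prediction.
t = np.arange(1200)
series = np.sin(0.2 * t) * np.cos(0.031 * t)

n_res = 200
# Random input weights, and a random reservoir rescaled so its spectral
# radius is below 1 (a common echo-state-property heuristic).
w_in = rng.uniform(-0.5, 0.5, n_res)
w_res = rng.normal(0, 1, (n_res, n_res))
w_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(w_res)))

def run_reservoir(u):
    """Collect reservoir states while driving it with the input sequence u."""
    x = np.zeros(n_res)
    states = np.empty((len(u), n_res))
    for i, v in enumerate(u):
        x = np.tanh(w_in * v + w_res @ x)
        states[i] = x
    return states

washout, split = 100, 1000
states = run_reservoir(series[:-1])   # inputs u(t)
targets = series[1:]                  # next-step targets u(t+1)

X_tr, y_tr = states[washout:split], targets[washout:split]
X_te, y_te = states[split:], targets[split:]

# Ridge-regression readout: the only trained component of the model.
ridge = 1e-6
w_out = np.linalg.solve(X_tr.T @ X_tr + ridge * np.eye(n_res), X_tr.T @ y_tr)

pred = X_te @ w_out
nrmse = np.sqrt(np.mean((pred - y_te) ** 2)) / np.std(y_te)
```

Quantum reservoir computing swaps the fixed recurrent network for the dynamics of a quantum system while keeping the same cheap-to-train linear readout, which is what makes the approach attractive for near-term hardware.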