Collective variables in data-centric neural network training

dc.contributor.author: Nikolaou, Konstantin
dc.date.accessioned: 2025-01-24T07:40:47Z
dc.date.available: 2025-01-24T07:40:47Z
dc.date.issued: 2023
dc.description.abstract: Neural networks (NNs) have become valuable tools for physics research. While they provide a powerful means of data-driven modeling, their success is accompanied by a lack of interpretability. This thesis aims to add transparency to the opaque nature of NNs by means of collective variables, a concept well known in statistical physics. Three collective variables are introduced that emerge from the interactions between neurons and data. These observables capture the holistic behavior of the network and are used to conduct a data-centric analysis of neural network training. In these investigations, the collective variables are applied to data selections produced by a novel sampling method, Random Network Distillation (RND). Besides the study of collective variables, the investigation of RND as a data selection method constitutes the second part of this thesis. The method is analyzed and optimized with respect to its components, with the aim of understanding and improving the data selection process. It is shown that RND can be used to select data sets that are beneficial for neural network training, giving rise to applications in fields such as active learning. The collective variables are then leveraged to further investigate the selection method and its effect on neural network training, revealing previously unknown properties of RND-selected data sets. The potential of the collective variables is demonstrated and discussed from a data-centric perspective. They are shown to be discriminative with respect to the information content of data and to give rise to novel insights into the nature of neural network training. Beyond fundamental research on neural networks, the collective variables offer several potential applications, including the identification of adversarial attacks and the facilitation of neural architecture search.
dc.identifier.other: 1916508413
dc.identifier.uri: http://nbn-resolving.de/urn:nbn:de:bsz:93-opus-ds-155813
dc.identifier.uri: http://elib.uni-stuttgart.de/handle/11682/15581
dc.identifier.uri: https://doi.org/10.18419/opus-15562
dc.language.iso: en
dc.rights: info:eu-repo/semantics/openAccess
dc.subject.ddc: 530
dc.title: Collective variables in data-centric neural network training
dc.type: masterThesis
ubs.fakultaet: Mathematik und Physik
ubs.institut: Institut für Computerphysik
ubs.publikation.seiten: xiii, 122
ubs.publikation.typ: Abschlussarbeit (Master)
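
The abstract above refers to Random Network Distillation (RND) as a data selection method. As a rough illustration only, the following is a minimal sketch of the standard RND idea (a frozen, randomly initialised target network and a trained predictor, with the distillation error used as a novelty score for greedy selection). The network sizes, optimiser settings, and greedy loop below are illustrative assumptions, not the implementation developed in the thesis.

import torch
import torch.nn as nn


def make_net(in_dim: int = 2, hidden: int = 64, out_dim: int = 16) -> nn.Module:
    """Small MLP; the sizes here are illustrative, not taken from the thesis."""
    return nn.Sequential(nn.Linear(in_dim, hidden), nn.Tanh(), nn.Linear(hidden, out_dim))


def rnd_select(pool: torch.Tensor, n_select: int, fit_steps: int = 200) -> list:
    """Greedily pick the points of `pool` with the largest distillation error."""
    target = make_net().requires_grad_(False)   # frozen, randomly initialised target network
    predictor = make_net()                      # trained to mimic the target on selected data
    opt = torch.optim.Adam(predictor.parameters(), lr=1e-3)
    selected = []
    for _ in range(n_select):
        # Novelty score: squared error between predictor and target outputs.
        with torch.no_grad():
            scores = ((predictor(pool) - target(pool)) ** 2).mean(dim=1)
            if selected:
                scores[selected] = -float("inf")  # never pick the same point twice
        selected.append(int(scores.argmax()))
        # Distil the target on the current selection so that regions already
        # covered by the selected data stop looking novel.
        subset = pool[selected]
        for _ in range(fit_steps):
            opt.zero_grad()
            loss = ((predictor(subset) - target(subset)) ** 2).mean()
            loss.backward()
            opt.step()
    return selected


# Example: pick the 10 "most novel" points from a random 2-D pool.
pool = torch.rand(500, 2)
print(rnd_select(pool, n_select=10))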

Files

Original bundle

Name: knikolaou_master_thesis.pdf
Size: 2.46 MB
Format: Adobe Portable Document Format
License bundle

Name: license.txt
Size: 3.3 KB
Format: Item-specific license agreed upon to submission