Please use this identifier to cite or link to this item:
http://dx.doi.org/10.18419/opus-13470
Author(s): | Holzmüller, David |
Title: | Regression from linear models to neural networks: double descent, active learning, and sampling |
Issue date: | 2023 |
Document type: | Dissertation |
Pages: | xii, 255 |
URI: | http://nbn-resolving.de/urn:nbn:de:bsz:93-opus-ds-134895, http://elib.uni-stuttgart.de/handle/11682/13489, http://dx.doi.org/10.18419/opus-13470 |
Notes: | This thesis contains three papers: https://arxiv.org/abs/2010.01851, https://arxiv.org/abs/2203.09410, and https://arxiv.org/abs/2303.03237 |
Abstract: | Regression, that is, the approximation of functions from (noisy) data, is a ubiquitous task in machine learning and beyond. In this thesis, we study regression in three different settings. First, we study the double descent phenomenon in non-degenerate unregularized linear regression models, proving that these models are always very noise-sensitive when the number of parameters is close to the number of samples. Second, we study batch active learning algorithms for neural network regression from a more applied perspective: We introduce a framework for building existing and new algorithms and provide a large-scale benchmark showing that a new algorithm can achieve state-of-the-art performance. Third, we study convergence rates for non-log-concave sampling and log-partition estimation algorithms, including approximation-based methods, and prove many results on optimal rates, efficiently achievable rates, multi-regime behaviors, reductions, and the relation to optimization. |
Appears in collections: | 08 Faculty of Mathematics and Physics |
Files in this item:
File | Description | Size | Format | |
---|---|---|---|---|
dissertation_holzmueller.pdf | Original print version (two-sided) | 3.26 MB | Adobe PDF | View/Open |
All items in this repository are protected by copyright.