Please use this identifier to cite or link to this item:
http://dx.doi.org/10.18419/opus-10237
Author(s): | Goldt, Sebastian |
Title: | Stochastic thermodynamics of learning |
Issue date: | 2018 |
Document type: | Dissertation |
Pages: | 125 |
URI: | http://nbn-resolving.de/urn:nbn:de:bsz:93-opus-ds-102546 http://elib.uni-stuttgart.de/handle/11682/10254 http://dx.doi.org/10.18419/opus-10237 |
Abstract: | Unravelling the physical limits of information processing is an important goal of non-equilibrium statistical physics. It is motivated by the search for fundamental limits of computation, such as Landauer's bound on the minimal work required to erase one bit of information. Further inspiration comes from biology, where we would like to understand what makes single cells or the human brain so (energy-)efficient at processing information. In this thesis, we analyse the thermodynamic efficiency of learning in neural networks. We first discuss the interplay of information processing and dissipation from the perspective of stochastic thermodynamics, a powerful framework for analysing the thermodynamics of strongly fluctuating systems far from equilibrium. We then show that the dissipation of any physical system, in particular a neural network, bounds the information that the network can infer from data or learn from a teacher. Along the way, we illustrate our thermodynamic bounds with a number of examples and outline directions for future research. |
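The abstract refers to Landauer's bound on the minimal work needed to erase one bit of information. For context, its standard statement (not quoted from the thesis itself) is:

```latex
% Landauer's principle: erasing one bit of information in contact with a
% heat bath at temperature T requires an average work of at least
%   W >= k_B T ln 2,
% where k_B is Boltzmann's constant. Equivalently, the erasure dissipates
% at least k_B T ln 2 of heat into the bath.
\begin{equation}
  \langle W \rangle \;\geq\; k_{\mathrm{B}} T \ln 2
\end{equation}
```

At room temperature (T ≈ 300 K) this amounts to roughly 3 × 10⁻²¹ J per bit, which sets the scale against which the efficiency of physical information processing is measured.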
Appears in collections: | 08 Fakultät Mathematik und Physik |
Files in this item:
File | Description | Size | Format | |
---|---|---|---|---|
Goldt_Stochastic_Thermodynamics_of_Learning.pdf | | 3.05 MB | Adobe PDF | Open/View |
All resources in this repository are protected by copyright.