05 Fakultät Informatik, Elektrotechnik und Informationstechnik
Permanent URI for this collection: https://elib.uni-stuttgart.de/handle/11682/6
Item (Open Access): Loss function for physics-informed machine learning in groundwater flow (2025). Aheimer, Björn.

Abstract: Physics-Informed Machine Learning (PIML) methods for solving complex nonlinear partial differential equations (PDEs) have recently gained popularity in the simulation sciences. Unlike purely data-driven approaches, PIML integrates prior knowledge about the underlying physical system, expressed through the governing PDEs, into the learning process. This is done via a loss function that includes the PDE residual, thereby penalizing model outputs that do not satisfy the PDE. PIML methods are therefore particularly appealing in scenarios with limited data availability, where extensive measurements or numerical simulations are prohibitively expensive. However, purely physics-informed models often suffer from various pathologies that render training ineffective; one major challenge is the complex loss landscape introduced by the PDE residual. This work examines how simplifying the governing PDE can improve training performance, or enable physics-informed learning in cases where it would otherwise be infeasible. Building on the approach proposed by Piller for extending predictions of heat plumes generated by Groundwater Heat Pumps (GWHP), this study verifies the effectiveness of such PDE simplifications. Concretely, it finds a moderate monotonic correlation (Spearman: 0.59) between the simplified PDE residual and the data loss, indicating that the simplification of the governing PDEs preserves enough of the physics to be useful while making training more tractable. To this end, Physics-Informed Neural Networks for Heat Plume Extension (HPE-PINN) and Physics-Informed Neural Operators for Heat Plume Extension (HPE-PINO) are developed and compared against Piller's model, which uses Singular Value Decomposition (SVD) to reduce the dimensionality of the solution space. Throughout this process, several common PIML pathologies are encountered.
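The composite loss described above (data misfit plus a PDE-residual penalty) can be sketched as follows. The steady-state groundwater-flow PDE (Laplace equation for the hydraulic head), the finite-difference grid, the observation mask, and the weighting factor are illustrative assumptions, not the thesis's actual setup:

```python
import numpy as np

def pde_residual(h, dx):
    # Finite-difference Laplacian of the head field h; for steady-state
    # groundwater flow with uniform conductivity the PDE is laplace(h) = 0,
    # so any nonzero value here is a physics violation.
    return (h[:-2, 1:-1] + h[2:, 1:-1] + h[1:-1, :-2] + h[1:-1, 2:]
            - 4.0 * h[1:-1, 1:-1]) / dx**2

def piml_loss(h_pred, h_data, mask, dx, lam=1.0):
    # Composite PIML loss: data misfit on the observed points plus a
    # weighted penalty on the PDE residual evaluated across the grid.
    data_loss = np.mean((h_pred[mask] - h_data[mask]) ** 2)
    physics_loss = np.mean(pde_residual(h_pred, dx) ** 2)
    return data_loss + lam * physics_loss

# A head field linear in x is harmonic, so both loss terms vanish (up to
# floating-point noise) when predictions match the sparse "measurements".
x = np.linspace(0.0, 1.0, 11)
h_exact = np.tile(x, (11, 1))
mask = np.zeros_like(h_exact, dtype=bool)
mask[::5, ::5] = True                 # sparse observation points
loss = piml_loss(h_exact, h_exact, mask, dx=0.1)
print(loss)  # effectively zero for this harmonic field
```

In a real PINN the residual would be computed via automatic differentiation of the network output rather than finite differences; this sketch only illustrates how the PDE term enters the loss.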
A suite of techniques to mitigate their negative effects on training is presented, implemented, and validated to improve training stability and model performance. Comprehensive ablation studies highlight the effectiveness of normalization and hard enforcement of initial and boundary conditions in enhancing convergence, as well as the importance of Fourier feature embeddings to reduce spectral bias.
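The Fourier feature embedding mentioned above can be sketched as a random sinusoidal input mapping applied before the network; the projection scale and layer sizes here are illustrative assumptions, not values from the thesis:

```python
import numpy as np

def fourier_features(x, B):
    # Random Fourier feature embedding: project inputs onto random
    # frequencies B, then take sin and cos. Feeding these features to the
    # network instead of raw coordinates helps it represent high-frequency
    # solution components earlier in training, mitigating spectral bias.
    proj = 2.0 * np.pi * x @ B
    return np.concatenate([np.sin(proj), np.cos(proj)], axis=1)

rng = np.random.default_rng(0)
B = rng.normal(scale=10.0, size=(2, 64))  # scale sets the frequency band (tunable)
x = rng.uniform(size=(5, 2))              # five 2-D collocation points
z = fourier_features(x, B)
print(z.shape)  # → (5, 128)
```

The standard deviation of B controls the frequency content the network sees: too small and spectral bias persists, too large and training becomes noisy, so it is typically tuned per problem.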