Location: LISN, Site Plaine - Digitéo


Natural gradients and kernel methods for Physics Informed Neural Networks (PINNs)

PhD defense. Thesis supervisor: Cyril Furtlehner, Chargé de recherche, INRIA Saclay, LISN.

Speaker: Nilo Schwencke

Jury

  • Emmanuel FRANCK, Chargé de recherche, INRIA Nancy-Grand Est, France, Reviewer & Examiner
  • Olga MULA, Full Professor, Universität Wien, Austria, Reviewer & Examiner (online)
  • Francis BACH, Directeur de recherche, INRIA Paris, France, Examiner
  • Claire BOYER, Professeure des Universités, Université Paris-Saclay, France, Examiner
  • Victor MICHEL-DANSAC, Chargé de recherche, INRIA Nancy-Grand Est, France, Examiner
  • Cyril FURTLEHNER, Chargé de recherche, INRIA Saclay (TAU team), France, Supervisor
  • Alena SHILOVA, Chargée de recherche, INRIA Saclay, France, Invited co-author
  • Roland MAIER, Junior Professor, Karlsruhe Institute of Technology, Germany, Invited co-author

Abstract

Physics-Informed Neural Networks (PINNs) have emerged in recent years as a promising paradigm for solving partial differential equations (PDEs) by embedding physical constraints directly into the training of neural networks. Despite their conceptual appeal and rapid adoption across scientific and engineering domains, PINNs often suffer from limited accuracy and robustness compared to classical numerical methods. These limitations have motivated a rich body of research on algorithmic refinements, improved training strategies, and theoretical analyses of their behavior. The objective of this PhD is to advance this line of research along two complementary directions. On the algorithmic side, our goal is to design more efficient and accurate training schemes for PINNs by hybridizing tools from kernel methods and natural gradient optimization. On the theoretical side, we aim to anchor PINNs within a rigorous mathematical framework. By situating them in the language of reproducing kernel Hilbert spaces (RKHS), operator theory, and spectral analysis, we seek to clarify their structure, relate them to existing approximation and variational methods, and foster a deeper mathematical understanding of PINNs. We present our contributions across three papers.
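
For background, the PDE-residual training mentioned above can be written as a least-squares problem over collocation points. The following display is a generic schematic, not the thesis's own formulation: for a PDE \( \mathcal{D}u = f \) on a domain \( \Omega \) with boundary condition \( \mathcal{B}u = g \) on \( \partial\Omega \), a PINN \( u_\theta \) is trained by minimizing

\[
  L(\theta) \;=\; \frac{1}{N} \sum_{i=1}^{N} \big( \mathcal{D}u_\theta(x_i) - f(x_i) \big)^2
  \;+\; \frac{\lambda}{M} \sum_{j=1}^{M} \big( \mathcal{B}u_\theta(y_j) - g(y_j) \big)^2,
\]

where the \( x_i \in \Omega \) and \( y_j \in \partial\Omega \) are collocation points and \( \lambda > 0 \) weights the boundary term.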

  1. ANaGRAM: We first establish a novel connection between kernel methods and natural gradient optimization, leading to the notion of the empirical natural gradient. Building upon this framework, we introduce the ANaGRAM algorithm, a new PINN training scheme that systematically exploits this connection; a schematic of the resulting update rule is sketched after this list. This yields improved numerical performance while also providing a principled link between natural gradient methods and Green’s functions.
  2. AMStramGRAM: Our second contribution studies the dynamics of ANaGRAM. We analyze the role of regularization by spectral cutoffs (also illustrated in the sketch following this list), which we reinterpret in terms of enforcing isometries and relate to the theory of Green’s functions and reproducing kernels. This perspective clarifies why cutoffs improve stability and accuracy, and motivates a new algorithm with adaptive cutoffs, which adjusts the degree of regularization dynamically during training. We call this refined method AMStramGRAM.
  3. Kernelization of Weak Formulations: The third contribution is of a more theoretical nature. We show that weak and strong solutions of PDEs can both be naturally expressed within the framework of Hilbert riggings, and we establish that these two notions in fact describe the same underlying object, viewed through distinct riggings. In particular, we reinterpret weak formulations as least-squares methods (sketched after this list), which allows us to revisit Galerkin approaches from a kernel-based perspective and to demonstrate how the Natural Neural Tangent Kernel (NNTK) provides a unifying framework bridging PINNs and classical variational formulations. This result highlights a deep structural connection between neural PDE solvers and approximation theory.
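
A schematic of the update rule behind items 1 and 2, in generic notation rather than the thesis's own: collect the PDE and boundary residuals at the collocation points into a vector \( r(\theta) \in \mathbb{R}^N \), with Jacobian \( J(\theta) = \partial r / \partial \theta \). An empirical natural-gradient (Gauss–Newton-type) step then reads

\[
  \theta_{t+1} \;=\; \theta_t - \eta\, J(\theta_t)^{+}\, r(\theta_t),
\]

where \( J^{+} \) is the Moore–Penrose pseudoinverse. Computing \( J^{+} \) through the singular value decomposition \( J = U \Sigma V^\top \) and zeroing the contribution of singular values below a threshold \( \varepsilon \) gives a spectral cutoff:

\[
  J_\varepsilon^{+} = V \Sigma_\varepsilon^{+} U^\top,
  \qquad
  (\Sigma_\varepsilon^{+})_{kk} =
  \begin{cases}
    \sigma_k^{-1} & \text{if } \sigma_k > \varepsilon, \\
    0 & \text{otherwise,}
  \end{cases}
\]

so that, in this reading, an adaptive cutoff amounts to adjusting \( \varepsilon \) during training rather than fixing it in advance.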
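
Similarly, the least-squares reading of item 3 can be sketched under standard Gelfand-triple assumptions, again in generic notation: given a Hilbert rigging \( V \hookrightarrow H \hookrightarrow V' \) and a bounded PDE operator \( A : V \to V' \), the weak formulation

\[
  \langle A u, v \rangle_{V',V} \;=\; \langle f, v \rangle_{V',V} \qquad \text{for all } v \in V
\]

has, whenever a weak solution exists, the same solutions as the least-squares problem

\[
  \min_{u \in V} \; \| A u - f \|_{V'}^{2},
\]

since the dual norm tests the residual against every \( v \in V \). A strong solution is the same statement read through a different rigging, with the residual measured in \( H \) instead of \( V' \).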

Taken together, these contributions advance both the practical efficiency and the theoretical foundations of PINNs. They underscore that kernel-based and geometric viewpoints are not only effective tools for improving performance, but also provide the right language for integrating PINNs into the broader landscape of numerical and functional analysis.

The online session is available here
