NHR PerfLab Seminar: Domain Decomposition-Based Preconditioners and Neural Networks (February 3, hybrid)
Topic: Domain Decomposition-Based Preconditioners and Neural Networks
Speaker: Prof. Dr. Alexander Heinlein, TU Delft
Date and time: Tuesday, February 3, 2026, at 2:00 p.m. CET
Location: Seminar Room 2.049 (RRZE) and online via Zoom
Abstract:
This talk highlights how domain decomposition methods—rooted in ideas introduced by Schwarz in the 19th century and later developed into numerical algorithms by Lions in the 1980s—remain highly relevant for modern high-performance computing and scientific machine learning. The central principle is to decompose a global computational domain into subdomains, thereby splitting large-scale problems into smaller, local subproblems that can be solved efficiently and in parallel. We focus on modern overlapping Schwarz preconditioners as implemented in the FROSch (Fast and Robust Overlapping Schwarz) package of the Trilinos library, which have demonstrated robustness and scalability for a wide range of challenging applications on contemporary CPU and GPU architectures. Beyond their classical role in numerical solvers for partial differential equations, we also explore how domain decomposition techniques can be employed to localize neural networks and operator-learning architectures, introducing sparsity, improving scalability, and enhancing training in multiscale settings. The underlying algorithmic ideas are validated on representative test problems, including diffusion, wave propagation, and flow problems, illustrating how domain decomposition provides a unifying framework bridging large-scale numerical simulation and modern scientific machine learning.
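The core principle described above, solving independent local problems on overlapping subdomains and summing the corrections, can be illustrated with a minimal one-level additive Schwarz preconditioner for a 1D Poisson model problem. This is a self-contained sketch, not FROSch itself; all function names, the subdomain count, and the overlap width are illustrative choices.

```python
import numpy as np

def laplacian_1d(n):
    """Tridiagonal finite-difference Laplacian for a 1D Poisson problem."""
    return 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

def overlapping_subdomains(n, n_sub, overlap):
    """Split indices 0..n-1 into n_sub blocks, extended by `overlap` on each side."""
    bounds = np.linspace(0, n, n_sub + 1, dtype=int)
    return [np.arange(max(bounds[i] - overlap, 0), min(bounds[i + 1] + overlap, n))
            for i in range(n_sub)]

def additive_schwarz(A, subs, r):
    """Apply M^{-1} r = sum_i R_i^T A_i^{-1} R_i r (one-level additive Schwarz)."""
    z = np.zeros_like(r)
    for idx in subs:
        # Each local solve uses only the subdomain block of A and is
        # independent of the others, hence naturally parallel.
        z[idx] += np.linalg.solve(A[np.ix_(idx, idx)], r[idx])
    return z

def pcg(A, b, subs, tol=1e-10, maxit=100):
    """Conjugate gradients preconditioned by the additive Schwarz operator."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = additive_schwarz(A, subs, r)
    p = z.copy()
    rz = r @ z
    for _ in range(maxit):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            break
        z = additive_schwarz(A, subs, r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

n = 100
A = laplacian_1d(n)
b = np.ones(n)
subs = overlapping_subdomains(n, n_sub=4, overlap=2)  # 4 overlapping subdomains
x = pcg(A, b, subs)
print(np.linalg.norm(b - A @ x) / np.linalg.norm(b))  # small relative residual
```

The local solves inside `additive_schwarz` are what a production package distributes across processes; robust, scalable variants such as those in FROSch additionally employ coarse levels, which this one-level sketch omits.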

Short bio:
Alexander Heinlein is an Assistant Professor at Delft University of Technology (TU Delft). He completed his PhD at the University of Duisburg-Essen and the University of Cologne, followed by several years as a postdoctoral researcher in Cologne. After serving as Acting Full Professor of Numerical Mathematics for High-Performance Computing at the University of Stuttgart, he joined TU Delft. His research focuses on numerical methods for partial differential equations and scientific computing, with a particular emphasis on domain decomposition and multiscale methods. More recently, his work has centered on scientific machine learning, aiming at the tight integration of neural networks with classical numerical solvers for complex, large-scale problems, including implementations on modern high-performance computing architectures.
For a list of past and upcoming NHR PerfLab seminar events, please see: https://hpc.fau.de/research/nhr-perflab-seminar-series/