NHR PerfLab Seminar: Scaling and accelerating LLM trainings (January 15, hybrid)

Picture of a green circuit board with the inscription “PerfLab Seminar” in front of it and the NHR@FAU logo in the upper left corner.
Image: NHR@FAU

Topic: Scaling and accelerating LLM trainings

Speaker: Andrea Pilzer, Ph.D. (NVIDIA AI Technology Center, Italy)

Date and time: Thursday, January 15, 2026, at 1:30 p.m. CET

Slides

Abstract:
In this talk, we explore scaling laws to understand the rationale behind large-scale training. We discuss how to apply parallelization techniques effectively, ensuring they are used where they deliver the greatest benefit. Finally, we introduce low-precision training methods to maximize cluster performance and efficiency.
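The scaling-law rationale can be illustrated with a small numeric sketch. The code below uses the common parametric fit L(N, D) = E + A/N^α + B/D^β, which predicts training loss from parameter count N and token count D; the coefficient values are illustrative assumptions for this sketch, not figures from the talk.

```python
# Minimal sketch of a compute-optimal scaling law (Chinchilla-style form).
# Predicted loss: L(N, D) = E + A / N**alpha + B / D**beta,
# where N = model parameters and D = training tokens.
# The coefficients below are illustrative assumptions, not values from the talk.

E, A, B = 1.69, 406.4, 410.7
ALPHA, BETA = 0.34, 0.28

def predicted_loss(n_params: float, n_tokens: float) -> float:
    """Predicted training loss for a model with n_params parameters
    trained on n_tokens tokens."""
    return E + A / n_params**ALPHA + B / n_tokens**BETA

# Scaling both the model and the data lowers the predicted loss,
# which is the basic rationale behind large-scale training runs:
small = predicted_loss(1e9, 20e9)     # 1B params, 20B tokens
large = predicted_loss(70e9, 1.4e12)  # 70B params, 1.4T tokens
```

In this form, the irreducible term E bounds the achievable loss, while the two power-law terms quantify how much is gained by adding parameters versus adding data.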

Short bio:
Andrea Pilzer is a Solution Architect at NVIDIA, leading the NVIDIA AI Technology Center in Italy, where he focuses on supporting researchers on HPC clusters and on the adoption of NVIDIA technology. His main interests are deep learning, video processing, VLMs, and uncertainty estimation. He was a postdoc at Aalto University, working on uncertainty estimation for deep learning, previously worked at Huawei Ireland, and received his Ph.D. in computer science from the University of Trento, working with Nicu Sebe and Elisa Ricci.


For a list of past and upcoming NHR PerfLab seminar events, please see: https://hpc.fau.de/research/nhr-perflab-seminar-series/