Accelerator Seminar: Alexander Scheinker


TL 1227

Title: Robust, Adaptive and Physics Constrained Generative Deep Learning for Electrodynamics

Speaker: Alexander Scheinker (Los Alamos National Lab)

Abstract: Standard machine learning (ML) methods struggle with time-varying systems or systems subject to distribution shift and rely on brute-force retraining. For many interesting problems, retraining is not an option; in particle accelerators, for example, the beam measurement process is slow and destructive and would interrupt regular operations. Furthermore, standard ML methods are not robust and offer no guarantees on convergence or prediction accuracy. This talk discusses incorporating model-independent adaptive feedback and hard physics constraints within generative deep learning methods to make them more robust. The talk covers 3D encoder-decoder convolutional neural networks for various electrodynamics applications and how to make them more robust for time-varying systems. The improved robustness of this adaptive ML approach is demonstrated on experimental data from the HiRES compact electron accelerator at LBNL, where it provides virtual diagnostics of an electron beam’s 6D phase space [1]. A novel approach to hard-coding physics constraints within the structure of 3D convolutional neural networks is presented which, unlike the popular physics-informed neural network (PINN) approach, enforces hard physics constraints exactly, for example by building neural operators that respect Maxwell’s equations [2]. Adaptive feedback coupled with generative 3D convolutional neural networks is also discussed as a way to speed up 3D coherent diffraction imaging reconstruction techniques [3].
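One standard way to enforce a Maxwell constraint exactly by network structure, rather than through soft penalty terms as in PINNs, is to have the network output a vector potential A and compute the magnetic field as B = ∇×A, so that ∇·B = 0 holds by construction. The NumPy sketch below illustrates only this mathematical idea; the grid size, the random array standing in for a network's decoder output, and the finite-difference operators are illustrative assumptions, not details from the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a 3D decoder output: a 3-component "vector potential" A
# on a 16^3 grid. In the constrained-network setting this array would
# come from the last layer of a 3D convolutional neural network.
A = rng.normal(size=(3, 16, 16, 16))

def curl(A, d=1.0):
    """B = curl(A) via central finite differences (grid spacing d)."""
    dAx = np.gradient(A[0], d)  # [dAx/dx, dAx/dy, dAx/dz]
    dAy = np.gradient(A[1], d)
    dAz = np.gradient(A[2], d)
    Bx = dAz[1] - dAy[2]
    By = dAx[2] - dAz[0]
    Bz = dAy[0] - dAx[1]
    return np.stack([Bx, By, Bz])

def divergence(B, d=1.0):
    """div(B) via central finite differences (grid spacing d)."""
    return (np.gradient(B[0], d)[0]
            + np.gradient(B[1], d)[1]
            + np.gradient(B[2], d)[2])

B = curl(A)
# Away from the grid boundary the central-difference operators commute,
# so div(curl(A)) vanishes to machine precision for ANY output A --
# the divergence-free constraint is structural, not learned.
interior_div = divergence(B)[2:-2, 2:-2, 2:-2]
print("max |div B| in interior:", np.max(np.abs(interior_div)))
```

The same trick applies to a trainable network: applying the curl as a fixed final layer guarantees the constraint for every prediction, with no penalty weight to tune and no residual constraint violation to monitor.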

https://www.jlab.org/accelerator-seminars
