At the CERN Large Hadron Collider (LHC), millions of protons collide every second, generating a data influx of 63 Tb/s: far too much to store in full for later analysis. Moreover, the upcoming LHC upgrade will raise the data throughput to the equivalent of 5% of the daily internet traffic. Efficient real-time event filtering systems are therefore employed to decide which collisions should be recorded. These systems consist of ultrafast, highly efficient machine learning algorithms deployed on specialised hardware, namely field-programmable gate arrays (FPGAs).
In this talk, I will give an overview of how real-time machine learning architectures can be deployed on FPGAs to achieve O(100) nanosecond inference times and keep up with the LHC data throughput, while maintaining the high accuracy needed for precision measurements and new physics discovery. Finally, we will discuss geometric learning applications for fast machine learning and how anomaly detection can be used to discover new physics at the LHC.
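As an illustration only (not necessarily the speaker's workflow), the sketch below shows how a small Keras classifier might be converted to FPGA firmware with the open-source hls4ml tool, which is commonly used for low-latency LHC trigger models; the network, fixed-point precision, output directory, and FPGA part number are illustrative assumptions.

```python
# Minimal sketch: Keras model -> FPGA firmware with hls4ml (illustrative only).
import numpy as np
from tensorflow import keras
import hls4ml

# Tiny dense network standing in for a trigger-level classifier.
model = keras.Sequential([
    keras.layers.Input(shape=(16,)),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(5, activation="softmax"),
])

# Derive an hls4ml configuration; fixed-point precision and parallelism
# (ReuseFactor) trade FPGA resources against the O(100) ns latency target.
config = hls4ml.utils.config_from_keras_model(model, granularity="model")
config["Model"]["Precision"] = "ap_fixed<16,6>"   # assumed fixed-point format
config["Model"]["ReuseFactor"] = 1                # fully parallel, lowest latency

hls_model = hls4ml.converters.convert_from_keras_model(
    model,
    hls_config=config,
    output_dir="hls4ml_prj",       # hypothetical output directory
    part="xcu250-figd2104-2L-e",   # example Xilinx FPGA part
)

# Bit-accurate C simulation of the firmware, compared against the Keras model.
hls_model.compile()
x = np.random.rand(10, 16).astype(np.float32)
print(model.predict(x) - hls_model.predict(x))

# hls_model.build(csim=False)  # would run HLS synthesis; requires Vivado/Vitis
```

In practice, such models are also quantization-aware trained so that the reduced fixed-point precision on the FPGA does not degrade accuracy; the configuration above only applies post-training precision settings.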
ZOOM ID: https://psich.zoom.us/j/65342345094
Laboratory for Simulation and Modeling