
Machine Learning at the Edge for Ultra High Rate Detectors


Abstract


Several large physics experiments face an increasingly large data firehose. Raw data generation exceeds TB/s rates for several existing and planned experiments, producing untenable data sets in very little time. The data often contain limited information, and extracting the relevant information online would reduce offline storage requirements by several orders of magnitude. Additionally, ultra-low-latency data analysis can drive a fast feedback control system that adjusts the experiment in real time, including decisions on data acquisition conditions, detector parameter adjustments, and source operation modifications. However, most state-of-the-art algorithms use computationally expensive operations and require uploading the data to a CPU or GPU compute node. With appropriate training, machine learning can categorize data samples and extract relevant information from raw data using simple arithmetic operations. Placing these fast inference models on FPGAs near the detector, at the edge, would reduce the data velocity at the source. We demonstrate this approach with an initial proof of concept targeting the CookieBox, an angular streaking detector developed for LCLS-II and placed upstream as an online beam diagnostic. Data are streamed to the FPGA, where a parallel, pipelined inference model extracts the relevant information in less than 20 µs at a streaming rate of 77 million events per second.
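To make concrete what "simple arithmetic operations" means for FPGA inference, the following is a minimal sketch, not the authors' implementation: a single fixed-point dense layer with ReLU, of the kind commonly synthesized onto FPGA fabric. The layer sizes, the Q8.8 fixed-point format, and all weight values are illustrative assumptions; the point is that inference reduces to multiply-accumulate, shift, and compare operations, which map directly to FPGA logic and can be fully unrolled and pipelined so a new event enters every clock cycle.

```cpp
// Hypothetical sketch of fixed-point dense-layer inference as it might run
// on an FPGA. Sizes, format, and weights are illustrative assumptions.
#include <array>
#include <cstdint>
#include <cstdio>

constexpr int N_IN  = 8;   // assumed input width (e.g., digitized samples)
constexpr int N_OUT = 4;   // assumed number of extracted quantities

using fixed_t = int16_t;   // Q8.8 fixed point: value = raw / 256
constexpr int FRAC_BITS = 8;

// One dense layer with ReLU. On an FPGA, both loops would be fully
// unrolled and the function pipelined for one-event-per-cycle throughput.
std::array<fixed_t, N_OUT> dense_relu(
    const std::array<fixed_t, N_IN>& x,
    const fixed_t (&w)[N_OUT][N_IN],
    const std::array<fixed_t, N_OUT>& b)
{
    std::array<fixed_t, N_OUT> y{};
    for (int o = 0; o < N_OUT; ++o) {
        // Align the Q8.8 bias with the Q16.16 products before accumulating.
        int32_t acc = static_cast<int32_t>(b[o]) << FRAC_BITS;
        for (int i = 0; i < N_IN; ++i)
            acc += static_cast<int32_t>(w[o][i]) * x[i];    // MAC only
        acc >>= FRAC_BITS;                                  // back to Q8.8
        y[o] = acc > 0 ? static_cast<fixed_t>(acc) : 0;     // ReLU
    }
    return y;
}

int main() {
    // Placeholder weights and inputs; real values would come from training.
    fixed_t w[N_OUT][N_IN] = {};
    for (int o = 0; o < N_OUT; ++o)
        for (int i = 0; i < N_IN; ++i)
            w[o][i] = static_cast<fixed_t>((o + i + 1) << 4);
    std::array<fixed_t, N_IN> x{};
    x.fill(1 << FRAC_BITS);                                 // all-ones input
    std::array<fixed_t, N_OUT> b{};
    auto y = dense_relu(x, w, b);
    for (int o = 0; o < N_OUT; ++o)
        std::printf("y[%d] = %.3f\n", o, y[o] / 256.0);
    return 0;
}
```

Because every operation here is an integer multiply, add, or shift, the same computation needs no floating-point units on the device, which is what makes microsecond-scale, fully pipelined inference at tens of millions of events per second plausible on an FPGA near the detector.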

Pages 1-4
DOI 10.1109/NSS/MIC42101.2019.9059671
Language English
Journal 2019 IEEE Nuclear Science Symposium and Medical Imaging Conference (NSS/MIC)
