EP-IT Data Science Seminars

Multi-scale cross-attention transformer encoder for event classification

by Prof. Mihoko Nojiri (Theory Center, IPNS, KEK)

Europe/Zurich
222/R-001 (CERN)

Description
We deploy an advanced Machine Learning (ML) environment, leveraging a multi-scale cross-attention encoder for event classification, taking the gg→H→hh→bbbb process at the High-Luminosity Large Hadron Collider (HL-LHC) as an example. In the boosted Higgs regime, the final state consists of two fat jets. Our multi-modal network can extract information from the jet substructure and the kinematics of the final-state particles through self-attention transformer layers. The learned information is subsequently integrated, via an additional transformer encoder with cross-attention heads, to improve classification performance. We demonstrate that our approach outperforms current alternative ML methods, whether based solely on kinematic analysis or on its combination with mainstream ML approaches. We then employ various interpretive methods to evaluate the network's results, including attention-map analysis and visual representations of Gradient-weighted Class Activation Mapping (Grad-CAM). The proposed network is generic and can be applied to analyse any process carrying information at different scales.
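
As a rough illustration of the two-stage attention scheme sketched in the abstract (not the speaker's actual implementation), the following PyTorch snippet encodes each modality with its own self-attention stack and then fuses the two streams with a cross-attention layer. All layer sizes, token counts, and names are illustrative assumptions.

import torch
import torch.nn as nn

# Minimal sketch: one self-attention encoder per modality (jet substructure
# and event kinematics), followed by a cross-attention layer in which the
# substructure tokens query the kinematic tokens. Hyperparameters are assumed.
class CrossAttentionFusion(nn.Module):
    def __init__(self, d_model=64, n_heads=4, n_layers=2, n_classes=2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        # nn.TransformerEncoder deep-copies the layer, so the two encoders
        # below have independent parameters.
        self.substructure_enc = nn.TransformerEncoder(layer, n_layers)
        self.kinematics_enc = nn.TransformerEncoder(layer, n_layers)
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, substructure_tokens, kinematic_tokens):
        s = self.substructure_enc(substructure_tokens)  # (batch, N_s, d_model)
        k = self.kinematics_enc(kinematic_tokens)       # (batch, N_k, d_model)
        # Cross-attention fuses the streams; the returned attention weights
        # can be inspected for interpretation studies (cf. attention maps).
        fused, attn_map = self.cross_attn(query=s, key=k, value=k)
        logits = self.classifier(fused.mean(dim=1))     # pool tokens, classify
        return logits, attn_map

model = CrossAttentionFusion()
s = torch.randn(8, 10, 64)  # e.g. 10 substructure tokens per event
k = torch.randn(8, 6, 64)   # e.g. 6 kinematic-object tokens per event
logits, attn = model(s, k)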
 
Mihoko Nojiri is a theorist at the Theory Center, KEK, currently working on ML applications in HEP. She is also one of the PIs of "Foundation of Machine Learning Physics" (https://mlphys.scphys.kyoto-u.ac.jp/en/), a Grant-in-Aid for Transformative Research Areas (A) from MEXT, Japan. She has a long record of collaboration with experimental groups, connecting the grant package with experimentalists. She is also involved in Diversity & Inclusion activities in the Physical Society of Japan and the Science Council of Japan.
 

Coffee will be served at 10:30.

Organised by

M. Girone, M. Elsing, L. Moneta, M. Pierini

Webcast
There is a live webcast for this event