Transformer-based models to fast-simulate the LHCb ECAL detector


Keywords
tensorflow, machine-learning, deep-learning, transformer, lhcb-experiment, lhcb-lamarr, lamarr, ultrafast-simulation, calorimeter
License
GPL-3.0
Install
```shell
pip install calotron==0.0.12
```
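Once installed, a quick sanity check confirms the package imports cleanly. The `__version__` attribute is assumed here (most packages expose one) and the printed value will track the installed release:

```python
# Minimal post-install check; assumes calotron exposes a __version__ attribute.
import calotron

print(calotron.__version__)  # expected: "0.0.12"
```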

Documentation

Transformer

The Transformer architecture is loosely inspired by Vaswani et al. [arXiv:1706.03762] and Dosovitskiy et al. [arXiv:2010.11929].

[figure: calotron Transformer architecture]
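As a concrete reference point, the sketch below wires a single encoder and decoder layer together with Keras' built-in `MultiHeadAttention`, following the standard encoder-decoder scheme of Vaswani et al. The layer sizes and the toy particle/cluster inputs are illustrative assumptions only; this is not the calotron implementation itself.

```python
import tensorflow as tf
from tensorflow.keras import layers


class EncoderLayer(layers.Layer):
    """Self-attention + feed-forward block, each with residual and layer norm."""

    def __init__(self, d_model=64, num_heads=4, d_ff=128):
        super().__init__()
        self.mha = layers.MultiHeadAttention(num_heads=num_heads, key_dim=d_model)
        self.ffn = tf.keras.Sequential(
            [layers.Dense(d_ff, activation="relu"), layers.Dense(d_model)]
        )
        self.norm1 = layers.LayerNormalization()
        self.norm2 = layers.LayerNormalization()

    def call(self, x):
        x = self.norm1(x + self.mha(query=x, value=x, key=x))
        return self.norm2(x + self.ffn(x))


class DecoderLayer(layers.Layer):
    """Causal self-attention, cross-attention over the encoder output, feed-forward."""

    def __init__(self, d_model=64, num_heads=4, d_ff=128):
        super().__init__()
        self.self_mha = layers.MultiHeadAttention(num_heads=num_heads, key_dim=d_model)
        self.cross_mha = layers.MultiHeadAttention(num_heads=num_heads, key_dim=d_model)
        self.ffn = tf.keras.Sequential(
            [layers.Dense(d_ff, activation="relu"), layers.Dense(d_model)]
        )
        self.norm1 = layers.LayerNormalization()
        self.norm2 = layers.LayerNormalization()
        self.norm3 = layers.LayerNormalization()

    def call(self, x, context):
        x = self.norm1(x + self.self_mha(query=x, value=x, key=x, use_causal_mask=True))
        x = self.norm2(x + self.cross_mha(query=x, value=context, key=context))
        return self.norm3(x + self.ffn(x))


# Toy forward pass: 8 generator-level "particles" mapped to 4 reconstructed
# "clusters", each embedded in a 64-dimensional feature space (assumed shapes).
source = tf.random.normal((1, 8, 64))
target = tf.random.normal((1, 4, 64))
context = EncoderLayer()(source)
output = DecoderLayer()(target, context)
print(output.shape)  # (1, 4, 64)
```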

Discriminator

The Discriminator is implemented with the Deep Sets model proposed by Zaheer et al. [arXiv:1703.06114], and its architecture is loosely inspired by the one developed by the ATLAS Collaboration for flavor tagging [ATL-PHYS-PUB-2020-014].

[figure: calotron Discriminator architecture]
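The core Deep Sets idea is to process each element of a variable-length set with a shared per-element network (phi), pool the results with a permutation-invariant operation such as a sum, and map the pooled vector to an output with a second network (rho). Here is a minimal sketch under those assumptions; the layer sizes and input shapes are illustrative, not the calotron Discriminator itself:

```python
import tensorflow as tf
from tensorflow.keras import layers


def build_deep_sets(n_features=4, latent_dim=32):
    # Variable-length set of clusters: (batch, set size, features).
    inputs = tf.keras.Input(shape=(None, n_features))

    # phi: applied identically to every element of the set.
    phi = tf.keras.Sequential(
        [layers.Dense(64, activation="relu"), layers.Dense(latent_dim)]
    )
    latent = phi(inputs)

    # Sum pooling makes the representation invariant to element ordering.
    pooled = tf.reduce_sum(latent, axis=1)

    # rho: maps the pooled representation to a real-vs-fake score.
    rho = tf.keras.Sequential(
        [layers.Dense(64, activation="relu"), layers.Dense(1, activation="sigmoid")]
    )
    return tf.keras.Model(inputs, rho(pooled))


model = build_deep_sets()
score = model(tf.random.normal((1, 10, 4)))  # one event with 10 set elements
print(score.shape)  # (1, 1)
```

Because the pooling is a plain sum, the score is unchanged under any permutation of the input elements, which is the property that makes this architecture a natural fit for unordered collections of calorimeter clusters.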

Credits

The Transformer implementation is loosely inspired by the TensorFlow tutorial "Neural machine translation with a Transformer and Keras".