Contact Person: Stefan Hoeche

Monte Carlo Simulations

Monte Carlo Event

The particle physics experiments at the Large Hadron Collider (LHC) at CERN are some of the largest scientific endeavors of the late 20th and early 21st centuries. The analysis of their data is hampered by the omnipresent effects of the strong nuclear force. Hard collisions between the incoming protons at the LHC lead to collimated sprays of strongly interacting particles, which are called jets when observed in a detector. Jets are produced copiously, and they form a background noise that can cover up the subtle signals of interesting new phenomena. In order to discover something new, one needs a tool to predict the structure of both the noise and the signal as they would appear in a measurement. This can be likened to a computer program that generates large sets of random images containing a person, so that one can test the face-recognition software of digital cameras. For particle physicists, the camera is the ATLAS or CMS detector, and the tool to generate random images is a Monte-Carlo event generator.
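At its core, an event generator draws random "events" from probability distributions predicted by theory. The following deliberately simplified sketch illustrates the idea: each toy event consists of a random jet count and steeply falling jet transverse momenta. All numbers (the mean multiplicity, the 20 GeV slope) are illustrative placeholders, not real QCD predictions.

```python
import math
import random

def poisson(rng, lam):
    """Knuth's algorithm for Poisson-distributed counts."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def generate_event(rng, mean_jets=2.0):
    """One toy 'event': a random jet count with an exponentially
    falling jet-pT spectrum (purely illustrative, not QCD)."""
    n = poisson(rng, mean_jets)
    # inverse-transform sampling of dN/dpT ~ exp(-pT / 20 GeV)
    jets = [-20.0 * math.log(1.0 - rng.random()) for _ in range(n)]
    return sorted(jets, reverse=True)

rng = random.Random(42)
events = [generate_event(rng) for _ in range(10_000)]
avg = sum(len(e) for e in events) / len(events)
print(f"average jet multiplicity: {avg:.2f}")
```

A real generator samples a phase space of thousands of dimensions with weights given by quantum field theory, but the principle, random sampling from a predicted distribution, is the same.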

In order to generate realistic predictions, event generators must encapsulate the combined knowledge of several decades of particle physics theory and experiment. In particular, event generators rely on various assumptions to cope with the high dimensionality of the multi-particle phase space and with the non-abelian, nonlinear nature of Quantum Chromodynamics (QCD). Dependence on the parameters of the models used to describe the transition from short to long distances is one of the main uncertainties affecting measurements and searches at the LHC. Wherever possible, such dependence must be reduced or eliminated. The work of the SLAC theory group is therefore largely focused on the creation of more precise Monte-Carlo event generators.

Matching and Merging Methods

Inclusive Jet Multiplicity

The simulation of QCD jet production in event generators is based on perturbation theory, with calculations carried out as an expansion in the strong coupling constant αs. In order to produce hadron-level events suitable for passing to a detector simulation, certain additional terms must be summed to all orders, using so-called parton-shower algorithms. These algorithms are universal, but they only approximate the exact higher-order result for any given reaction, giving rise to large uncertainties. A major step in reducing these uncertainties was achieved by matching and merging algorithms, which have been partially developed by the SLAC theory group. Using these methods, the parton-shower approximation is replaced by exact fixed-order perturbative QCD calculations whenever those are available. The first application of this technique at the LHC was to the production of W and Z bosons. It reduces theoretical uncertainties substantially, making them comparable to or smaller than the experimental ones.
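Schematically, merging substitutes the exact fixed-order result for the shower approximation wherever the exact one exists. The sketch below is a caricature of that logic only: the weight functions are made-up toy expressions (not real matrix elements), and `max_merged` stands in for the highest jet multiplicity covered by the exact calculation.

```python
def shower_weight(n_jets):
    """Toy stand-in for the parton-shower approximation
    of an n-jet rate (made-up functional form)."""
    return 0.5 ** n_jets

def exact_weight(n_jets):
    """Toy stand-in for the exact fixed-order result, which
    differs from the shower approximation by subleading terms."""
    return 0.5 ** n_jets * (1.0 + 0.1 * n_jets)

def merged_weight(n_jets, max_merged=2):
    """Use the exact calculation where it is available,
    and fall back to the shower approximation above that."""
    if n_jets <= max_merged:
        return exact_weight(n_jets)
    return shower_weight(n_jets)
```

The hard part in practice, which this sketch omits entirely, is making the two descriptions join smoothly, without double counting or gaps between the merged multiplicities.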

Parton Showers

Dijet Azimuthal Decorrelations

The SLAC theory group is also active in the development of novel parton showers. QCD bremsstrahlung behaves slightly differently in configurations where partons become collinear (they travel in the same direction) than in configurations where gluons become soft (they carry almost zero energy). Combining these two limits consistently in Monte-Carlo simulations has been a bottleneck for several decades, but it is crucial for the proper description of intra-jet activity in collider experiments. We have proposed several new algorithms based on so-called dipole factorization, which tackle this problem and bring parton-shower predictions into close agreement with experimental data for various key observables, even before the parton shower has been improved using exact higher-order calculations.
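A parton shower generates successive emissions at ever-decreasing scales, with the probability of no emission between two scales given by a Sudakov form factor. The toy below samples emission scales for the simplest possible case: a constant overall emission rate `c` per unit of log-scale (a stand-in for the real splitting functions and running coupling), using inverse-transform sampling of the Sudakov factor. The numbers are illustrative only.

```python
import math
import random

def shower(t_start, t_cut, c=0.3, rng=random):
    """Generate ordered emission scales from a toy Sudakov factor
    Delta(t, t_start) = (t / t_start)**c, i.e. a Poisson process
    with rate c in log(t). Stops below the cutoff scale t_cut."""
    scales = []
    t = t_start
    while True:
        # solve Delta = r for the next scale (inverse transform)
        t = t * rng.random() ** (1.0 / c)
        if t < t_cut:
            break
        scales.append(t)
    return scales

rng = random.Random(7)
counts = [len(shower(100.0, 1.0, rng=rng)) for _ in range(20_000)]
mean = sum(counts) / len(counts)
# analytic expectation: c * log(t_start / t_cut) = 0.3 * log(100)
print(f"mean emissions: {mean:.2f}  (expect about {0.3 * math.log(100.0):.2f})")
```

Real showers replace the constant `c` by the soft and collinear emission probabilities of QCD, and getting both limits right simultaneously is exactly the difficulty described above.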

Scientific Discovery Through Advanced Computing

The efficient use of computing resources at large scale is crucial for the application of Monte-Carlo simulations to practical problems in high-energy physics. We are therefore invested in the development of algorithms that scale from laptops to supercomputers and that target the next generation of computing hardware. We partner with Fermi National Accelerator Laboratory and Argonne National Laboratory on this exciting endeavor.