Imperial College London

Dr Nicholas Wardle

Faculty of Natural Sciences, Department of Physics

Lecturer in Energy-Frontier Physics
 
 
 

Contact

 

+44 (0)20 7594 3419
n.wardle09

 
 

Location

 

531, Blackett Laboratory, South Kensington Campus


Summary

 

Overview

The discovery of the Higgs boson by the ATLAS and CMS Collaborations was the greatest success of the first run of the LHC and represented a major landmark in the experimental verification of the Standard Model (SM) of particle physics. Despite the success of the SM, it is known to be an incomplete picture of physics. For example, the SM provides no explanation for the existence of dark matter (DM). Discovering Physics Beyond the SM (BSM) is key to understanding the fundamental nature of our universe. 



Precision Higgs boson measurements


Precision measurements of the properties of the Higgs boson provide a concrete test of its compatibility with the predictions of the SM. These can be interpreted under a number of scenarios for BSM physics to constrain the vast landscape of theoretical models, and potentially lead to the discovery of new physics. For example, constraints on the Higgs total decay width, in particular from searches for invisible Higgs boson decays, provide an extremely powerful constraint on theoretical models for DM, and are complementary to dedicated dark matter detection experiments. Precisely measuring Higgs boson properties, therefore, remains the primary physics goal of the (HL-)LHC. In order to achieve the best precision in these measurements, and therefore the best possible sensitivity to a wide range of BSM scenarios, it is vital that data from the LHC and HL-LHC be exploited to its fullest.

My main focus currently is the use of Effective Field Theories (EFTs) to parameterise BSM physics in the Higgs sector, using precision Higgs boson measurements to constrain the EFT parameters. I also perform searches for invisible Higgs boson decays, in which the Higgs boson decays to DM particles. Such decays are visible in detectors like CMS because the particles produced in association with the Higgs boson can be used to infer the energy carried away by the DM particles, the so-called "missing momentum". The image below shows what such an event looks like in the detector when the Higgs boson is produced with two high-energy "jets" of particles that travel close to the direction of the beamline (Vector Boson Fusion Higgs production). The purple arrow indicates the direction of the missing momentum in the event.

VBF event display
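
As a rough illustration of the "missing momentum" idea, a minimal numpy sketch is given below: the missing transverse momentum is the negative vector sum of the transverse momenta of the visible reconstructed particles. The particle momenta here are invented for illustration; this is not CMS reconstruction code.

    import numpy as np

    # Hypothetical visible particles in one event: transverse momentum pT [GeV]
    # and azimuthal angle phi [rad] (values invented for illustration).
    pt  = np.array([120.0, 95.0, 30.0, 12.0])
    phi = np.array([0.4, 3.0, -1.2, 2.1])

    # Transverse momentum components of each visible particle.
    px = pt * np.cos(phi)
    py = pt * np.sin(phi)

    # Missing transverse momentum: the negative vector sum of the visible momenta.
    met_x, met_y = -px.sum(), -py.sum()
    met = np.hypot(met_x, met_y)
    met_phi = np.arctan2(met_y, met_x)

    print(f"Missing transverse momentum = {met:.1f} GeV at phi = {met_phi:.2f}")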

Statistics and Machine Learning


Currently, experimental results (in particular precision Higgs boson measurements) are often published only as best-fit values (maximum likelihood estimates), together with their uncertainties and correlations. I research methods to provide a more complete set of information, such as simplified likelihoods, likelihood surfaces and summary statistics, to enable the re-use of LHC measurements in phenomenological studies. I am a steering committee member of the LHC Reinterpretation Forum.
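
As a rough sketch of what a simplified likelihood can look like: treating published best-fit values, uncertainties and correlations as defining a multivariate Gaussian gives a likelihood that can be re-evaluated for any model prediction. The numbers below are illustrative only and are not taken from any CMS result.

    import numpy as np

    # Hypothetical published inputs: best-fit signal strengths, their
    # symmetric uncertainties and their correlation matrix.
    mu_hat = np.array([1.05, 0.92])
    sigma  = np.array([0.08, 0.15])
    corr   = np.array([[1.0, 0.3],
                       [0.3, 1.0]])

    # Covariance and its inverse define a multivariate-Gaussian
    # "simplified likelihood" in the measured parameters.
    cov = np.outer(sigma, sigma) * corr
    cov_inv = np.linalg.inv(cov)

    def minus2_log_likelihood(mu_pred):
        """-2 ln L for a model prediction mu_pred of the same observables."""
        delta = np.asarray(mu_pred) - mu_hat
        return float(delta @ cov_inv @ delta)

    # Example: test the SM prediction (all signal strengths equal to 1).
    print(minus2_log_likelihood([1.0, 1.0]))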

Machine learning is an extremely important tool for exploring, exploiting and compressing data from the LHC. More recently, it has also been used to explore and estimate likelihood surfaces for statistical interpretations of LHC data. I am actively researching this area.
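
One example of this kind of approach, sketched below with toy data rather than LHC simulation, is the "likelihood-ratio trick": a classifier trained to separate events generated under two hypotheses approximates their density ratio, from which a likelihood surface can be estimated. The distributions, sample sizes and network size here are all invented for illustration and do not represent any specific analysis.

    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)

    # Toy samples from two hypotheses (stand-ins for simulated LHC events):
    # the hypothesis shifts the mean of a single observable.
    x0 = rng.normal(loc=0.0, scale=1.0, size=(20_000, 1))   # hypothesis H0
    x1 = rng.normal(loc=0.5, scale=1.0, size=(20_000, 1))   # hypothesis H1
    X = np.vstack([x0, x1])
    y = np.concatenate([np.zeros(len(x0)), np.ones(len(x1))])

    # Train a classifier to distinguish the two hypotheses.
    clf = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=50).fit(X, y)

    # Likelihood-ratio trick: for equal-sized training samples,
    # s(x) = p(H1|x)  =>  p(x|H1) / p(x|H0) = s / (1 - s).
    s = clf.predict_proba(X[:5])[:, 1]
    print(s / (1.0 - s))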

The CMS L1 trigger at the HL-LHC


The LHC collides bunches of protons at a rate of 40 MHz. Data from these collisions, collected at CMS, is produced at rates far in excess of those that can be analysed offline for physics results. This means that the data is processed through a "trigger" system that selects potentially interesting collisions and discards the rest. The current CMS trigger collects only a subset of the total data rate of approximately 200 Tb/s. The next phase of the LHC, the HL-LHC, is scheduled to begin in 2027. In this phase, the data rate at the CMS trigger system will increase to between 0.3 and 0.5 Pb/s. In terms of total throughput, the system will need to process 1 exabyte of data every 4 to 7 hours. This is a natural environment in which machine learning (ML) methods offer a solution.
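
A quick back-of-the-envelope check of those figures, assuming Pb/s denotes petabits per second:

    # One exabyte = 1e18 bytes = 8e18 bits; one petabit = 1e15 bits.
    PETABIT = 1e15
    EXABYTE_BITS = 8e18

    for rate_pb_per_s in (0.3, 0.5):
        seconds = EXABYTE_BITS / (rate_pb_per_s * PETABIT)
        print(f"{rate_pb_per_s} Pb/s -> one exabyte every {seconds / 3600:.1f} hours")
    # ~7.4 hours at 0.3 Pb/s and ~4.4 hours at 0.5 Pb/s,
    # consistent with the quoted 4 to 7 hours.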

Recent advances in accelerator technologies have opened up the possibility of deploying ML-based algorithms directly in the trigger firmware. I am researching ways to use these algorithms in the CMS trigger at the HL-LHC to directly improve the sensitivity of searches for BSM physics that rely heavily on good trigger efficiency, such as the search for invisible Higgs boson decays.
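
For illustration only, the sketch below defines the kind of very small neural network that could plausibly fit within L1-trigger latency and resource budgets. The input features, layer sizes and training data are hypothetical; a real deployment would additionally involve quantisation and translation to FPGA firmware with dedicated tools (for example hls4ml), which is not shown here.

    import numpy as np
    import tensorflow as tf

    # A compact dense network producing an "interesting event" score from a
    # handful of trigger-level quantities (hypothetical input features).
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(8,)),
        tf.keras.layers.Dense(16, activation="relu"),
        tf.keras.layers.Dense(8, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy")

    # Toy data standing in for simulated trigger inputs and labels.
    X = np.random.rand(1000, 8).astype("float32")
    y = np.random.randint(0, 2, size=(1000, 1))
    model.fit(X, y, epochs=1, batch_size=128, verbose=0)
    print(model.predict(X[:3], verbose=0))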