CERN’s Large Hadron Collider (LHC) is the world’s highest-energy particle accelerator. It smashes together protons with energies almost 7,000 times their energy at rest, exploring nature at distances as small as 1 part in 100,000 of the size of an atomic nucleus. These large energies and small distances hold clues to fundamental mysteries about the origin and nature of the elementary particles that make up matter.
The LHC is a high-performance machine, a Formula 1 race car, not a Toyota. As such, it needs to spend time in the shop. The previous run of the LHC ended in December 2018. Since then, scientists and technicians have installed numerous fixes and improvements to both the accelerator and the particle detectors. In mid-April, the LHC began a series of final tests and tunings, raising the collision energy from 13 TeV to 13.6 TeV and moving closer to the design energy of 14 TeV. On 5 July, the new run of data-taking began. Run 3, planned to end in late 2025, is expected to double the current LHC data set.
Run 3 is an intermediate stage in the LHC program. Run 1 began in 2010. Its major highlight was the discovery of the long-anticipated Higgs boson in July 2012. In 2013, the accelerator shut down for its first long period of repair and upgrade. Run 2 began two years later, in the spring of 2015. Its primary achievements were the discoveries of the major decay modes of the Higgs boson, verifying that this particle is indeed the origin of mass, at least for all of the relatively heavy known elementary particles. CERN plans that Run 3 will be followed by a longer preparation period extending through 2029. Then the fully grown LHC, renamed the High-Luminosity LHC (HL-LHC), will burst into action again with a rate of collisions 10 times larger than the current one. Running until 2042, it will gather its ultimate data set, 10 times larger than that expected at the end of Run 3.
“[The LHC holds] clues to fundamental mysteries about the origin and nature of the elementary particles that make up matter.”
The data challenge
The key to LHC physics is the accumulation of a huge number of records of proton-proton collisions for analysis. The proton is the easiest particle to manipulate and thus the particle of choice for acceleration to the highest energies. But it is not an elementary particle. It is a bound state of quarks, held together by particles called gluons that are the quanta of the strong nuclear force.
To describe collisions at the LHC, it is useful to think of the proton as a bag of jelly beans, holding quarks and gluons and also, by quantum processes, anti-quarks, quarks of exotic flavor—such as the bottom quark—and even heavier particles such as the W and Z bosons that are the quanta of the weak interactions responsible for radioactive decay. When two protons collide, the most likely thing to happen is that the two “bags” are ripped apart, spilling out particles that re-form into protons, pions, kaons, and other more familiar nuclear particles. But, occasionally, two quarks or gluons will collide head-on, compressing all of their energy to a tiny spot and then releasing it back to quarks and gluons or perhaps to heavier elementary particles, known and unknown.
By studying these rare reactions with tremendous energy release, physicists can glimpse the laws of nature at very short distances. As the LHC accumulates data, the experiments will build up larger and larger samples of these rare reactions, eventually accumulating enough events for strong evidence of a discovery.
Finding these occasional hard collisions is a tremendous data challenge. Bunches of protons at the LHC collide 40 million times per second. Each bunch collision leads to 50 or more individual proton-proton collisions. The photographic records of these collisions taken by the major LHC detectors ATLAS and CMS must be written into permanent storage. Each picture is already about 20 times the size of a typical smartphone photo, so keeping everything for one second of operation would already produce a million gigabytes of data. But, in each second of data-taking, the 40 million events are mostly simple and familiar ones, with only a few thousand W boson events and only one Higgs boson event buried in the stream. Thus, a crucial part of each LHC experiment is the “trigger,” a bank of computer processors that selects a few hundred of these collisions per second for the permanent record that physicists will analyze. Even with such a severe selection, the LHC experiments already create one of the world’s largest computer databases.
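To see why some selection is unavoidable, here is a back-of-the-envelope version of this arithmetic in Python. The 25 MB event size is an assumption, chosen only so that the result matches the figures quoted above.

```python
# Illustrative arithmetic only; real event sizes and rates vary with
# detector and running conditions.
bunch_crossings_per_second = 40_000_000  # bunches collide 40 million times per second
event_size_mb = 25                       # assumed size of one event "picture" (megabytes)

# Keeping every event would mean writing about a million gigabytes each second.
raw_rate_gb_per_s = bunch_crossings_per_second * event_size_mb / 1000
print(f"Unfiltered: {raw_rate_gb_per_s:,.0f} GB per second")  # ~1,000,000 GB/s

# After the trigger keeps only a few hundred events per second:
stored_rate_gb_per_s = 400 * event_size_mb / 1000
print(f"After trigger: {stored_rate_gb_per_s:,.0f} GB per second")
```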
“Even with such a severe [event] selection [process], the LHC experiments already create one of the world’s largest computer databases.”
The LHC events rush out at tremendous speed, so they must be selected at a rate far too fast for human intervention. The trigger has two stages. Level 1 must select 1 in 100 events in 100 microseconds and throw away the rest. Then the high-level trigger can take the luxury of a whole second to make a more sophisticated decision, but it must still throw away all but 1 in 10,000 of the events it receives. If an event does not make it into the final event record, it is as if it never happened.
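As a schematic illustration (not the experiments’ actual software), the cascade can be pictured as two nested filters. The keep fractions below are the ones quoted above; the random “decisions” are stand-ins for the real physics criteria applied within each stage’s time budget.

```python
import random

def level1(event) -> bool:
    # Fast, crude decision made within ~100 microseconds: keep 1 in 100.
    return random.random() < 1 / 100

def high_level_trigger(event) -> bool:
    # Slower, more sophisticated decision: keep 1 in 10,000 of what arrives.
    return random.random() < 1 / 10_000

# Overall, only about one bunch crossing in a million survives both stages;
# everything else is discarded forever.
n_events = 10_000_000
survivors = [e for e in range(n_events) if level1(e) and high_level_trigger(e)]
print(f"{len(survivors)} of {n_events:,} toy events written to permanent storage")
```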
By LHC standards, it is not so difficult to find the events with very large energy transfer from the initial quarks and gluons. What is more difficult is to find less prominent events that are special in another way, perhaps containing hints of new weakly-coupled interactions or particles of the cosmic dark matter. The most important upgrades to the major detectors for Run 3 are improvements to the trigger, including new detector elements and re-wiring of the existing elements to bring more information to Level 1. The CMS experiment will add special-purpose processors running machine-learning algorithms at blinding speed to make the Level 1 decisions.
The new particle targets
The primary goal of the LHC now is to discover new elementary particles that might give evidence for new, still-undiscovered, fundamental interactions. Some of the new particles proposed would be heavy and would decay to clusters of quarks and leptons displaying very high energy. My personal favorite for eventual discovery at the LHC is a new heavy quark, a partner to the top quark. Unfortunately, it is quite unlikely that such a particle can be discovered in Run 3. The current searches already exclude these particles up to a mass about 10 times that of the Higgs boson. A top quark partner at a slightly higher mass—which could well be there—would not appear in enough events for an unambiguous discovery. At best, the experiments would give interesting statistical hints, and even some suggestive event pictures with novel features. That will get theorists talking. The HL-LHC, in Run 4 and beyond, would be needed to confirm these suggestions.
However, there is a real opportunity in searches for weakly-coupled new particles, such as those predicted in models of the dark matter. Reactions that produce such particles have low rates, since they are not produced by the strong interactions but rather by the electromagnetic and weak interactions. Thus any increase in the data set would be helpful. The dark matter particle interacts too weakly to leave a signal in the LHC detectors. This is not a problem in itself because one can look for visible particles recoiling against the invisible emissions, in accord with Newton’s third law. But, in many models, the partners of the dark matter particle release very little visible energy, leading to very small recoil signals that cannot be recognized by the experiments’ triggers. The trigger improvements in Run 3 will improve the coverage for such subtle signals, and the increased rate will produce a sample of rarer events in which the recoiling particles are pushed out into easier view. With these improvements, the ability of ATLAS and CMS to recognize these signals will measurably increase.
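The recoil trick rests on momentum balance: the colliding particles carry essentially no momentum transverse to the beam, so the transverse momenta of the visible particles must add up to zero, and any imbalance points to something invisible. A minimal sketch, with hypothetical momenta:

```python
import math

# Minimal sketch: infer an invisible particle from momentum balance in the
# plane transverse to the beam. The visible "particles" are hypothetical;
# each entry is (px, py) in GeV.
visible = [(120.0, 15.0), (-40.0, 60.0), (-25.0, -30.0)]

# Momentum conservation: any imbalance in the visible sum is attributed to
# particles that escaped the detector unseen.
missing_px = -sum(px for px, py in visible)
missing_py = -sum(py for px, py in visible)
met = math.hypot(missing_px, missing_py)

print(f"Missing transverse momentum: {met:.1f} GeV")
# A large value is the classic dark matter signature; the Run 3 challenge is
# triggering on events where this recoil is small.
```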
“The primary goal of the LHC now is to discover new elementary particles that might give evidence for new, still-undiscovered, fundamental interactions.”
Machine learning
Though most searches for new particles target particles predicted by theorists, there is an increasing trend to search for new particles that theorists have not yet imagined. It is hopeless to ask humans to search through the whole library of LHC event pictures in hopes of discovering anomalous ones. But this could be possible with advanced “deep learning” computer algorithms that use artificial intelligence to scan through mounds of data.
Particle physicists have very challenging problems of signal classification, and so they entered early into the creation of machine-learning tools. Already in 1980, the SLAC Mark II experiment was using trainable decision trees to separate signals of gamma rays and pi mesons. Today, machine learning is used in almost every decision made by the LHC detectors, for example, the identification of bottom quarks in a sample of quark signals. These tools, though, are trained on simulation data. Simulation programs for LHC events, such as the commonly used program PYTHIA, are remarkably accurate, but still they cannot model all relevant physics. If an anomaly is detected by a machine-learning algorithm, we must ask: is this truly new physics, or just a defect in PYTHIA?
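Schematically, training such a classifier on simulation looks like the sketch below. The data are synthetic, and the two features are made-up stand-ins for real detector observables used, for example, in bottom-quark identification.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

# Hypothetical stand-in for simulation output: two observables per event,
# with label 1 for bottom-quark jets and 0 for other quark jets.
n = 10_000
signal = rng.normal(loc=[1.5, 1.0], scale=1.0, size=(n, 2))
background = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(n, 2))
X = np.vstack([signal, background])
y = np.array([1] * n + [0] * n)

# A boosted-decision-tree classifier, a modern descendant of the trainable
# decision trees mentioned above, learns to separate the two classes.
clf = GradientBoostingClassifier().fit(X, y)
print("P(bottom quark) for one new event:", clf.predict_proba([[1.2, 0.8]])[0, 1])

# The caveat in the text applies here too: the classifier is only as good
# as the simulation that generated its training labels.
```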
In Run 2, a number of algorithms were proposed for “minimally supervised classification.” For example, one might compare two data samples that would be expected to have different proportions of a reaction containing a new particle. With even this minimal hint, these algorithms can identify events in one sample that are anomalous with respect to the other. It will be very interesting to run these algorithms blindly on the new data set that becomes available in Run 3. It is quite possible that they will turn up completely unanticipated signals.
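Here is a minimal sketch of that mixed-sample idea, on synthetic data: train an ordinary classifier to tell the two samples apart using only the sample labels, with no event-by-event truth labels. Events it confidently assigns to the sample expected to be richer in the new reaction are flagged as candidate anomalies.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(1)

def mixed_sample(n, anomaly_fraction):
    """Ordinary background events plus a small admixture of 'anomalous' ones."""
    n_anom = int(n * anomaly_fraction)
    background = rng.normal(0.0, 1.0, size=(n - n_anom, 2))
    anomaly = rng.normal([2.0, 2.0], 0.5, size=(n_anom, 2))
    return np.vstack([background, anomaly])

# Two samples expected to differ only in their (unknown) anomaly fractions,
# e.g. selected by different kinematic cuts. No truth labels are needed.
sample_a = mixed_sample(5_000, anomaly_fraction=0.10)
sample_b = mixed_sample(5_000, anomaly_fraction=0.01)

X = np.vstack([sample_a, sample_b])
y = np.array([1] * len(sample_a) + [0] * len(sample_b))  # sample labels only

clf = GradientBoostingClassifier().fit(X, y)

# Events the classifier confidently assigns to the anomaly-richer sample
# are flagged for a closer look by humans.
scores = clf.predict_proba(sample_a)[:, 1]
print(f"{(scores > 0.9).sum()} candidate anomalies flagged out of {len(sample_a)}")
```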
Electrons versus muons
So far, I have discussed only the general-purpose LHC detectors ATLAS and CMS. But the LHC also hosts a specialized detector, LHCb, dedicated to studying the large samples of particles containing the bottom and charm quarks that are produced in LHC reactions. The LHCb detector has different goals from ATLAS and CMS, and so it is not built to accept the very high data rate of those detectors. But, during the shutdown, the LHCb data-taking system was completely rebuilt to take 10 times more data in Run 3 than in the previous runs.
In contrast to ATLAS and CMS, the specialized study of B mesons by LHCb has turned up some tantalizing anomalies. The most prominent of these is the observation of different rates for the decays of the B meson to a K meson plus a muon-antimuon pair versus decays to a K meson plus an electron-positron pair. In our current understanding of particle physics, the muon is a heavy version of the electron. Its mass is 200 times larger than that of the electron, but its interactions are precisely identical. This muon-electron universality has been tested in many settings. In contrast to this body of evidence, the LHCb experiment reports roughly 15% fewer of these B meson decays to muons than to electrons. Because muons and electrons are observed rather differently in the LHCb detector, the result is not statistically definitive, but it is striking nonetheless. Could new interactions, involving heavy quarks, differentiate the muon and the electron and play a role in explaining their very different masses? The observation moves muon-electron universality tests, even at ATLAS and CMS, from routine projects to the center of attention. Can the highest-energy reactions show corroborating evidence, or will they support the standard universality? We will certainly learn more in Run 3.
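In the standard notation, the comparison is usually phrased as a ratio of branching fractions, which muon-electron universality predicts to be essentially 1:

```latex
% Ratio tested by LHCb; universality predicts R_K very close to 1.
R_K \;=\; \frac{\mathcal{B}(B^+ \to K^+ \mu^+ \mu^-)}{\mathcal{B}(B^+ \to K^+ e^+ e^-)}
\;\simeq\; 1 \quad \text{(universality prediction)}, \qquad
R_K^{\text{LHCb}} \approx 0.85 .
```

The measured central value near 0.85 corresponds to the deficit of muon decays described above.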
Like all Big Science projects, the LHC advances slowly but gives access to deeply buried knowledge that cannot be obtained in any other way. Run 3 is the next stage in this progress. Please watch for the surprises it will bring.
Featured image: “View of the LHC tunnel sector 3-4” by CERN, via Wikimedia Commons (CC BY-SA 3.0)