Operations at the Large Hadron Collider (LHC) recently restarted following an upgrade, and it can now collide particles at nearly twice the energy used to discover the Higgs boson, the long-hypothesised particle that gives other fundamental particles their mass.
The higher energy and larger volume of data should allow scientists to probe the properties of the Higgs boson, and potentially to discover new elementary particles. However, with the collider producing up to one billion proton collisions per second, the challenge is identifying which collisions may have produced interesting particles, and singling them out for further study.
Dr Alex Tapper from the Department of Physics at Imperial College London, who led the project in collaboration with the University of Bristol, said: “When data is created so quickly and in such large amounts, you only have about a millionth of a second to decide if each collision is interesting enough to be stored and analysed. A normal PC can’t work that fast – so we developed our own computer to do it.”
The computer, known as a ‘trigger system’, has been incorporated into the Compact Muon Solenoid (CMS) experiment – one of the major particle detectors at the Large Hadron Collider.
The upgraded general-purpose CMS detector pulls together all the data produced in the microsecond after each particle collision. The new system is designed to be flexible, so that it will be ready for when the experiment runs with even more particles in the future.
Dr Andrew Rose from the Department of Physics at Imperial, who created and installed the hardware with engineers at CERN, said: “At the higher energies and particle densities of the upgraded collider, CMS will be able to search for rarer events, and the trigger system is vital to pick out those interesting events from the huge number of uninteresting ones.
“The upgraded system has been five years in the making and is incredibly powerful at processing data: our system is the size of a microwave oven, yet processes the same volume of data as the entire internet in 2007. It’ll be really exciting to see what it finds!”
The basic system can also be adapted to other scientific experiments that handle large quantities of data. Similar systems designed by the team are due to be installed in experiments including the COMET experiment in Japan, which studies the properties of subatomic particles called muons, and the SOLiD experiment in Belgium, which investigates neutrinos.