Tackling the big data challenge of particle physics experiments

A team of UK scientists are developing software to utilise big data sets collected from particle physics experiments, particularly those at the Large Hadron Collider.

Over recent years, the volume of data produced by large-scale physics experiments has been rising, and current software has been struggling to keep up with processing all of the collected data. Advances in computing are therefore vital for the future of particle physics experiments.

For the first time, a team of UK researchers have been given funding by the Science and Technology Facilities Council (STFC) to develop a software-based project.

Physicists at the University of Warwick and Monash University are collaborating on the development of the software, as part of the Monash-Warwick Alliance, alongside scientists from the Universities of Sheffield and Manchester.

The new software will be able to crunch the big data sets that will be produced this decade at the Large Hadron Collider (LHC) at CERN and at high-energy neutrino experiments such as DUNE and Hyper-Kamiokande.

Dr Ben Morgan from the University of Warwick Department of Physics said: “Software and computing are critical tools for scientific research with the ever-increasing amount of data that experiments are recording, especially so in high energy physics to maximise the measurements we can make and the discoveries these may lead to. Our work at the University of Warwick in cooperation with Monash University, CERN, and other stakeholders on reducing the CPU time required to simulate particle collisions and how our detectors record them will address one of the most complex and computationally expensive areas for HEP.

“Increasing the efficiency of software used in these areas will allow us to maximise the science we can do whilst reducing our resource footprint, for HEP as well as fields and industries that utilise these software packages for their research such as medical and space physics.”

Professor Davide Costanzo, the Principal Investigator (PI) based at the University of Sheffield, added: “Modern particle physics experiments are capable of producing an exabyte of real and simulated data every year. Modern software tools are needed to process the data and produce the physics results and discoveries that we, particle physicists, are so proud of. This is central to the exploitation of particle physics experiments in the decades to come.”

To put this into context, storing one exabyte of data would require nearly one million powerful home computers. Without new software to collect and crunch the data produced by the LHC, the computing resources needed are projected to increase six-fold over the next decade.
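
As a rough check of that comparison (an illustration added here, not a figure from the researchers), one exabyte is 10^18 bytes under SI prefixes, while a powerful home computer might hold roughly one terabyte (10^12 bytes) of storage, which works out to about one million machines. A minimal sketch of the arithmetic in Python, with both figures as assumed nominal values:

    # Back-of-the-envelope check of the "one exabyte ~ one million home computers" comparison.
    # Both figures below are assumed nominal values (SI prefixes), not numbers from the article.
    EXABYTE_BYTES = 10**18          # 1 EB
    HOME_PC_STORAGE_BYTES = 10**12  # ~1 TB of storage per powerful home computer (assumption)

    computers_needed = EXABYTE_BYTES // HOME_PC_STORAGE_BYTES
    print(f"Home computers needed to store 1 EB: {computers_needed:,}")  # -> 1,000,000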

The research team will develop more efficient software that reduces the use of computing resources, cuts hardware costs and consumes less electricity, while ensuring that the full capabilities of particle physics experiments are exploited.

Conor Fitzpatrick, the Deputy PI and UKRI Future Leaders Fellow based at the University of Manchester, said: “The data rates we expect from the LHC upgrades and future experimental infrastructure represent an enormous challenge: if we are to maximally exploit these state-of-the-art machines, we need to develop new and cost-effective ways to collect and process the data they will generate.”
