Andreas Hirstius, manager of CERN Openlab and the CERN School of Computing, explains in November’s Physics World how computer scientists have risen to the challenge of dealing with this unprecedented volume of data.
When, in the mid-1990s, CERN staff first considered how they might deal with the large volume of data that the huge collider would produce when its two beams of protons collide, a single gigabyte of disk space still cost a few hundred dollars and CERN’s total external connectivity was equivalent to just one of today’s broadband connections.
It quickly became clear that computing power at CERN, even taking Moore’s Law into account, would be significantly less than that required to analyse LHC data. The solution, it transpired during the 1990s, was to turn to "high-throughput computing", where the focus is not on shifting data as quickly as possible from A to B but rather on shifting as much data as possible between those two points.
High-throughput computing is ideal for particle physics because the data produced in the millions of proton-proton collisions are all independent of one another, and can therefore be handled independently. So, rather than using a massive all-in-one mainframe supercomputer to analyse the results, the data can be sent to separate computers, all connected via a network.
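Because each collision event can be analysed without reference to any other, the workload is "embarrassingly parallel": events can simply be farmed out to independent workers. A minimal sketch of this pattern, using Python's standard `multiprocessing` pool and a hypothetical toy analysis function (the real reconstruction software is far more involved):

```python
from multiprocessing import Pool

def analyse_event(event):
    """Toy per-event analysis: sum the transverse momenta of the
    tracks in one collision event. Each event is self-contained,
    so any worker can process any event in any order."""
    return sum(event["track_pt"])

if __name__ == "__main__":
    # Hypothetical events: each holds a list of per-track momenta (GeV).
    events = [
        {"track_pt": [1.0, 3.0]},
        {"track_pt": [0.5]},
        {"track_pt": [2.0, 2.0, 2.0]},
    ]
    with Pool() as pool:
        # Events are distributed across worker processes; results come
        # back in input order even though processing is concurrent.
        results = pool.map(analyse_event, events)
    print(results)
```

The same independence is what lets the Grid scatter data across hundreds of centres rather than funnel it through one supercomputer.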
From this idea sprang the LHC Grid. The Grid, which was officially inaugurated last month, is a tiered structure centred on CERN (Tier-0), which is connected by superfast fibre links to 11 Tier-1 centres at places like the Rutherford Appleton Laboratory (RAL) in the UK and Fermilab in the US. More than one CD's worth of data (about 700 MB) can be sent down these fibres to each of the Tier-1 centres every second.
Tier-1 centres then feed down to another 250 regional Tier-2 centres, which are in turn accessed by individual researchers through university computer clusters, desktops and laptops (Tier-3).
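The figures quoted above allow some back-of-the-envelope arithmetic on the Tier-0 fan-out. The derived totals below are not stated in the article; they simply follow from the quoted rate of about 700 MB per second to each of the 11 Tier-1 centres:

```python
# Illustrative throughput arithmetic for the Grid's tiered fan-out.
MB = 10**6  # bytes (decimal megabyte)

tier1_centres = 11
rate_per_link = 700 * MB           # bytes/second to each Tier-1 centre

# Aggregate outbound rate from Tier-0 if all 11 links run at full speed.
aggregate_rate = tier1_centres * rate_per_link
print(f"Aggregate Tier-0 -> Tier-1 rate: {aggregate_rate / 10**9:.1f} GB/s")

# Data shipped to a single Tier-1 centre over one day at that rate.
seconds_per_day = 24 * 3600
per_centre_daily = rate_per_link * seconds_per_day
print(f"Per Tier-1 centre per day: {per_centre_daily / 10**12:.2f} TB")
```

At the quoted rate, each Tier-1 centre could receive on the order of tens of terabytes per day, which is why the fibre links and the tiered distribution are central to the design.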
As Andreas Hirstius writes, “The LHC challenge presented to CERN’s computer scientists was as big as the challenges to its engineers and physicists. The computer scientists managed to develop a computing infrastructure that can handle huge amounts of data, thereby fulfilling all of the physicists’ requirements and in some cases even going beyond them.”
Also in this issue:
• President George W Bush’s science adviser, the physicist John H Marburger, asks whether Bush’s eight years in office have been good for science in the US.
• Brian Cox may be the media-friendly face of particle physics, but how does the former D:Ream pop star, now a Manchester University physics professor, find the time for both research and his outreach work?
• Beauty and the beast: in his 100th column for Physics World, Robert P Crease asks whether CERN’s Large Hadron Collider, the biggest experiment of all time, can be dubbed “beautiful”.
Joe Winters | alfa
Move over, lasers: Scientists can now create holograms from neutrons, too
21.10.2016 | National Institute of Standards and Technology (NIST)
Finding the lightest superdeformed triaxial atomic nucleus
20.10.2016 | The Henryk Niewodniczanski Institute of Nuclear Physics Polish Academy of Sciences
Researchers from the Institute for Quantum Computing (IQC) at the University of Waterloo led the development of a new extensible wiring technique capable of controlling superconducting quantum bits, representing a significant step towards the realization of a scalable quantum computer.
"The quantum socket is a wiring method that uses three-dimensional wires based on spring-loaded pins to address individual qubits," said Jeremy Béjanin, a PhD...
In a paper in Scientific Reports, a research team at Worcester Polytechnic Institute describes a novel light-activated phenomenon that could become the basis for applications as diverse as microscopic robotic grippers and more efficient solar cells.
A research team at Worcester Polytechnic Institute (WPI) has developed a revolutionary, light-activated semiconductor nanocomposite material that can be used...
By forcefully embedding two silicon atoms in a diamond matrix, Sandia researchers have demonstrated for the first time on a single chip all the components needed to create a quantum bridge to link quantum computers together.
"People have already built small quantum computers," says Sandia researcher Ryan Camacho. "Maybe the first useful one won't be a single giant quantum computer...
COMPAMED has become the leading international marketplace for suppliers of medical manufacturing. The trade fair, which takes place every November and is co-located with MEDICA in Düsseldorf, has been growing steadily in recent years, showing that medical technology remains a rapidly expanding market.
In 2016, the joint pavilion by the IVAM Microtechnology Network, the Product Market “High-tech for Medical Devices”, will be located in Hall 8a again and will...
'Ferroelectric' materials can switch between different states of electrical polarization in response to an external electric field. This flexibility means they show promise for many applications, for example in electronic devices and computer memory. Current ferroelectric materials are highly valued for their thermal and chemical stability and rapid electro-mechanical responses, but creating a material that is scalable down to the tiny sizes needed for technologies like silicon-based semiconductors (Si-based CMOS) has proven challenging.
Now, Hiroshi Funakubo and co-workers at the Tokyo Institute of Technology, in collaboration with researchers across Japan, have conducted experiments to...