Andreas Hirstius, manager of CERN Openlab and the CERN School of Computing, explains in November’s Physics World how computer scientists have risen to the challenge of dealing with this unprecedented volume of data.
When CERN staff first considered, in the mid-1990s, how they might deal with the large volume of data the huge collider would produce when its two beams of protons collide, a single gigabyte of disk space still cost a few hundred dollars and CERN’s total external connectivity was equivalent to just one of today’s broadband connections.
It quickly became clear that computing power at CERN, even taking Moore’s law into account, would fall significantly short of that required to analyse LHC data. The solution, it transpired during the 1990s, was to turn to "high-throughput computing", where the focus is not on shifting data as quickly as possible from A to B but rather on shifting as much data as possible between those two points.
High-throughput computing is ideal for particle physics because the data produced in the millions of proton-proton collisions are all independent of one another, and can therefore be handled independently. So, rather than using a massive all-in-one mainframe supercomputer to analyse the results, the data can be farmed out to separate computers, all connected via a network.
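Because no collision event depends on any other, the workload is "embarrassingly parallel" and farms out trivially across machines. A minimal sketch of this idea using Python's standard library; the event records and the analysis function are invented for illustration, not CERN's actual software:

```python
# Each proton-proton collision ("event") is independent, so events can be
# analysed in any order on any machine -- the property high-throughput
# computing exploits. The event data and analysis below are toy examples.
from concurrent.futures import ThreadPoolExecutor

def analyse(event):
    """Toy stand-in for a real reconstruction/analysis job:
    sum the (fake) energy deposits recorded for this event."""
    return sum(event["deposits"])

events = [{"id": i, "deposits": [i, i + 1]} for i in range(4)]

# Because no event depends on another, the work distributes trivially;
# in the real Grid, each "worker" is a computer somewhere in the world.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(analyse, events))

print(results)  # [1, 3, 5, 7]
```

The same map-over-independent-events pattern scales from a laptop thread pool to thousands of Grid nodes, which is why a network of ordinary computers can stand in for a single supercomputer here.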
From here sprang the LHC Grid. The Grid, which was officially inaugurated last month, is a tiered structure centred on CERN (Tier-0), which is connected by superfast fibre links to 11 Tier-1 centres at places like the Rutherford Appleton Laboratory (RAL) in the UK and Fermilab in the US. More than one CD's worth of data (about 700 MB) can be sent down these fibres to each of the Tier-1 centres every second.
Tier-1 centres then feed another 250 regional Tier-2 centres, which are in turn accessed by individual researchers through university computer clusters, desktops and laptops (Tier-3).
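To put the quoted per-link figure in perspective, a back-of-the-envelope calculation: the 700 MB/s per Tier-1 link is from the article above, while the aggregate and per-day totals below are simple derived arithmetic, not figures from CERN:

```python
# Quoted in the article: about 700 MB/s down the fibre to EACH of the
# 11 Tier-1 centres. The totals below are derived arithmetic only.
per_link_mb_s = 700          # MB/s to one Tier-1 centre
tier1_centres = 11

aggregate_mb_s = per_link_mb_s * tier1_centres      # total outbound rate
per_day_tb = aggregate_mb_s * 86_400 / 1_000_000    # MB over a day -> TB

print(aggregate_mb_s)        # 7700 MB/s
print(round(per_day_tb, 1))  # 665.3 TB/day if all links ran flat out
```

Even as an upper bound, hundreds of terabytes per day makes clear why a tiered fan-out, rather than shipping everything to every researcher, was the only workable design.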
As Andreas Hirstius writes, “The LHC challenge presented to CERN’s computer scientists was as big as the challenges to its engineers and physicists. The computer scientists managed to develop a computing infrastructure that can handle huge amounts of data, thereby fulfilling all of the physicists’ requirements and in some cases even going beyond them.”
Also in this issue:
• President George W Bush’s science adviser, the physicist John H Marburger, asks whether Bush’s eight years in office have been good for science in the US.
• Brian Cox may be the media-friendly face of particle physics, but how does the former D:Ream pop star, now a Manchester University physics professor, find the time for both research and his outreach work?
• Beauty and the beast: in his 100th column for Physics World, Robert P Crease asks whether CERN’s Large Hadron Collider, the biggest experiment of all time, can be dubbed “beautiful”.
Joe Winters | alfa