Researchers have set a new world record for data transfer, helping to usher in the next generation of high-speed network technology. At the SuperComputing 2011 (SC11) conference in Seattle during mid-November, the international team transferred data in opposite directions at a combined rate of 186 gigabits per second (Gbps) in a wide-area network circuit. The rate is equivalent to moving two million gigabytes per day, fast enough to transfer nearly 100,000 full Blu-ray disks—each with a complete movie and all the extras—in a day.
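The headline figures can be sanity-checked with a little arithmetic. A minimal sketch, assuming 1 GB = 10^9 bytes and a 25 GB single-layer Blu-ray disk (the article does not state which capacity it used):

```python
# Sanity check of the headline transfer figures.
RATE_GBPS = 186            # combined bidirectional rate, gigabits per second
SECONDS_PER_DAY = 86_400
BLURAY_GB = 25             # assumed single-layer Blu-ray capacity, GB

bits_per_day = RATE_GBPS * 1e9 * SECONDS_PER_DAY
gigabytes_per_day = bits_per_day / 8 / 1e9   # bits -> bytes -> gigabytes

print(f"{gigabytes_per_day:,.0f} GB/day")                  # ~2,008,800 -> "two million gigabytes"
print(f"{gigabytes_per_day / BLURAY_GB:,.0f} disks/day")   # ~80,000 single-layer disks
```

The single-layer assumption gives roughly 80,000 disks per day, consistent with the article's "nearly 100,000" rounding.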
The team of high-energy physicists, computer scientists, and network engineers was led by the California Institute of Technology (Caltech), the University of Victoria, the University of Michigan, the European Center for Nuclear Research (CERN), and Florida International University, together with other partners.
According to the researchers, the achievement will help establish new ways to transport the increasingly large quantities of data that traverse continents and oceans via global networks of optical fibers. These new methods are needed for the next generation of network technology—which allows transfer rates of 40 and 100 Gbps—that will be built in the next couple of years.
"Our group and its partners are showing how massive amounts of data will be handled and transported in the future," says Harvey Newman, professor of physics and head of the high-energy physics (HEP) team. "Having these tools in our hands allows us to engage in realizable visions others do not have. We can see a clear path to a future others cannot yet imagine with any confidence."
Using a 100-Gbps circuit set up by Canada's Advanced Research and Innovation Network (CANARIE) and BCNET, a non-profit, shared IT services organization, the team was able to reach transfer rates of 98 Gbps between the University of Victoria Computing Centre located in Victoria, British Columbia, and the Washington State Convention Center in Seattle. With a simultaneous data rate of 88 Gbps in the opposite direction, the team reached a sustained two-way data rate of 186 Gbps between the two data centers, breaking the team's previous peak-rate record of 119 Gbps set in 2009.
In addition, partners from the University of Florida, the University of California at San Diego, Vanderbilt University, Brazil (Rio de Janeiro State University and the São Paulo State University), and Korea (Kyungpook National University and the Korean Institute for Science and Technology Information) helped with a larger demonstration, transferring massive amounts of data between the Caltech booth at the SC11 conference and other locations within the United States, as well as in Brazil and Korea.
The fast transfer rate is also crucial for dealing with the tremendous amounts of data coming from the Large Hadron Collider (LHC) at CERN, the particle accelerator that physicists hope will help them discover new particles and better understand the nature of matter, space, and time, solving some of the biggest mysteries of the universe. More than 100 petabytes (more than four million Blu-ray disks) of data have been processed, distributed, and analyzed using a global grid of 300 computing and storage facilities located at laboratories and universities around the world, and the data volume is expected to rise a thousand-fold as physicists crank up the collision rates and energies at the LHC.
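The disk-count comparison above is straightforward to verify. A quick sketch, again assuming 25 GB single-layer Blu-ray disks and 1 PB = 10^6 GB:

```python
# LHC data volume expressed in Blu-ray disks.
PETABYTES = 100            # data processed and distributed so far
GB_PER_PB = 1e6
BLURAY_GB = 25             # assumed single-layer capacity, GB

disks = PETABYTES * GB_PER_PB / BLURAY_GB
print(f"{disks:,.0f} Blu-ray disks")   # 4,000,000 -> "more than four million"
```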
"Enabling scientists anywhere in the world to work on the LHC data is a key objective, bringing the best minds together to work on the mysteries of the universe," says David Foster, the deputy IT department head at CERN.
"The 100-Gbps demonstration at SC11 is pushing the limits of network technology by showing that it is possible to transfer petascale particle physics data in a matter of hours to anywhere around the world," adds Randall Sobie, a research scientist at the Institute of Particle Physics in Canada and team member.
The key to discovery, the researchers say, is in picking out the rare signals that may indicate new physics discoveries from a sea of potentially overwhelming background noise caused by already understood particle interactions. To do this, individual physicists and small groups located around the world must repeatedly access—and sometimes extract and transport—multiterabyte data sets on demand from petabyte data stores. That's equivalent to grabbing hundreds of Blu-ray movies all at once from a pool of hundreds of thousands. The HEP team hopes that the demonstrations at SC11 will pave the way toward more effective distribution and use of the masses of LHC data for discoveries.
"By sharing our methods and tools with scientists in many fields, we hope that the research community will be well positioned to further enable their discoveries, taking full advantage of 100 Gbps networks as they become available," Newman says. "In particular, we hope that these developments will afford physicists and young students the opportunity to participate directly in the LHC's next round of discoveries as they emerge."
More information about the demonstration can be found at http://supercomputing.caltech.edu.
This work was supported by the U.S. Department of Energy Office of Science and the National Science Foundation, in cooperation with the funding agencies of the international partners. Equipment and support were also provided by the team's industry partners: CIENA, Brocade, Mellanox, Dell and Force10 (now Dell/Force10), and Supermicro.

Contact:
Sonia Chernobieff | EurekAlert!