Thanks to a supersensitive space telescope and some sophisticated supercomputing, scientists from the international Planck collaboration have made the closest reading yet of the most ancient story in our universe: the cosmic microwave background (CMB).
Today, the team released preliminary results based on the Planck observatory's first 15 months of data. Using supercomputers at the U.S. Department of Energy's (DOE) National Energy Research Scientific Computing Center (NERSC), Planck scientists have created the most detailed and accurate maps yet of the relic radiation from the big bang. The maps reveal that the universe is about 100 million years older than previously thought, with more matter and less dark energy.
"These maps are proving to be a goldmine containing stunning confirmations and new puzzles," says Martin White, a Planck scientist and physicist with University of California Berkeley and at Lawrence Berkeley National Laboratory (Berkeley Lab). "This data will form the cornerstone of our cosmological model for decades to come and spur new directions in research."
Decoding the Cosmos
Written in light shortly after the big bang, the CMB is a faint glow that permeates the cosmos. Studying it can help us understand how our universe was born, its nature, composition and eventual fate. "Encoded in its fluctuations are the parameters of all cosmology, numbers that describe the universe in its entirety," says Julian Borrill, a Planck collaborator and cosmologist in the Computational Research Division at Berkeley Lab.
However, CMB surveys are complex and subtle undertakings. Even with the most sophisticated detectors, scientists still need supercomputing to sift the CMB's faint signal out of a noisy universe and decode its meaning.
Hundreds of scientists from around the world study the CMB using supercomputers at NERSC, a DOE user facility based at Berkeley Lab. "NERSC supports the entire international Planck effort," says Borrill. A co-founder of the Computational Cosmology Center (C3) at the lab, Borrill has been developing supercomputing tools for CMB experiments for over a decade. The Planck observatory, a mission of the European Space Agency with significant participation from NASA, is the most challenging yet.
Parked in an artificial orbit about 800,000 miles away from Earth, Planck's 72 detectors complete a full scan of the sky once every six months or so. Observing at nine different frequencies, Planck gathers about 10,000 samples every second, or a trillion samples in total for the 15 months of data included in this first release. In fact, Planck generates so much data that, unlike earlier CMB experiments, it's impossible to analyze exactly, even with NERSC's powerful supercomputers.
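Those figures are easy to sanity-check. A back-of-envelope calculation (a rough sketch in Python; real per-detector sampling rates vary by frequency band, so the numbers here are just the aggregate values quoted above) reproduces the order of magnitude:

    # Rough sanity check of Planck's quoted data volume (illustrative only).
    SAMPLES_PER_SECOND = 10_000   # aggregate rate across all detectors
    SECONDS_PER_DAY = 86_400
    MISSION_DAYS = 15 * 30        # ~15 months of data in this release

    total_samples = SAMPLES_PER_SECOND * SECONDS_PER_DAY * MISSION_DAYS
    print(f"~{total_samples:.1e} samples")  # ~3.9e+11, of order the quoted trillion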
Instead, CMB scientists employ clever workarounds. Using approximate methods, they can handle the Planck data volume, but they then need to understand the uncertainties and biases those approximations leave in the results.
One particularly challenging source of bias is the instrument itself. The position and orientation of the observatory in its orbit, the particular shapes and sizes of the detectors (which vary from detector to detector) and even the overlap in Planck's scanning pattern all affect the data.
To account for such biases and uncertainties, researchers generate a thousand synthetic (or simulated) copies of the Planck data and apply the same analysis to these. Measuring how the approximations affect this simulated data allows the Planck team to account for their impact on the real data.
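In outline, this Monte Carlo approach looks something like the following sketch (hypothetical, heavily simplified Python; the function and variable names are illustrative stand-ins, not the actual Planck pipeline):

    import numpy as np

    rng = np.random.default_rng(42)

    def approximate_analysis(timestream):
        """Stand-in for an approximate analysis step (illustrative only)."""
        return timestream.mean()  # a deliberately crude estimator

    true_signal = 1.0
    n_realizations = 1000  # the article cites a thousand synthetic copies

    # Generate synthetic timestreams with a known input signal plus noise,
    # and push each through the same approximate analysis as the real data...
    estimates = [
        approximate_analysis(true_signal + rng.normal(0.0, 0.1, size=10_000))
        for _ in range(n_realizations)
    ]

    # ...then measure the bias and scatter the approximation introduces.
    bias = np.mean(estimates) - true_signal
    scatter = np.std(estimates)
    print(f"bias ~ {bias:.2e}, scatter ~ {scatter:.2e}")

Because the input signal of each synthetic copy is known exactly, any systematic offset in the recovered estimates can be attributed to the approximations and then subtracted from, or folded into the error bars of, the real-data results.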
With each generation of NERSC supercomputers, the Planck team has adapted its software to run on more and more processors, pushing the limits of successive systems while reducing the time it takes to run a greater number of complex calculations.
"By scaling up to tens of thousands of processors, we've reduced the time it takes to run these calculations from an impossible 1,000 years down to a few weeks," says Ted Kisner, a C3 member at Berkeley Lab and Planck scientist. In fact, the team's codes are so demanding that they're often called on to push the limits of new NERSC systems.
Access to the NERSC Global Filesystem and vast online and offline storage has also been key. "CMB data volumes have grown with Moore's Law over the last 15 years, so we expect a two order-of-magnitude increase in data over the coming 15 years, too," says Borrill.
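That extrapolation follows from compounding Moore's-Law growth over 15 years. A quick check (assuming a doubling time of roughly 18 to 24 months) shows why two orders of magnitude is the conservative end of the range:

    for doubling_years in (1.5, 2.0):
        growth = 2 ** (15 / doubling_years)
        print(f"doubling every {doubling_years} yr -> {growth:,.0f}x over 15 years")
    # 18-month doublings give ~1,024x (three orders of magnitude);
    # 24-month doublings give ~181x (about two orders of magnitude).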
In 2007 NASA and DOE negotiated a formal interagency agreement that guaranteed Planck access to NERSC for the duration of its mission. "Without the exemplary interagency cooperation between NASA and DOE, Planck would not be doing the science it's doing today," says Charles Lawrence of NASA's Jet Propulsion Laboratory (JPL). A Planck project scientist, Lawrence leads the U.S. team for NASA.
NASA's Planck Project Office is based at JPL. JPL contributed mission-enabling technology for both of Planck's science instruments. European, Canadian and U.S. Planck scientists work together to analyze the Planck data. More information is online at http://www.nasa.gov/planck and http://www.esa.int/planck.
NERSC is supported by DOE's Office of Science.
About Berkeley Lab
Lawrence Berkeley National Laboratory addresses the world's most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab's scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the U.S. Department of Energy's Office of Science. For more, visit http://www.lbl.gov.
About the DOE Office of Science
DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit the Office of Science website at science.energy.gov.
About NERSC
The National Energy Research Scientific Computing Center (NERSC) is the primary high-performance computing facility for scientific research sponsored by the U.S. Department of Energy's Office of Science. Located at Berkeley Lab, NERSC serves more than 4,000 scientists at national laboratories and universities across a full range of scientific disciplines. For more, visit http://www.nersc.gov.
About the Computational Cosmology Center (C3)
C3 is a focused collaboration of astrophysicists and computational scientists whose goals are to develop the tools, techniques and technologies to meet the analysis challenges posed by present and future cosmological data sets. For more, visit http://c3.lbl.gov.
Margie Wylie | EurekAlert!