Thanks to a supersensitive space telescope and some sophisticated supercomputing, scientists from the international Planck collaboration have made the closest reading yet of the most ancient story in our universe: the cosmic microwave background (CMB).
Today, the team released preliminary results based on the Planck observatory's first 15 months of data. Using supercomputers at the U.S. Department of Energy's (DOE) National Energy Research Scientific Computing Center (NERSC), Planck scientists have created the most detailed and accurate maps yet of the relic radiation from the big bang. The maps reveal that the universe is about 100 million years older than previously thought, with more matter and less dark energy.
"These maps are proving to be a goldmine containing stunning confirmations and new puzzles," says Martin White, a Planck scientist and physicist with the University of California, Berkeley, and Lawrence Berkeley National Laboratory (Berkeley Lab). "This data will form the cornerstone of our cosmological model for decades to come and spur new directions in research."
Decoding the Cosmos
Written in light shortly after the big bang, the CMB is a faint glow that permeates the cosmos. Studying it can help us understand how our universe was born, its nature, composition and eventual fate. "Encoded in its fluctuations are the parameters of all cosmology, numbers that describe the universe in its entirety," says Julian Borrill, a Planck collaborator and cosmologist in the Computational Research Division at Berkeley Lab.
However, CMB surveys are complex and subtle undertakings. Even with the most sophisticated detectors, scientists still need supercomputing to sift the CMB's faint signal out of a noisy universe and decode its meaning.
Hundreds of scientists from around the world study the CMB using supercomputers at NERSC, a DOE user facility based at Berkeley Lab. "NERSC supports the entire international Planck effort," says Borrill. A co-founder of the Computational Cosmology Center (C3) at the lab, Borrill has been developing supercomputing tools for CMB experiments for over a decade. The Planck observatory, a mission of the European Space Agency with significant participation from NASA, is the most challenging yet.
Parked in an orbit around a point about a million miles from Earth, Planck's 72 detectors complete a full scan of the sky once every six months or so. Observing at nine different frequencies, Planck gathers about 10,000 samples every second, or roughly a trillion samples in total for the 15 months of data included in this first release. In fact, Planck generates so much data that, unlike earlier CMB experiments, it's impossible to analyze exactly, even with NERSC's powerful supercomputers.
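A quick back-of-envelope check of those figures (using only the sampling rate and mission span quoted above) confirms the order of magnitude:

```python
# Sanity check of Planck's data volume from the figures in the text:
# ~10,000 samples per second over 15 months of observing.
SECONDS_PER_MONTH = 30 * 24 * 3600   # ~2.6 million seconds (approximate month)
months = 15
rate_hz = 10_000                     # samples per second

total_samples = rate_hz * months * SECONDS_PER_MONTH
print(f"{total_samples:.1e} samples")  # ~3.9e11, i.e. on the order of a trillion
```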
Instead, CMB scientists employ clever workarounds. Using approximate methods, they can handle the Planck data volume, but they then need to understand the uncertainties and biases those approximations have left in the results.
One particularly challenging bias comes from the instrument itself. The position and orientation of the observatory in its orbit, the particular shapes and sizes of its detectors (which vary), and even the overlap in Planck's scanning pattern all affect the data.
To account for such biases and uncertainties, researchers generate a thousand synthetic (or simulated) copies of the Planck data and apply the same analysis to these. Measuring how the approximations affect this simulated data allows the Planck team to account for their impact on the real data.
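The Monte Carlo strategy described above can be sketched in miniature. The toy "analysis" below is a deliberately simplified stand-in, not Planck's actual pipeline: a known answer is planted in many synthetic data sets, the same approximate estimator is run on each, and the offset between the average result and the truth measures the bias the approximation introduces.

```python
import numpy as np

rng = np.random.default_rng(42)

def approximate_estimator(samples):
    # Stand-in for an approximate analysis step: a naive variance
    # estimate (divides by N, not N-1), which is biased low.
    return samples.var()

true_variance = 2.0   # the "truth" planted in each synthetic data set
n_obs = 10            # samples per simulated observation
n_sims = 1000         # number of synthetic copies of the data

# Run the identical analysis on every synthetic realization, then
# compare the average result to the known input to measure the bias.
estimates = [
    approximate_estimator(rng.normal(0.0, np.sqrt(true_variance), n_obs))
    for _ in range(n_sims)
]
bias = np.mean(estimates) - true_variance
print(f"estimated bias: {bias:+.3f}")  # close to -true_variance/n_obs = -0.2
```

Once the bias is measured this way, the same correction can be applied to the result from the real data.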
With each generation of NERSC supercomputers, the Planck team has adapted its software to run on more and more processors, pushing the limits of successive systems while reducing the time it takes to run a greater number of complex calculations.
"By scaling up to tens of thousands of processors, we've reduced the time it takes to run these calculations from an impossible 1,000 years down to a few weeks," says Ted Kisner, a C3 member at Berkeley Lab and Planck scientist. In fact, the team's codes are so demanding that they're often called on to push the limits of new NERSC systems.
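The arithmetic behind that speedup is straightforward, assuming (as an idealization) perfect parallel scaling and taking "tens of thousands of processors" to mean about 20,000:

```python
# Idealized strong-scaling estimate for the figures quoted in the text.
serial_years = 1000       # quoted single-processor runtime
processors = 20_000       # "tens of thousands" (assumed value)

days = serial_years * 365 / processors
print(f"~{days:.0f} days")  # about 18 days, i.e. a few weeks
```

Real codes fall short of perfect scaling, which is part of why these runs push the limits of each new NERSC system.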
Access to the NERSC Global Filesystem and vast online and offline storage has also been key. "CMB data volumes have grown with Moore's Law over the last 15 years, so we expect a two-order-of-magnitude increase in data over the coming 15 years, too," says Borrill.
In 2007 NASA and DOE negotiated a formal interagency agreement that guaranteed Planck access to NERSC for the duration of its mission. "Without the exemplary interagency cooperation between NASA and DOE, Planck would not be doing the science it's doing today," says Charles Lawrence of NASA's Jet Propulsion Laboratory (JPL). A Planck project scientist, Lawrence leads the U.S. team for NASA.
NASA's Planck Project Office is based at JPL. JPL contributed mission-enabling technology for both of Planck's science instruments. European, Canadian and U.S. Planck scientists work together to analyze the Planck data. More information is online at http://www.nasa.gov/planck and http://www.esa.int/planck.
NERSC is supported by DOE's Office of Science.
About Berkeley Lab
Lawrence Berkeley National Laboratory addresses the world's most urgent scientific challenges by advancing sustainable energy, protecting human health, creating new materials, and revealing the origin and fate of the universe. Founded in 1931, Berkeley Lab's scientific expertise has been recognized with 13 Nobel prizes. The University of California manages Berkeley Lab for the U.S. Department of Energy's Office of Science. For more, visit http://www.lbl.gov.
About the DOE Office of Science
DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit the Office of Science website at science.energy.gov.
The National Energy Research Scientific Computing Center (NERSC) is the primary high-performance computing facility for scientific research sponsored by the U.S. Department of Energy's Office of Science. Located at Berkeley Lab, NERSC serves more than 4,000 scientists at national laboratories and universities across a full range of scientific disciplines. For more, visit http://www.nersc.gov.
About the Computational Cosmology Center (C3)
C3 is a focused collaboration of astrophysicists and computational scientists whose goals are to develop the tools, techniques and technologies to meet the analysis challenges posed by present and future cosmological data sets. For more, visit http://c3.lbl.gov.
Margie Wylie | EurekAlert!