Physicists suggest a smaller secondary inflationary period in the moments after the Big Bang could account for the abundance of the mysterious matter
Standard cosmology -- that is, the Big Bang Theory with its early period of exponential growth known as inflation -- is the prevailing scientific model for our universe, in which the entirety of space and time ballooned out from a very hot, very dense point into a homogeneous and ever-expanding vastness. This theory accounts for many of the physical phenomena we observe. But what if that's not all there was to it?
A new theory from physicists at the U.S. Department of Energy's Brookhaven National Laboratory, Fermi National Accelerator Laboratory, and Stony Brook University, which will be published online on January 18 in Physical Review Letters, suggests a shorter secondary inflationary period that could account for the amount of dark matter estimated to exist throughout the cosmos.
"In general, a fundamental theory of nature can explain certain phenomena, but it may not always end up giving you the right amount of dark matter," said Hooman Davoudiasl, group leader in the High-Energy Theory Group at Brookhaven National Laboratory and an author on the paper. "If you come up with too little dark matter, you can suggest another source, but having too much is a problem."
Measuring the amount of dark matter in the universe is no easy task. It is dark after all, so it doesn't interact in any significant way with ordinary matter. Nonetheless, the gravitational effects of dark matter give scientists a good idea of how much of it is out there. The best estimates indicate that it makes up about a quarter of the mass-energy budget of the universe, while ordinary matter -- which makes up the stars, our planet, and us -- comprises just 5 percent. Dark matter is thus the dominant form of matter in the universe, which leads physicists to devise theories and experiments to explore its properties and understand how it originated.
Some theories that elegantly explain perplexing oddities in physics -- for example, the inordinate weakness of gravity compared to other fundamental interactions such as the electromagnetic, strong nuclear, and weak nuclear forces -- cannot be fully accepted because they predict more dark matter than empirical observations can support.
This new theory solves that problem. Davoudiasl and his colleagues add a step to the commonly accepted events at the inception of space and time.
In standard cosmology, the exponential expansion of the universe called cosmic inflation began perhaps as early as 10^-35 seconds after the beginning of time -- that's a decimal point followed by 34 zeros before a 1. This explosive expansion of the entirety of space lasted mere fractions of a fraction of a second, eventually leading to a hot universe, followed by a cooling period that has continued until the present day. Then, when the universe was just seconds to minutes old -- that is, cool enough -- the formation of the lighter elements began. Between those milestones, there may have been other inflationary interludes, said Davoudiasl.
"They wouldn't have been as grand or as violent as the initial one, but they could account for a dilution of dark matter," he said.
In the beginning, when temperatures soared past billions of degrees in a relatively small volume of space, dark matter particles could run into each other and annihilate upon contact, transferring their energy into standard constituents of matter: particles like electrons and quarks. But as the universe continued to expand and cool, dark matter particles encountered one another far less often, and the annihilation rate couldn't keep up with the expansion rate.
"At this point, the abundance of dark matter is now baked in the cake," said Davoudiasl. "Remember, dark matter interacts very weakly. So, a significant annihilation rate cannot persist at lower temperatures. Self-annihilation of dark matter becomes inefficient quite early, and the amount of dark matter particles is frozen."
However, the weaker the dark matter interactions -- that is, the less efficient the annihilation -- the higher the final abundance of dark matter particles would be. As experiments place ever more stringent constraints on the strength of dark matter interactions, some current theories end up overestimating the quantity of dark matter in the universe. To bring theory into alignment with observations, Davoudiasl and his colleagues suggest that another inflationary period took place, powered by interactions in a "hidden sector" of physics. This second, milder period of inflation, characterized by a rapid increase in volume, would dilute primordial particle abundances, potentially leaving the universe with the density of dark matter we observe today.
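The diluting power of even a brief second inflation follows from volume scaling alone: number density falls as 1/volume, and N e-folds of inflation stretch every length by a factor e^N, so volume grows by e^(3N). A minimal sketch (the e-fold counts chosen here are illustrative, not taken from the paper):

```python
import math

def dilution_factor(n_efolds: float) -> float:
    """Relic number density falls as 1/volume; N e-folds of inflation
    grow each length scale by e^N, hence volume by e^(3N)."""
    return math.exp(-3.0 * n_efolds)

# Even a handful of extra e-folds dilutes a relic abundance dramatically.
for N in (1, 2, 5):
    print(f"N = {N}: density reduced by factor {dilution_factor(N):.2e}")
```

Five extra e-folds, for instance, already thin a frozen-in abundance by more than a factor of a million, which is why a modest secondary inflation can reconcile an overabundant prediction with observation.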
"It's definitely not the standard cosmology, but you have to accept that the universe may not be governed by things in the standard way that we thought," he said. "But we didn't need to construct something complicated. We show how a simple model can achieve this short amount of inflation in the early universe and account for the amount of dark matter we believe is out there."
Proving the theory is another thing entirely. Davoudiasl said there may be a way to look for at least the very feeblest of interactions between the hidden sector and ordinary matter.
"If this secondary inflationary period happened, it could be characterized by energies within the reach of experiments at accelerators such as the Relativistic Heavy Ion Collider (RHIC) and the Large Hadron Collider," he said. Only time will tell if signs of a hidden sector show up in collisions within these colliders, or in other experimental facilities.
Brookhaven National Laboratory is supported by the Office of Science of the U.S. Department of Energy. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit science.energy.gov.
One of ten national laboratories overseen and primarily funded by the Office of Science of the U.S. Department of Energy (DOE), Brookhaven National Laboratory conducts research in the physical, biomedical, and environmental sciences, as well as in energy technologies and national security. Brookhaven Lab also builds and operates major scientific facilities available to university, industry and government researchers. Brookhaven is operated and managed for DOE's Office of Science by Brookhaven Science Associates, a limited-liability company founded by the Research Foundation for the State University of New York on behalf of Stony Brook University, the largest academic user of Laboratory facilities, and Battelle, a nonprofit applied science and technology organization.
Chelsea Whyte | EurekAlert!