Scientists begin modeling universe with Einstein's full theory of general relativity

27.06.2016

Researchers ditch approximations, find small-scale structures produce important effects using new computer codes

Research teams on both sides of the Atlantic have shown that precisely modeling the universe and its contents will change the detailed understanding of how the universe has evolved and how structure has grown within it.


In a simulation of the universe without commonly made simplifications, galaxy profiles float atop a grid representing the spacetime background shaped by the distribution of matter. Regions of blue color contain more matter, which generates a deeper gravitational potential. Regions devoid of matter, darker in color, have a shallower potential.

Credit: James Mertens

One hundred years after Einstein introduced general relativity, it remains the best theory of gravity, the researchers say, consistently passing high-precision tests in the solar system and successfully predicting new phenomena such as gravitational waves, which were recently discovered by the Laser Interferometer Gravitational-Wave Observatory.

The equations of general relativity, unfortunately, are notoriously difficult to solve. For the past century, physicists have used a variety of assumptions and simplifications in order to apply Einstein's theory to the universe.

Down on Earth, that's something like averaging the music made by a symphony: the audience would hear a single averaged note that keeps the overall beat and grows generally louder and softer, rather than the individual notes and rhythms of each of the orchestra's instruments.

To capture those details and their effects, U.S. and European teams each wrote computer codes that will eventually lead to the most accurate possible models of the universe and provide new insights into gravity and its effects.

While simulations of the universe and the structures within it have driven scientific discovery for decades, earlier codes relied on simplifications or assumptions. These two codes are the first to use Einstein's complete theory of general relativity to account for the effects of the clumping of matter in some regions and the dearth of matter in others.

Both groups of physicists were trying to answer the question of whether small-scale structures in the universe produce effects on larger distance scales. Both confirmed that they do, though neither found the qualitative changes in the expansion of the universe that some scientists had predicted.

"Both we and the other group examine the universe using the full theory of general relativity, and have therefore been able to create more accurate models of physical processes than have been done before," said James Mertens, a physics PhD student at Case Western Reserve University who took the lead in developing and implementing the numerical techniques for the U.S. team.

Mertens worked with John T. Giblin Jr., the Harvey F. Lodish Development Professor of Natural Science at Kenyon College and an adjunct associate professor of physics at Case Western Reserve; and Glenn Starkman, professor of physics and director of the Institute for the Science of Origins at Case Western Reserve. They submitted two manuscripts describing their work to the arXiv preprint website on Nov. 3, 2015.

Less than two weeks later, Marco Bruni, reader in cosmology and gravitation at the University of Portsmouth, in England, and Eloisa Bentivegna, Senior Researcher and Rita Levi Montalcini Fellow at the University of Catania, Italy, submitted a similar study.

Letters by the two groups appear back-to-back in the June 24 issue of Physical Review Letters, and the U.S. group has a second paper giving more of the details in the issue of Physical Review D published the same day. The work is highlighted as an Editors' Suggestion by Physical Review Letters and Physical Review D and in a Synopsis on the American Physical Society's Physics website.

The researchers say computers employing the full power of general relativity are the key to producing more accurate results and perhaps new or deeper understanding.

"No one has modeled the full complexity of the problem before," Starkman said. "These papers are an important step forward, using the full machinery of general relativity to model the universe, without unwarranted assumptions of symmetry or smoothness. The universe doesn't make these assumptions, neither should we."

Both groups independently created software to solve the Einstein Field Equations, which describe the complicated interrelationships between the contents of the universe and the curvature of space and time, at billions of places and times over the history of the universe.
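For reference, the Einstein field equations relate the curvature of spacetime (the left-hand side) to its matter and energy content (the right-hand side). In standard notation,

    \[ G_{\mu\nu} + \Lambda g_{\mu\nu} = \frac{8\pi G}{c^4} T_{\mu\nu}, \]

where G_{\mu\nu} is the Einstein curvature tensor, g_{\mu\nu} the spacetime metric, \Lambda the cosmological constant and T_{\mu\nu} the stress-energy tensor of the universe's contents. Written out, these are ten coupled, nonlinear partial differential equations, which is why exact solutions exist only for highly symmetric situations and why numerical codes are needed for a realistic, lumpy universe.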

Comparing the outcomes of these numerical simulations of the correct nonlinear dynamics to the outcomes of traditional simplified linear models, the researchers found that approximations break down.
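Roughly speaking, the traditional linear treatment writes the metric as a small perturbation h_{\mu\nu} on top of a smooth background \bar{g}_{\mu\nu},

    \[ g_{\mu\nu} = \bar{g}_{\mu\nu} + h_{\mu\nu}, \qquad |h_{\mu\nu}| \ll 1, \]

and discards terms of second and higher order in h_{\mu\nu}. The new simulations evolve the full nonlinear equations instead, so whatever physics lives in those discarded terms is retained.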

"By assuming less, we're seeing something new," Giblin said.

Bentivegna said that their preliminary applications of numerical relativity have shown how and by how much approximations miss the correct answers. More importantly, she said, "This will allow us to comprehend a larger class of observational effects that are likely to emerge as we do precision cosmology."

"There are indeed several aspects of large-scale structure formation (and their consequences on, for example, the cosmic microwave background) which call for a fully general relativistic approach," said Sabino Matarrese, professor of physics and astronomy at the University of Padua, who was not involved in the studies.

This approach will also bring accuracy and insight to such things as gravitational lensing maps and cross-correlations among different cosmological data sets, he added.

The European team found that perturbations reached a "turnaround point" and collapsed much earlier than predicted by approximate models. When the researchers compared their model to the commonly assumed homogeneous expansion of the universe, local deviations in an underdensity (a region with less than the average amount of matter) reached nearly 30 percent.
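For context, cosmologists quantify such regions with the density contrast relative to the cosmic average \bar{\rho}(t),

    \[ \delta(\mathbf{x}, t) = \frac{\rho(\mathbf{x}, t) - \bar{\rho}(t)}{\bar{\rho}(t)}, \]

so an underdensity has \delta < 0 and an overdensity \delta > 0. "Turnaround" is the moment an overdense region stops expanding along with the rest of the universe and begins to collapse under its own gravity.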

The U.S. team found that inhomogeneous matter generates local differences in the expansion rate of an evolving universe, deviating from the behavior of a widely used approximation to the behavior of space and time, called the Friedmann-Lemaître-Robertson-Walker metric.
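The FLRW metric assumes the universe is perfectly homogeneous and isotropic, so that a single scale factor a(t) describes the expansion everywhere:

    \[ ds^2 = -c^2 dt^2 + a^2(t) \left[ \frac{dr^2}{1 - k r^2} + r^2 \big( d\theta^2 + \sin^2\theta \, d\phi^2 \big) \right], \]

with k the spatial curvature. As a purely illustrative sketch (not CosmoGRaPH or the Einstein Toolkit), the homogeneous baseline such simulations are compared against can be obtained by integrating the Friedmann equation for a flat, matter-dominated toy universe:

    # Illustrative sketch only: the homogeneous FLRW expansion history
    # from the Friedmann equation H(a)^2 = H0^2 * Omega_m / a^3 for a
    # flat, matter-only toy universe. The new codes instead evolve the
    # full, inhomogeneous Einstein equations and measure local deviations
    # from this kind of background.
    import numpy as np
    from scipy.integrate import solve_ivp

    H0 = 2.27e-18        # Hubble constant in 1/s (roughly 70 km/s/Mpc)
    Omega_m = 1.0        # flat, matter-dominated toy model (assumption)

    def dadt(t, a):
        # Friedmann equation rewritten as da/dt = a * H(a)
        return a * H0 * np.sqrt(Omega_m / a**3)

    # evolve the scale factor from a = 0.01 over roughly 14 billion years
    sol = solve_ivp(dadt, (0.0, 4.4e17), [0.01], dense_output=True)

    times = np.linspace(1e16, 4.4e17, 5)          # sample times in seconds
    print(np.round(sol.sol(times)[0], 3))         # scale factor a(t)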

Stuart L. Shapiro, professor of physics and astronomy at the University of Illinois at Urbana-Champaign, is among the acknowledged leaders of solving Einstein's equations on the computer. "These works are important, not only for the new results that they report, but also for being forerunners in the application of numerical relativity to long-standing problems in cosmology," said Shapiro, who was not involved in the studies.

No longer restricted by the assumptions, researchers must abandon some traditional approaches, he continued, "and these papers begin to show us the way."

Bruni said galaxy surveys coming in the next decade will provide new high-precision measurements of cosmological parameters and that theoretical predictions must be equally precise and accurate.

"Numerical relativity simulations apply general relativity in full and aim precisely at this high level of accuracy," he said. "In the future they should become the new standard, or at least the benchmark for any work that makes simplifying assumptions."

Both teams are continuing to explore aspects of the universe using numerical relativity and enhancing their codes.

Bentivegna and Bruni used the Einstein Toolkit, which is open-source, to develop theirs. The U.S. team created CosmoGRaPH and will soon make the software open-source. Both codes will be available online for other researchers to use and improve.

Media Contact

Kevin Mayhood
kevin.mayhood@case.edu
216-534-7183

 @cwru

http://www.case.edu 

Kevin Mayhood | EurekAlert!
