Science and engineering are advancing rapidly, in part because of ever more powerful computer simulations, yet the most advanced supercomputers require programming skills that all too few U.S. researchers possess. At the same time, affordable computers and committed national programs outside the U.S. are eroding American competitiveness in a number of simulation-driven fields.
These are some of the key findings in the International Assessment of Research and Development in Simulation-Based Engineering and Science, released on Apr. 22, 2009, by the World Technology Evaluation Center (WTEC).
"The startling news was how quickly our assumptions have to change," said Phillip Westmoreland, program director for combustion, fire and plasma systems at the National Science Foundation (NSF) and one of the sponsors of the report. "Because computer chip speeds aren't increasing, hundreds and thousands of chips are being ganged together, each one with many processors. New ways of programming are necessary."
Like other WTEC studies, this study was led by a team of leading researchers from a range of simulation science and engineering disciplines and involved site visits to research facilities around the world.
The nearly 400-page, multi-agency report highlights several areas in which the U.S. still maintains a competitive edge, including the development of novel algorithms, but also highlights endeavors that are increasingly driven by efforts in Europe or Asia, such as the creation and simulation of new materials from first principles.
"Some of the new high-powered computers are as common as gaming computers, so key breakthroughs and leadership could come from anywhere in the world," added Westmoreland. "Last week's research-directions workshop brought together engineers and scientists from around the country, developing ideas that would keep the U.S. at the vanguard as we face these changes."
Sharon Glotzer of the University of Michigan chaired the panel of experts that executed the studies of Asian, European and U.S. simulation research activities. Peter Cummings of Vanderbilt University and Oak Ridge National Laboratory co-authored the report with Glotzer and seven other panelists, and the two co-chaired the Apr. 22-23, 2009, workshop that provided agencies initial guidance on strategic directions.
"Progress in simulation-based engineering and science holds great promise for the pervasive advancement of knowledge and understanding through discovery," said Clark Cooper, program director for materials and surface engineering at NSF and also a sponsor of the report. "We expect future developments to continue to enhance prediction and decision making in the presence of uncertainty."
The WTEC study was funded by the National Science Foundation, the Department of Defense, the National Aeronautics and Space Administration, the National Institutes of Health, the National Institute of Standards and Technology, and the Department of Energy.
Joshua A. Chamot | EurekAlert!