Can our computers continue to get smaller and more powerful?

14.08.2014

University of Michigan computer scientist reviews frontier technologies to determine fundamental limits of computer scaling

From their origins in the 1940s as sequestered, room-sized machines designed for military and scientific use, computers have made a rapid march into the mainstream, radically transforming industry, commerce, entertainment and governance while shrinking to become ubiquitous handheld portals to the world.


Advanced techniques such as "structured placement," shown here and developed by Markov's group, are currently being used to wring out optimizations in chip layout. Different circuit modules on an integrated circuit are shown in different colors. Algorithms for placement optimize both the locations and the shapes of modules; some nearby modules can be blended when this reduces the length of the connecting wires.

Credit: Jin Hu, Myung-Chul Kim, Igor L. Markov (University of Michigan)
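The caption above mentions placement algorithms that shorten the wires connecting circuit modules. Markov's structured-placement tools use far more sophisticated objectives, but a common textbook proxy for wire length is the half-perimeter wirelength (HPWL); the following minimal sketch, with made-up pin coordinates, illustrates the quantity such algorithms try to minimize:

```python
# Half-perimeter wirelength (HPWL): a standard proxy that placement
# tools minimize when arranging circuit modules on a chip. Each net
# connects several module pins; its HPWL is the half-perimeter of the
# smallest rectangle enclosing all of its pins.

def hpwl(pins):
    """pins: list of (x, y) coordinates of one net's pins."""
    xs = [x for x, _ in pins]
    ys = [y for _, y in pins]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def total_wirelength(nets):
    """Sum HPWL over all nets of the design."""
    return sum(hpwl(net) for net in nets)

# A placer searches for module positions that reduce this total.
nets = [[(0, 0), (3, 4)],          # two-pin net: HPWL = 3 + 4 = 7
        [(1, 1), (2, 5), (6, 2)]]  # three-pin net: HPWL = 5 + 4 = 9
print(total_wirelength(nets))      # 16
```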

This progress has been driven by the industry's ability to continually innovate techniques for packing increasing amounts of computational circuitry into smaller and denser microchips. But with miniature computer processors now containing millions of closely packed transistor components of near-atomic size, chip designers are facing both engineering and fundamental limits that have become barriers to the continued improvement of computer performance.

Have we reached the limits to computation?

In a review article in this week's issue of the journal Nature, Igor Markov of the University of Michigan reviews limiting factors in the development of computing systems to help determine what is achievable, identifying "loose" limits and viable opportunities for advancements through the use of emerging technologies. His research for this project was funded in part by the National Science Foundation (NSF).

"Just as the second law of thermodynamics was inspired by the discovery of heat engines during the industrial revolution, we are poised to identify fundamental laws that could enunciate the limits of computation in the present information age," says Sankar Basu, a program director in NSF's Computer and Information Science and Engineering Directorate. "Markov's paper revolves around this important intellectual question of our time and briefly touches upon most threads of scientific work leading up to it."

The article summarizes and examines limitations in the areas of manufacturing and engineering, design and validation, power and heat, time and space, as well as information and computational complexity.

"What are these limits, and are some of them negotiable? On which assumptions are they based? How can they be overcome?" asks Markov. "Given the wealth of knowledge about limits to computation and complicated relations between such limits, it is important to measure both dominant and emerging technologies against them."

Limits related to materials and manufacturing are immediately perceptible. In a material layer ten atoms thick, missing one atom due to imprecise manufacturing changes electrical parameters by ten percent or more. Shrinking designs of this scale further inevitably leads to quantum physics and associated limits.
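The figure quoted above follows from simple proportion: if an electrical parameter scales with layer thickness, losing one atom from an n-atom-thick layer perturbs it by roughly 1/n. A back-of-the-envelope sketch (illustrative only, not a device model):

```python
# If an electrical parameter scales with layer thickness, one missing
# atom out of n shifts that parameter by roughly 1/n.

def relative_variation(atoms_thick, atoms_missing=1):
    return atoms_missing / atoms_thick

for n in (100, 20, 10, 5):
    print(f"{n:>3} atoms thick: ~{relative_variation(n):.0%} variation")

# At 10 atoms, a single missing atom already means ~10% variation,
# matching the figure cited in the text; thinner layers fare worse.
```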

Limits related to engineering are dependent upon design decisions, technical abilities and the ability to validate designs. While very real, these limits are difficult to quantify. However, once the premises of a limit are understood, obstacles to improvement can potentially be eliminated. One such breakthrough has been in writing software to automatically find, diagnose and fix bugs in hardware designs.

Limits related to power and energy have been studied for many years, but only recently have chip designers found ways to reduce the energy consumption of processors by temporarily turning off parts of the chip. There are many other clever tricks for saving energy during computation. But moving forward, silicon chips will not maintain their pace of improvement without radical changes. Atomic physics suggests intriguing possibilities, but these are far beyond modern engineering capabilities.
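The savings from turning off idle parts of a chip (often called power gating) come from eliminating leakage in blocks that are powered but doing no work. A toy energy model, with entirely hypothetical power figures, shows the shape of the argument:

```python
# Toy model (illustrative numbers, not from the article) of why gating
# off idle chip blocks saves energy: a powered-but-idle block still
# leaks current, while a gated block draws almost nothing.

LEAKAGE_W = 0.5   # hypothetical leakage power of one block, watts
ACTIVE_W = 2.0    # hypothetical switching power when busy, watts

def energy_joules(busy_seconds, idle_seconds, power_gated):
    active = ACTIVE_W * busy_seconds
    idle = 0.0 if power_gated else LEAKAGE_W * idle_seconds
    return active + idle

# A block busy 1 s out of 10 s:
without_gating = energy_joules(1.0, 9.0, power_gated=False)  # 6.5 J
with_gating = energy_joules(1.0, 9.0, power_gated=True)      # 2.0 J
print(f"energy saved: {1 - with_gating / without_gating:.0%}")
```

The more of its time a block spends idle, the larger the fraction of total energy that gating recovers, which is why mostly-idle accelerator blocks benefit most.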

Limits relating to time and space can be felt in practice. The speed of light, while a very large number, limits how fast data can travel, and a signal traveling through copper wires and silicon transistors can no longer traverse a large chip in a single clock cycle. A formula limiting parallel computation in terms of device size, communication speed and the number of available dimensions has been known for more than 20 years, but it has become important only recently, now that transistors are faster than the interconnections between them. This is why alternatives to conventional wires are being developed; in the meantime, mathematical optimization can be used to reduce the length of wires by rearranging transistors and other components.
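The single-clock-cycle claim is easy to check with arithmetic. Dividing the speed of light by the clock frequency gives the absolute ceiling on how far a signal can travel per cycle; real on-chip wires are RC-limited and much slower. A short sketch (the 0.2c signal speed is an assumed illustrative figure, not from the article):

```python
# How far can a signal travel in one clock cycle? Even at the speed
# of light the distance shrinks as clocks get faster, and on-chip
# wires are slower still.

C = 299_792_458.0  # speed of light in vacuum, m/s

def distance_per_cycle_mm(clock_hz, signal_fraction_of_c=1.0):
    """Distance covered in one clock period, in millimetres."""
    return C * signal_fraction_of_c / clock_hz * 1000

# At 3 GHz, light itself covers only ~100 mm per cycle; a signal at
# an assumed 0.2c covers ~20 mm, comparable to a large die, before
# accounting for transistor switching delays along the way.
print(distance_per_cycle_mm(3e9))       # ~99.9 mm
print(distance_per_cycle_mm(3e9, 0.2))  # ~20.0 mm
```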

Several key limits related to information and computational complexity have been reached by modern computers. Some categories of computational tasks are conjectured to be so difficult to solve that no proposed technology, not even quantum computing, promises consistent advantage. But studying each task individually often helps reformulate it for more efficient computation.

When a specific limit is approached and obstructs progress, understanding the assumptions made is key to circumventing it. Chip scaling will continue for the next few years, but each step forward will meet serious obstacles, some too powerful to circumvent.

What about breakthrough technologies? New techniques and materials can be helpful in several ways and can potentially be "game changers" with respect to traditional limits. For example, carbon nanotube transistors provide greater drive strength and can potentially reduce delay, decrease energy consumption and shrink the footprint of an overall circuit. On the other hand, fundamental limits, sometimes not initially anticipated, tend to obstruct new and emerging technologies, so it is important to understand them before promising a new revolution in power, performance and other factors.

"Understanding these important limits," says Markov, "will help us to bet on the right new techniques and technologies."

-NSF-

Media Contacts
Steve Crang, University of Michigan, (734) 763-9996, scrang@umich.edu
Aaron Dubrow, NSF, (703) 292-4489, adubrow@nsf.gov

Principal Investigators
Igor Markov, University of Michigan, (734) 936-7829, imarkov@umich.edu

Related Websites
Igor Markov's research page: http://web.eecs.umich.edu/~imarkov/
Limits on fundamental limits to computation: http://1.usa.gov/1lVqbAF

The National Science Foundation (NSF) is an independent federal agency that supports fundamental research and education across all fields of science and engineering. In fiscal year (FY) 2014, its budget is $7.2 billion. NSF funds reach all 50 states through grants to nearly 2,000 colleges, universities and other institutions. Each year, NSF receives about 50,000 competitive requests for funding, and makes about 11,500 new funding awards. NSF also awards about $593 million in professional and service contracts yearly.

Aaron Dubrow | EurekAlert!

