While a typical personal computer today can perform a few billion calculations per second, computer scientists are laying the groundwork for exascale machines that will perform more than a million trillion – or 10^18 – calculations per second. Just a few months ago, scientists reached the long-sought high-performance computing milestone of one petaflop by performing more than a thousand trillion – or 10^15 – calculations per second.
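The gap between those two milestones is easy to underestimate. A short sketch makes the thousandfold difference concrete; the workload size below is an invented round number, not a figure from the article:

```python
PETAFLOP = 10**15  # calculations per second
EXAFLOP = 10**18

def runtime_seconds(total_calculations, speed):
    """Time a machine of the given speed needs for a fixed workload."""
    return total_calculations / speed

# A hypothetical workload of 10^21 calculations:
workload = 10**21
print(runtime_seconds(workload, PETAFLOP))  # 1,000,000 s (about 11.6 days)
print(runtime_seconds(workload, EXAFLOP))   # 1,000 s (under 17 minutes)
```

The same job that occupies a petaflop machine for days finishes on an exascale machine in minutes, which is why finer-resolution simulations become practical only at the larger scale.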
“The need for exascale-sized machines is well-established,” said Karsten Schwan, a professor in the School of Computer Science in the College of Computing at the Georgia Institute of Technology. “With exascale machines, weather simulations will be able to operate at finer resolution, biologists will be able to model more complex systems, and businesses will be able to run and manage many applications at the same time on a single large machine.”
Schwan recently received a 2008 HP Labs Innovation Research Award to work with HP Labs, HP’s central research arm, to help solve some of the key problems in developing exascale machines. The high-impact research award, one of only two granted for exascale research among 41 granted overall to professors around the world, encourages open collaboration with HP Labs. The award is renewable for a total of three years based on research progress and HP business requirements.
With the petaflop barrier broken, researchers like Schwan are focusing on the next goal – improving that processing power a thousandfold to reach the exascale. Schwan’s expertise in high-performance and enterprise computing will help him tackle some of the challenges surrounding exascale systems.
“We believe that machines will reach exascale size only by combining common chips – such as quad-core processors – with special-purpose chips – such as graphics accelerators,” said Schwan, who is also director of the Georgia Tech Center for Experimental Research in Computer Systems (CERCS).
A challenge that arises from this scenario is how to efficiently run programs on these heterogeneous many-core chips. To investigate possible methods for doing this, Schwan will team with Georgia Tech School of Electrical and Computer Engineering professor Sudhakar Yalamanchili, an expert in heterogeneous many-core platforms.
Exascale machines must also be able to run multiple systems and applications on a single platform at the same time, while guaranteeing that they will not interfere with each other. An approach called virtualization may help solve this challenge by hiding some of the underlying computer architecture issues from applications.
“With virtualization, decisions have to be made about where, when and for how long certain programs should run, but there are many ways of determining what might be appropriate because there might be multiple goals,” explained Schwan. “For instance, one might want to minimize the exascale machine’s power consumption while at the same time meet some performance goal for the application. In other words, virtualized systems must be actively ‘managed’ to attain end user, institutional or corporate goals.”
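Schwan’s point about balancing power against performance can be illustrated with a toy placement heuristic. Everything below – the node speeds, power draws, and the deadline – is an invented example, not a description of any system discussed in the article:

```python
def choose_node(job_flops, deadline_s, nodes):
    """Pick the node that meets the deadline with the least energy.

    A toy placement heuristic: among nodes fast enough to finish the
    job on time, prefer the one that spends the least energy doing so.
    Node speeds and power draws are illustrative assumptions.
    """
    feasible = [n for n in nodes if job_flops / n["flops"] <= deadline_s]
    if not feasible:
        return None  # no node can meet the performance goal
    # energy = power draw * runtime; minimizing it favors slower,
    # lower-power nodes whenever the deadline still permits them
    return min(feasible, key=lambda n: n["watts"] * job_flops / n["flops"])

nodes = [
    {"name": "fast", "flops": 2e12, "watts": 400.0},
    {"name": "slow", "flops": 1e12, "watts": 150.0},
]
print(choose_node(1e15, 2000, nodes)["name"])  # "slow": meets the deadline at lower energy
```

The heuristic captures the tension Schwan describes: the fast node finishes sooner, but the slow node still meets the performance goal while consuming less energy, so an actively managed system would choose it.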
Ada Gavrilovska, a specialist in virtualization and multi-core operation and research scientist in the School of Computer Science in the College of Computing, will collaborate with Schwan to determine how to manage multiple programs on exascale machines that consist of hundreds of thousands of processors.
Though exascale machines are high-performance computing systems, the vision for these future systems goes beyond the typical vision painted for high-performance computing. Rather than only scaling a single program to run on hundreds of thousands of cores, exascale systems will also run multiple programs on a single large machine.
“This future virtualized and managed exascale system will guarantee some level of service even when parts of the machine get too loaded or too hot or fail, since applications can be moved while they are running,” said Schwan.
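The migration behavior Schwan describes – moving running applications off nodes that overheat, overload, or fail – might be sketched as a simple planning loop. The thresholds, node fields, and load bookkeeping below are all illustrative assumptions, not details from the article:

```python
def plan_migrations(nodes, temp_limit=85.0, load_limit=0.9):
    """List (vm, source, target) moves that drain hot, overloaded, or failed nodes.

    A minimal sketch: every workload on an unhealthy node is assigned
    to the currently least-loaded healthy node.
    """
    def unhealthy(n):
        return n["temp_c"] >= temp_limit or n["load"] >= load_limit or n["failed"]

    targets = [n for n in nodes if not unhealthy(n)]
    moves = []
    for node in filter(unhealthy, nodes):
        for vm in node["vms"]:
            if not targets:
                return moves  # nowhere left to move workloads
            dest = min(targets, key=lambda n: n["load"])
            moves.append((vm, node["name"], dest["name"]))
            dest["load"] += 0.1  # rough bookkeeping so one target isn't overfilled
    return moves

cluster = [
    {"name": "hot", "temp_c": 92.0, "load": 0.5, "failed": False, "vms": ["sim1"]},
    {"name": "cool", "temp_c": 40.0, "load": 0.2, "failed": False, "vms": []},
]
print(plan_migrations(cluster))  # [('sim1', 'hot', 'cool')]
```

A real virtualized exascale system would face far harder versions of this problem – live migration cost, network topology, competing service guarantees – but the loop shows the basic shape of the management decision.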
Though it will be several years before exascale systems are developed, scientists at Georgia Tech will use the HP Labs Innovation Research Award to lay the foundation for solving emerging science and engineering challenges in national defense, energy assurance, advanced materials and climate.
“Around the world, HP partners with the best and the brightest in industry and academia to drive open innovation and set the agenda for breakthrough technologies that are designed to change the world,” said Prith Banerjee, senior vice president of research at HP and director of HP Labs. “HP Labs’ selection of Karsten Schwan for a 2008 Innovation Award demonstrates outstanding achievement and will help accelerate HP Labs’ global research agenda in pursuit of scientific breakthroughs.”
Abby Vogel | Newswise Science News