New Techniques Will Control Heat in Data Centers

04.06.2009
Approximately a third of the electricity consumed by large data centers doesn’t power the computer servers that conduct online transactions, serve Web pages or store information.

Instead, that electricity must be used for cooling the servers, a demand that continues to increase as computer processing power grows.

And the trend toward cloud computing will expand the need for both servers and cooling.

At the Georgia Institute of Technology, researchers are using a 1,100-square-foot simulated data center to optimize cooling strategies and develop new heat transfer models that can be used by the designers of future facilities and equipment. The goal is to reduce the portion of electricity used to cool data center equipment by as much as 15 percent.

“Computers convert electricity to heat as they operate,” said Yogendra Joshi, a professor in Georgia Tech’s Woodruff School of Mechanical Engineering. “As they switch on and off, transistors produce heat, and all of that heat must be ultimately transferred to the environment. If you are looking at a few computers, the heat produced is not that much. But data centers generate heat at the rate of tens of megawatts that must be removed.”

Summaries of the research have been published in the Journal of Electronic Packaging and International Journal of Heat and Mass Transfer and presented at the Second International Conference on Thermal Issues in Emerging Technologies, Theory and Applications. The research has been sponsored by the U.S. Office of Naval Research, and by the Consortium for Energy Efficient Thermal Management.

Five years ago, a typical refrigerator-sized server cabinet produced about one to five kilowatts of heat. Today, high-performance computing cabinets of about the same size produce as much as 28 kilowatts, and machines already planned for production will produce twice as much.

“Some people have called this the Moore’s Law of data centers,” observed Joshi, who is also the John M. McKenney and Warren D. Shiver Chair in the School of Mechanical Engineering. “The growth of cooling requirements parallels the growth of computing power, which roughly doubles every 18 months. That has brought the energy requirements of data centers into the forefront.”
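The doubling trend Joshi describes can be sketched as simple exponential growth. This is an illustrative sketch only; the function and its starting value are assumptions for the example, not measurements or a forecast:

```python
def projected_cabinet_heat(kw_now, years, doubling_months=18):
    """Project cabinet heat load assuming it doubles every
    `doubling_months` months, per the 'Moore's Law of data centers'
    trend described in the article."""
    return kw_now * 2 ** (years * 12 / doubling_months)
```

Starting from an assumed ~3 kW cabinet five years earlier, doubling every 18 months yields roughly 30 kW today, broadly in line with the 28 kW figure cited above.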

Most existing data centers rely on large air conditioning systems that pump cool air to server racks. Data centers have traditionally used raised floors to allow space for circulating air beneath the equipment, but cooling can also come from the ceilings. As cooling demands have increased, data center designers have developed complex systems of alternating cooling outlets and hot air returns throughout the facilities.

“How these are arranged is very important to how much cooling power will be required,” Joshi said. “There are ways to rearrange equipment within data centers to promote better air flow and greater energy efficiency, and we are exploring ways to improve those.”

Before long, data centers will likely have to use liquid cooling in place of chilled air for certain high-powered machines. That will introduce a new level of complexity and create differential cooling needs that must be accounted for in design and maintenance.

Joshi and his students have assembled a small high-power-density data center on the Georgia Tech campus that includes different types of cooling systems, partitions to change room volumes and both real and simulated server racks. They use fog generators and lasers to visualize air flow patterns, infrared sensors to quantify heat, airflow sensors to measure the output of fans and other systems, and sophisticated thermometers to measure temperatures on server motherboards.

Beyond studying the effects of alternate airflow patterns, they are also verifying that cooling systems are doing what they’re supposed to do.

Because tasks are dynamically assigned to specific machines, heat generation varies across a data center. Joshi’s group is also exploring algorithms that could even out the computing load by assigning new computationally intensive tasks to cooler machines, avoiding hot spots.
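A load-balancing policy of this kind can be sketched as a greedy "place the task on the coolest machine" rule. This is an illustrative sketch, not the group's actual algorithm; the function names and the per-task heating estimate are invented for the example:

```python
def pick_coolest(temps):
    """Return the id of the machine with the lowest reported temperature.

    temps: dict mapping machine id -> temperature in degrees Celsius.
    """
    return min(temps, key=temps.get)

def schedule(tasks, temps, heat_per_task=2.0):
    """Greedily place each task on the currently coolest machine,
    bumping that machine's temperature estimate after placement so
    successive tasks spread out instead of piling onto one server."""
    placement = {}
    temps = dict(temps)  # work on a copy; don't mutate caller's data
    for task in tasks:
        target = pick_coolest(temps)
        placement[task] = target
        temps[target] += heat_per_task  # crude local-heating estimate
    return placement
```

A real scheduler would use live sensor feedback rather than a fixed per-task increment, but the greedy structure is the same.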

Another issue they’re studying is what happens when utility-system power to a data center is cut off. The servers themselves continue to operate because they receive electricity from an uninterruptible power supply. But the cooling equipment is powered by backup generators, which can take minutes to get up to speed.

During the brief time without cooling, heat builds up in the servers. Existing computer models predict that temperatures will reach dangerous levels in a matter of seconds, but actual measurements done by Joshi’s graduate students show that the equipment can run for as much as six minutes without cooling.

“We’re developing models for different parts of the data center to learn how they respond to changes in temperature,” said Shawn Shields, a former graduate student in Joshi’s lab. “Existing models assume that temperature changes across a server rack are instantaneous, but we’ve found that it takes a relatively long time for the server to reach a steady state.”
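The slow approach to steady state that Shields describes is consistent with a simple lumped-capacitance model, in which temperature rises exponentially toward its steady-state value with a characteristic time constant. The numbers below (starting temperature, steady-state temperature, time constant) are illustrative assumptions, not measured values from the lab:

```python
import math

def rack_temperature(t, t0=25.0, t_ss=60.0, tau=180.0):
    """Lumped-capacitance estimate of rack air temperature (Celsius)
    t seconds after cooling is lost.

    t0   -- starting temperature
    t_ss -- steady-state temperature the rack would eventually reach
    tau  -- thermal time constant in seconds
    """
    return t_ss - (t_ss - t0) * math.exp(-t / tau)
```

With a time constant on the order of minutes rather than seconds, the temperature rise in the first ten seconds is only a few degrees, which matches the observation that equipment can survive minutes without cooling.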

Beyond reducing cooling load, the researchers are also looking at how waste heat from data centers can be used. The problem is that the heat is at relatively low temperatures, which makes it inefficient to convert to other forms of energy. Options may include heating nearby buildings or pre-heating water, Joshi said.
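The low-grade nature of the waste heat can be illustrated with the Carnot limit: even an ideal heat engine converts only a small fraction of heat to work when the temperature difference is small. The temperatures below are illustrative assumptions:

```python
def carnot_efficiency(t_hot_c, t_cold_c):
    """Thermodynamic upper bound on converting heat to work between
    two reservoirs; inputs in Celsius, converted to kelvin."""
    t_hot_k = t_hot_c + 273.15
    t_cold_k = t_cold_c + 273.15
    return 1.0 - t_cold_k / t_hot_k
```

Assuming server exhaust at 45 °C rejecting to 20 °C ambient, the Carnot limit is under 8 percent, and real conversion would be far lower, which is why direct uses such as building heating or pre-heating water are more attractive.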

Data obtained by the researchers with thermometers and airflow meters is being used to validate computer models that are reasonably accurate but run rapidly. In the future, these models will help data center operators optimize cooling in real time, he said.

Joshi believes there’s potential to reduce data center energy consumption by as much as 15 percent by adopting more efficient cooling techniques like those under development in his lab.

“Our data center laboratory is a complete sandbox in which we can study all sorts of options without affecting anybody’s computing projects,” he added. “We can look at interesting ways to improve rack-level cooling, liquid cooling and thermoelectric cooling.”

John Toon | Newswise Science News
Further information:
http://www.gatech.edu
