Experts in Building Environments™

University Computing Center Expands Capacity and Still Sees 50% Reduction in Heat Generated
Project at a Glance
Background:

The University of Warwick’s Centre for Scientific Computing (CSC) needed to expand its data center facilities due to increased high-performance computing use by Ph.D. students and post-doctoral researchers.

Issue:

The university needed to balance the need for expansion with heat control and limited floor space. It also wanted an energy-efficient cooling solution.

Solution:

The university used an award-winning cooling system developed by Nortek Air Solutions brand Eaton-Williams. The system consists of six Rear Door Heat Exchangers (RDHx), two Cooling Distribution Units (CDU), and two Computer Room Air Conditioning Units (CRAC).

Results:

After the cooling system upgrade, the data center saw a 50% reduction in heat generated.

Removing heat and saving energy are major environmental challenges facing data centers. So when the University of Warwick in Coventry, England, needed to expand its computing facilities, it called in Eaton-Williams Group Limited, a Nortek Air Solutions brand, to supply energy-efficient cooling. The result was a 50% reduction in heat generated.

The University’s Centre for Scientific Computing (CSC) was one of the first to benefit from our Rear Door Heat Exchangers. Ph.D. students and post-doctoral researchers were rapidly increasing their use of high-performance computing resources in the CSC, meaning that the department’s existing data center facilities needed to expand.

But the university needed to balance the need for expansion with the control of heat and limited floor space. Working closely with their consulting engineers, Couch Perry Wilkes Partnership, the University of Warwick decided to use rear door heat exchangers (RDHx) and cooling distribution units (CDU), which offer a more robust, compact, and energy-efficient solution than computer room air conditioning (CRAC) units.

An Award-Winning System

In fact, the cooling system used by the University of Warwick won the 2008 “Chill-Off” sponsored by the Silicon Valley Leadership Group (SVLG) for being the most energy-efficient data center cooling product on the market. The SVLG determined that this system used just 20% of the energy consumed by conventional close-control air-conditioning-unit-based systems.

Six RDHx units were installed, one on each of six cabinets housing 40 servers apiece, in conjunction with a downflow CRAC arrangement to remove up to 15kW of heat from each rack using a traditional chilled water system.
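As a rough check on those numbers, the rack count and the quoted per-rack heat removal imply the total cooling duty of the installation. The sketch below is purely illustrative arithmetic based on figures stated in this case study; the assumption that all six racks run at the full 15kW simultaneously is ours.

```python
# Illustrative arithmetic only: total cooling duty implied by the figures
# quoted in the case study (6 racks, up to 15 kW of heat removed per rack).
# Assumes (our assumption) that every rack runs at its full 15 kW load.

racks = 6
servers_per_rack = 40
heat_per_rack_kw = 15.0          # quoted maximum heat removal per rack

total_servers = racks * servers_per_rack
total_heat_kw = racks * heat_per_rack_kw
heat_per_server_w = heat_per_rack_kw * 1000 / servers_per_rack

print(f"Servers housed:          {total_servers}")            # 240
print(f"Total cooling duty:      {total_heat_kw:.0f} kW")     # 90 kW
print(f"Implied load per server: {heat_per_server_w:.0f} W")  # 375 W
```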

The Advantages of the Cooling System

Each RDHx offers condensation-free operation using controlled water from the CDUs. The RDHx high-performance specifications include refrigeration-grade coils pressure-tested to 45 bar, hermetically sealed copper-brazed construction, under-floor manifolds pressure-tested to 20 bar, leak detection, and leak-free, quick-release couplings and hose sets rated to 53 bar. The secondary circuit working pressure fed by the CDUs is less than 4 bar, giving an excellent safety margin.

The units remove heat at the hottest part of the servers (the back), where exhaust air can be in excess of 45°C (113°F), and reject it into the cooling coils in the rear door. The coils cool the exhaust air back down to nearly room temperature, approximately 20°C (68°F), reducing the heat released into the data center by as much as 50%.
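For readers who want to relate the quoted temperatures to the per-rack figure above, the sensible-heat relation Q = ṁ · c_p · ΔT links airflow and temperature drop across the rear door. The sketch below is a minimal back-of-envelope check; the per-rack airflow is our assumption, not a figure from the case study.

```python
# Back-of-envelope sensible-heat check for one rear door:
# Q = m_dot * c_p * dT, where m_dot is the air mass flow through the door.
# The airflow value is an ASSUMED, illustrative figure; the temperatures
# (45 C exhaust, ~20 C leaving air) are the ones quoted in the case study.

cp_air = 1005.0          # J/(kg*K), specific heat of air near room temperature
rho_air = 1.2            # kg/m^3, approximate air density
airflow_m3_per_h = 1800  # ASSUMED airflow through one rack door

m_dot = rho_air * airflow_m3_per_h / 3600   # air mass flow, kg/s
delta_t = 45.0 - 20.0                       # K, quoted temperature drop
q_watts = m_dot * cp_air * delta_t

print(f"Mass flow:    {m_dot:.2f} kg/s")
print(f"Heat removed: {q_watts/1000:.1f} kW per rack")   # ~15 kW
```

With that assumed airflow, the 25°C drop across the door corresponds to roughly the 15kW per-rack figure quoted earlier, which is why the two numbers are consistent.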

In an adjacent room, two CDUs and two CRAC units control the temperature of the water for the RDHx units. The RDHx requires no additional fans or electricity and is designed to cool without opening or removing the doors. Because the heat exchangers are in the back of the racks and in the door itself, the footprint of the racks and floor space is barely impacted.

A major benefit is that because the RDHx cools the air before it leaves the rack, there are no hot spots, and cooling the air at source is very energy efficient. 150kW of heat can be rejected to the primary chilled water system via the RDHx units and a CDU while consuming only 2.6kW of pump power. A CRAC system would typically consume around 10-15kW to do the same job, while using a much larger footprint.
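To put the 2.6kW pump-power figure in context, it can be expressed as a cooling overhead ratio (pump or fan power per kW of heat rejected) and compared with the quoted CRAC range. The input figures below are from the case study; the comparison itself is just an illustrative sketch of that arithmetic.

```python
# Cooling overhead comparison using the figures quoted in the case study:
# 150 kW of heat rejected with 2.6 kW of pump power (RDHx + CDU) versus a
# typical 10-15 kW of power for a comparable CRAC system.

heat_rejected_kw = 150.0
rdhx_power_kw = 2.6
crac_power_kw = (10.0, 15.0)   # quoted typical range

rdhx_overhead = rdhx_power_kw / heat_rejected_kw
crac_overhead = tuple(p / heat_rejected_kw for p in crac_power_kw)

print(f"RDHx/CDU overhead: {rdhx_overhead:.1%} of heat rejected")          # ~1.7%
print(f"CRAC overhead:     {crac_overhead[0]:.1%}-{crac_overhead[1]:.1%}")  # ~6.7-10%
print(f"Saving vs. mid-range CRAC: "
      f"{1 - rdhx_power_kw / (sum(crac_power_kw) / 2):.0%}")                # ~79%
```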

This solution has enabled the University of Warwick to introduce high-density equipment with zero thermal impact on its data center and has set a benchmark that the university plans to implement in its other data centers.