Why Data Center Cooling is Absolutely Essential in 2020

Jan 22 2020


Data centers use a ton of power. It’s no surprise that they do. Consider the amount of computing power a data center manages, all while fitting onto a single data floor. Then there’s the infrastructure required to cool and maintain the perfect operating environment for all the equipment in use.

In total, data centers consume nearly 3% of all the electricity used in the world. We can expect that their power use will continue to increase as additional energy-intensive facilities are built in the years to come.

Fully understanding the power and cooling characteristics of your infrastructure is important. It helps you assess potential future costs and computing needs, and it tells you what will be required to keep all of that equipment cool.
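
As a rough illustration of that kind of assessment, the sketch below estimates annual electricity consumption and cost from an assumed IT load, an assumed PUE (power usage effectiveness) ratio, and an assumed utility rate. Every number in it is a made-up example value, not data for any particular facility.

```python
# Rough annual energy-cost estimate for a hypothetical data center.
# All inputs are illustrative assumptions, not measurements.

it_load_kw = 500            # assumed average IT equipment load (kW)
pue = 1.6                   # assumed power usage effectiveness (total power / IT power)
rate_per_kwh = 0.10         # assumed electricity price (USD per kWh)
hours_per_year = 8760

total_load_kw = it_load_kw * pue                  # IT load plus cooling and overhead
annual_kwh = total_load_kw * hours_per_year
annual_cost = annual_kwh * rate_per_kwh

# The share attributable to cooling and other overhead is everything above the IT load.
overhead_kwh = (total_load_kw - it_load_kw) * hours_per_year

print(f"Total annual consumption: {annual_kwh:,.0f} kWh")
print(f"Annual electricity cost:  ${annual_cost:,.0f}")
print(f"Cooling/overhead share:   {overhead_kwh:,.0f} kWh "
      f"({overhead_kwh / annual_kwh:.0%} of total)")
```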

In this article, Nortek – Reznor, the leader in HVAC technology, will walk you through what you need to know about data center cooling, and why it’s absolutely essential in 2020 and beyond.

Cooling Technology

Power densities are rapidly increasing. Because of this, a lot of businesses have started to invest heavily in new cooling technologies that will allow them to harness the power of the next generation of computer processors.

Big tech companies such as Google have started leveraging the power of AI to help improve their cooling efficiency. Previously pie-in-the-sky solutions such as liquid-cooled server systems are beginning to see the light of day, offering highly innovative ways to cool the new generation of processors.

The recent application of predictive analytics is, by far, one of the most exciting innovations in infrastructure management. The data centers of today generate huge amounts of information about their cooling and power needs. Facilities that want to reach the highest efficiency levels now harness that usage data and its trends, allowing them to manage their cooling needs and power output proactively.

Modern data centers have figured out how to improve efficiency scores by anticipating when cooling and power needs will be at their highest, then cycling their servers down during periods of low traffic.
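
As a minimal sketch of the idea, the example below averages historical hourly load to flag the hours when extra cooling should be staged and the hours when servers can safely be cycled down. The readings are synthetic and the capacity threshold is an assumption standing in for real monitoring data.

```python
from collections import defaultdict

# Hypothetical hourly IT-load readings (kW) keyed by (day, hour); in practice these
# would come from the facility's monitoring system.
history = {
    (day, hour): 300 + 150 * (9 <= hour <= 18)   # synthetic: heavier load during business hours
    for day in range(30)
    for hour in range(24)
}

# Average load for each hour of the day across the history window.
hourly_totals = defaultdict(list)
for (day, hour), load_kw in history.items():
    hourly_totals[hour].append(load_kw)

hourly_avg = {hour: sum(vals) / len(vals) for hour, vals in hourly_totals.items()}

# Flag hours expected to exceed an (assumed) cooling-capacity planning threshold.
threshold_kw = 400
peak_hours = sorted(hour for hour, avg in hourly_avg.items() if avg > threshold_kw)

print("Hours to pre-stage extra cooling:", peak_hours)
print("Hours suitable for cycling servers down:",
      sorted(set(range(24)) - set(peak_hours)))
```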

Defining the Cooling Technologies

Due to the importance of your cooling infrastructure, it’s a good idea to take a look at commonly used and brand new cooling technologies available to you.

1. CVC (Calibrated Vectored Cooling)

This is a technology designed for high-density servers. The system works to optimize airflow through equipment, which allows your cooling system to manage heat more effectively.

2. Chilled Water Systems

This is most commonly used in mid- to large-scale data centers. It uses chilled water to cool down air that’s brought in by the CRAH. The water supply comes from a chiller plant that’s in a different area of the facility.

3. Cold Aisle/Hot Aisle

This uses alternating rows of cold aisles and hot aisles. Each cold aisle features a cold air intake on the front of the rack. The hot aisle consists of hot air exhausts on the back of the rack.

Any empty rack positions are filled (typically with blanking panels) to prevent cold air from being wasted and hot air from recirculating.

4. CRAC (Computer Room Air Conditioner)

This is a highly common feature of most data centers. A CRAC unit is similar to a standard air conditioner: it's powered by a compressor and draws warm air across a refrigerant-filled cooling coil.

In general, they’re quite inefficient with energy use, but the equipment is inexpensive.

5. CRAH (Computer Room Air Handler)

Your CRAH unit works as a component of your broader chilled water plant system (sometimes referred to as a chiller). Chilled water flows through a cooling coil inside the CRAH, while modulating fans draw warm air from the data hall across that coil and return the cooled air to the room.

Because the chiller plant supplying that water can take advantage of cooler outdoor conditions, CRAH-based systems tend to be considerably more efficient in colder climates.

6. Direct-to-Chip

This cooling method uses pipes that deliver coolant to a cold plate mounted directly on the motherboard's processor, where the heat is absorbed and carried away.

Because this system directly cools the processor, it’s a highly effective way to cool servers in 2020.

7. Evaporative Cooling

This cooling process manages temperatures by exposing hot air to water. As the water evaporates, it draws heat out of the air.

Water can be introduced in the form of wet material or through a misting system. The system is highly energy-efficient but requires a lot of water.
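
For a back-of-the-envelope sense of scale, water's latent heat of vaporization is roughly 2,400 kJ per kilogram at typical air temperatures, so every liter evaporated removes on the order of 2.4 MJ of heat. The sketch below turns an assumed evaporation rate into a cooling figure; the rate is purely illustrative.

```python
# Back-of-the-envelope cooling capacity of evaporation.
latent_heat_kj_per_kg = 2400       # approximate value; varies slightly with temperature
water_rate_liters_per_hour = 100   # assumed evaporation rate for the example

kg_per_hour = water_rate_liters_per_hour       # 1 liter of water is about 1 kg
heat_removed_kj_per_hour = kg_per_hour * latent_heat_kj_per_kg
cooling_kw = heat_removed_kj_per_hour / 3600   # kJ/h -> kW

print(f"Evaporating {water_rate_liters_per_hour} L/h removes roughly {cooling_kw:.0f} kW of heat")
```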

8. Free Cooling

This is the term for any cooling approach that brings cool outdoor air to your servers rather than continuously re-chilling the same indoor air.

Again, this works more efficiently in colder environments and can be a very energy-efficient solution in the right environment.
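
One simple way to gauge whether free cooling suits a site is to count how many hours per year the outdoor air sits below your supply-air setpoint. The sketch below does this with a synthetic temperature series and an assumed 18 °C setpoint; a real assessment would use local weather data.

```python
import math
import random

random.seed(0)

# Synthetic hourly outdoor temperatures (deg C) for one year, standing in for real weather data.
outdoor_temps = [
    10 + 12 * math.sin(2 * math.pi * hour / 8760) + random.uniform(-5, 5)
    for hour in range(8760)
]

supply_setpoint_c = 18   # assumed temperature below which outdoor air can be used directly

free_cooling_hours = sum(1 for t in outdoor_temps if t < supply_setpoint_c)
print(f"Free cooling available roughly {free_cooling_hours} h/year "
      f"({free_cooling_hours / 8760:.0%} of the time)")
```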

9. Immersion Systems

Immersion systems are highly innovative solutions that actually submerge computer hardware into non-flammable, non-conductive dielectric fluids.

10. Liquid Cooling Systems

This is a broad term for any cooling technology that uses a liquid to carry heat away, whether from the air or directly from the hardware.

11. Raised Floor Systems

Raised floor systems lift the floor of the data center off of your building's slab floor. The space between the data center floor and the slab is then used to route water cooling pipes and to improve airflow.

Liquid Cooling Technology and Data Centers

Air cooling technology has seen vast improvements over recent years. However, it’s still limited due to several fundamental problems.

Leaving aside the high cost of energy, an air conditioning system takes up a ton of space, introduces moisture into a sealed environment, and is prone to routine mechanical failures.

Until recently, most data centers had no other options but air cooling technologies to meet cooling demands. The new developments and technologies with liquid cooling allow data centers to experiment with methods that can better solve overheating issues.

Earlier versions of liquid cooling were:

  • Messy
  • Complex
  • Cost-prohibitive

However, the current generation is much more effective and efficient. Unlike air cooling systems, liquid cooling is scalable, more targeted, and cleaner. The most common designs are direct-to-chip cooling and immersion cooling.

The Energy Demands of AI

The biggest concern for present and future data centers is efficiency. The new generation of powerful processors that run machine learning, AI, and analytics workloads has huge energy demands and gives off massive amounts of heat.

From high-performance CPUs paired with GPU (graphics processing unit) accelerators to Google's custom-built TPUs (Tensor Processing Units), the muscle that powers AI is putting a big strain on power and cooling capacity in many data center environments.

As more organizations and businesses implement AI-as-a-service, demands will continue to increase. This is one big reason why data centers need to be cooled in 2020 and beyond.

Even Google wasn't prepared for its cooling needs when it implemented its third-generation TPU processors. The chips generated so much heat that the existing cooling system couldn't keep up, and Google had to implement direct-to-chip cooling to solve the problem.

Higher-Density Server Demands

Even if your data center isn’t facilitating AI and machine learning, your server rack storage densities are probably increasing at a rapid pace. Daily workloads are growing. Because of this, many providers have started looking at replacing less efficient cooling systems.

Many data centers are engaged in an “arms race” of sorts as they look to increase the density of their racks in order to provide more comprehensive, robust services to customers. As power demands continue to increase, inefficient air cooling technologies will no longer get the job done on their own, although air systems will still play a role, since they also provide clean air solutions.

Edge Computing

A liquid cooling system can provide similar cooling performance as an air system but may have a significantly smaller footprint. This helps make liquid systems a good solution for edge data centers, which are smaller and house more high-density hardware.

It's best to design edge data centers with liquid cooling systems from the ground up. This allows them to deliver a high level of computing power while confined to a small space.

Understand that liquid cooling techniques aren't going to fully replace air cooling systems anytime in the near future. This is partially due to the clean air solutions that air cooling systems supply. But liquid cooling is becoming a more attractive solution for many businesses.

6 Mistakes to Avoid in Data Center Cooling

Efficiency is highly critical when you’re looking to control cooling costs while delivering consistent uptimes. Here are six mistakes you need to avoid:

1. Poor Layout of the Cabinet

An effective layout uses a cold-aisle/hot-aisle design, with a CRAH at the end of each row. Avoid using an island configuration; it's inefficient.

2. Leaving Empty Cabinets

An empty cabinet skews your airflow. It allows hot exhaust air to seep back into the cold aisles. If you do have any empty cabinets, ensure that the cold air is fully contained.

3. Empty Spaces

Always avoid leaving empty or uncovered spaces between your hardware. Empty spaces ruin airflow management: if cabinet spaces are left unsealed, hot air leaks back into the cold aisles.

4. Raised Floor Leaks

These leaks happen when cold air under the raised floor escapes into adjacent spaces and hollow support columns. The escaping air causes pressure losses, which allow warm air, humidity, and dust (not filtered out by clean air solutions) to get into cold aisle environments.

To resolve these phantom leaks, you’ll need to perform an inspection of all support columns and perimeters, then seal all leaks.

5. Cable Opening Leaks

Cabinets and raised floors contain a surprising number of openings for cable management. If they're left unsealed, those holes allow cold air to escape, making your system less efficient.

6. Multiple CRAH Systems Fighting Humidity

What do you think happens when one CRAH is working to dehumidify air while another CRAH is trying to add humidity to the air simultaneously? The result is a ton of wasted energy, while the two CRAH units fight against each other.
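
The usual fix is to give every unit the same humidity setpoint and a shared deadband, so no unit humidifies while another dehumidifies. Here is a minimal sketch of that control logic, with an assumed setpoint and made-up readings:

```python
# Shared humidity setpoint and deadband so CRAH units never work against each other.
SETPOINT_RH = 45.0    # assumed target relative humidity (%)
DEADBAND_RH = 10.0    # no unit acts while RH stays within setpoint +/- 5%

def humidity_action(measured_rh: float) -> str:
    """Decide what a CRAH unit should do given the shared setpoint and deadband."""
    if measured_rh > SETPOINT_RH + DEADBAND_RH / 2:
        return "dehumidify"
    if measured_rh < SETPOINT_RH - DEADBAND_RH / 2:
        return "humidify"
    return "hold"   # inside the deadband: do nothing, avoid fighting other units

# Example readings from two units in the same room.
for unit, rh in [("CRAH-1", 52.0), ("CRAH-2", 47.0)]:
    print(unit, humidity_action(rh))
```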

The Keys to Optimizing Your Data Center Cooling System in 2020

Your first step toward optimizing your data center cooling system and clean air solutions is to understand how data center cooling has to work for your specific technical and business requirements.

Here are some best practices you can use to ensure optimal data center cooling and clean air solutions:

1. Containment Measures to Take

As you've probably experienced, hot air is very stubborn. It has an annoying tendency to move wherever and however it pleases. To combat this, you may need to install doors and walls that direct airflow, keeping your cold air in cold aisles and hot air in hot aisles.

This practice will allow you to run a higher rack density and reduce overall energy consumption.

2. Inspect and Seal Leaks in All Support Columns, Cable Openings and Perimeters

Water damage is a big problem for data centers. It’s the second leading cause of downtime and data loss, topped only by electrical fires. It’s important to note that water damage isn’t typically covered by insurance policies. As such, the vast majority of data centers can’t afford to overlook this potential threat.

It’s good to know that most water leaks can be detected and prevented with some caution and forethought. Essential tools such as zone controllers, humidity sensors, and fluid and chemical-sensing cables help locate leaks and potential leaks before they pose a problem.
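
A simple monitoring loop ties those tools together: poll every sensor on a schedule and raise an alert the moment any of them reports moisture. The sketch below is a generic illustration; the sensor names and the read_sensor() helper are hypothetical stand-ins for whatever interface your zone controllers and sensing cables actually expose.

```python
import time

# Hypothetical sensor inventory; the names and the read_sensor() helper are
# illustrative stand-ins for the real hardware interface.
SENSORS = ["chiller-pipe-zone-1", "raised-floor-zone-2", "crah-3-drain-pan"]

def read_sensor(sensor_id: str) -> bool:
    """Placeholder: return True if this sensor reports fluid or excess humidity."""
    return sensor_id == "raised-floor-zone-2"   # simulated reading for the example

def poll_for_leaks(cycles: int, poll_interval_s: float = 0.5) -> None:
    """Check every sensor on each cycle and flag anything that reports a leak."""
    for _ in range(cycles):
        for sensor_id in SENSORS:
            if read_sensor(sensor_id):
                # A real deployment would page facilities staff here.
                print(f"ALERT: possible leak at {sensor_id}")
        time.sleep(poll_interval_s)

poll_for_leaks(cycles=2)
```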

3. Humidity Control Point Synchronization

A lot of data center cooling systems use air-side economizers. This is also referred to as free air-side cooling.

While these improve efficiency by introducing outdoor air into your data center, they can also allow moisture to seep inside.

When you get excess moisture in the air, it can lead to condensation, which will short out and corrode your electrical systems.

To combat this, many businesses adjust climate controls. However, this can lead to other problems. When the air becomes overly dry, static electricity begins to build. This causes damage to vital equipment.

Therefore, it’s key that the data center’s humidity controls fully account for outside air moisture to help maintain an optimal server room environment.
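
One practical way to account for outside moisture is to compare the incoming air's dew point against the coldest surface temperature in the space; if the dew point is higher, economizer air will condense unless it's dehumidified first. The sketch below uses the Magnus approximation for dew point, with illustrative input values:

```python
import math

def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
    """Approximate dew point (deg C) using the Magnus formula."""
    a, b = 17.27, 237.7
    gamma = (a * temp_c) / (b + temp_c) + math.log(rel_humidity_pct / 100.0)
    return (b * gamma) / (a - gamma)

# Illustrative conditions: mild but humid outdoor air and an assumed cold-surface temperature.
outdoor_temp_c = 22.0
outdoor_rh_pct = 80.0
coldest_surface_c = 15.0   # e.g. supply ductwork or a chilled-water line

dp = dew_point_c(outdoor_temp_c, outdoor_rh_pct)
print(f"Outdoor dew point: {dp:.1f} C")
if dp >= coldest_surface_c:
    print("Economizer air must be dehumidified before use, or condensation will form.")
else:
    print("Outdoor air can be used directly without condensation risk.")
```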

Maximizing Your Data Center Cooling in 2020

Data center power demands will continue to increase in 2020 and beyond. To keep up, you'll need to implement new data center cooling technologies if you want to continue to operate at peak capacity.

As you can now clearly see, the question of data center cooling in 2020 isn’t “if you should,” but “how you will.”

Once you have determined the best plan of action for your data center cooling needs in 2020, contact Nortek Air Solutions, the leader in HVAC technology.