Author: Maciek Szadkowski Publish Time: 2019-12-04 Origin: https://www.linkedin.com/pulse/10-steps-implementing-data-centre-liquid-cooling-maciej-szadkowski/
Data Centre Direct Liquid Cooling systems are no longer a glimpse into the future; they are now a reality. This article presents the steps required to make the change from traditional HVAC systems. First published in the November 2018 issue of Digitalisation World magazine (https://digitalisationworld.com/). We will cover each point in detail later, but for now this is just a five-minute read.
There are many articles already praising liquid cooling technology, yet hardly any offer informed advice on how to implement it, or explain why data centre operators should start working on direct liquid cooling adoption now.
Implementing liquid cooling is a cost-effective and simple answer to energy cost, climate change, and regulatory challenges. In addition, most liquid cooling systems can be implemented without disrupting operations. We will advise how to choose an architecture that supports your specific heat-rejection scenario, what is required, and how to conduct a smooth transition from air cooling to liquid cooling. Let’s find out what must be done.
1. Change your mindset
None of us likes change. We like to keep things the same, we accept industry standards and preferences without a second thought, and we avoid unnecessary risks. Moving on from air cooling takes effort. Liquid cooling has evolved since IBM introduced it in the late 1960s. Current DLC / ILC vendors started around 2005 and have since produced thousands of cooling components. We can assume that liquid cooling systems are now both proven and mature. The biggest risk related to liquid cooling is that you might not evaluate it.
2. Check the facts and standards
The Uptime Institute 2018 Data Centre Survey Results showed that 14% of data centres have already implemented liquid cooling solutions. There will be a data centre near you that has implemented direct liquid cooling or is running a proof-of-concept implementation. In our experience, most cloud providers keep their liquid cooling systems secret as a competitive advantage.
In terms of standards, in 2011 ASHRAE introduced Thermal Guidelines for Liquid-Cooled Data-Processing Environments, followed by Liquid Cooling Guidelines for Datacom Equipment Centres, 2nd Edition, in 2014. Thermal Guidelines for Data Processing Environments, 3rd Edition, included insight into other considerations for liquid cooling covering condensation, operation, water-flow rate, pressure, velocity, and quality, as well as information on interface connections and infrastructure heat-rejection devices. Currently, LCL.GOV and OCP have started a liquid cooling standardisation effort for wider adoption. The 2011-2014 standards have evolved, and there are a couple of vendors that deliver a full portfolio of DLC / ILC systems.
3. Do your own research
The problem is that not many data centre infrastructure integrators know anything about liquid cooling. You will also not hear about liquid cooling from data centre HVAC people. It simply hurts their business.
You have to do your own research and actively look for available solutions. If you want to consider more expensive proprietary systems, Dell, Lenovo, Fujitsu, and other players like Huawei already offer direct-to-chip liquid cooling solutions (and, in Fujitsu's case, also an immersion system). These are tied to specific server models, mostly HPC platforms, which is an obvious constraint. However, there are already many direct-to-chip and immersion liquid cooling vendors with complementary solutions that can be applied to your servers or can accommodate most of the servers on the market.
4. Evaluate hot-coolant, direct liquid cooling systems only
There are established far-from-heat-source liquid cooling systems (CRAH, Overhead, InRow™, Enclosed Cabinet, Rear Door Heat Exchanger), but when heat is transferred directly from the source, the facility supply liquid temperature may be “warm water” (ASHRAE Class W4: 35 °C to 45 °C) or even “hot water” (W5: above 45 °C). With a delta T of 10 °C, the outlet temperature may exceed 55 °C, which facilitates heat re-use for building or community heating.
Don’t confuse chilled-water cooling or indirect LC with hot-fluid direct liquid cooling. The goal is to eliminate mechanical cooling entirely and run a liquid plus free-cooling combination for the whole year. We want to move away from chilled water and extract heat directly from the components, not from the air inside the rack or the data centre space.
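To get a feel for the numbers behind that delta T, the coolant flow a rack needs follows from the basic heat-balance relation Q = ṁ · cp · ΔT. The sketch below is illustrative only: the 50 kW rack load, water as the coolant, and the property values are assumptions for the example, not vendor figures.

```python
# Rough coolant flow estimate for a direct liquid cooled rack.
# Assumed values (illustrative, not from any vendor datasheet):
# water coolant, cp ~ 4186 J/(kg*K), density ~ 0.997 kg/L near 45 C.

def flow_rate_lpm(rack_power_w: float, delta_t_c: float,
                  cp_j_per_kg_k: float = 4186.0,
                  density_kg_per_l: float = 0.997) -> float:
    """Litres per minute of coolant needed to carry away rack_power_w
    with a coolant temperature rise of delta_t_c (Q = m_dot * cp * dT)."""
    mass_flow_kg_s = rack_power_w / (cp_j_per_kg_k * delta_t_c)
    return mass_flow_kg_s / density_kg_per_l * 60.0

# Example: a 50 kW rack with a 10 C rise (45 C supply -> 55 C return)
print(round(flow_rate_lpm(50_000, 10.0), 1))  # ~72 L/min
```

Even at 100 kW per rack the flow stays modest, which is why a single pair of supply/return pipes can replace large volumes of moving air.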
5. Assess and measure the benefits
The benefits of liquid cooling are widely known. But what do they mean in reality? Here are a few examples:
- Increase in rack power density (from 20 kW to 100 kW+)
- Lower data centre footprint, fewer server racks and interconnects
- 30-50% less energy use and cost
- 10-20% increase in computing power of liquid-cooled processors and GPUs
- Increased reliability of equipment
- Higher power density of processors
- Fewer pieces of critical equipment in the data hall space
- Simplified electrical and mechanical topology
- Faster go-to-market: reduced site and structural construction compared to a traditional build
- Reduction/elimination of fan vibrations
- Lower CAPEX and only a fraction of traditional data centre OPEX
- Decreased Total Cost of Ownership (TCO)
- Re-use of usually wasted heat
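The energy-saving claim above can be sanity-checked with a back-of-the-envelope PUE comparison. The PUE figures (1.6 for a typical air-cooled facility, 1.1 for hot-water DLC with year-round free cooling), the 1 MW IT load, and the electricity tariff below are assumptions chosen for the sketch, not measurements from the article.

```python
# Back-of-the-envelope annual energy-cost comparison: air cooling vs DLC.
# All inputs are illustrative assumptions, not measured data.

HOURS_PER_YEAR = 8760

def annual_energy_cost(it_load_kw: float, pue: float,
                       price_per_kwh: float = 0.12) -> float:
    """Annual facility energy cost: total draw = IT load * PUE."""
    return it_load_kw * pue * HOURS_PER_YEAR * price_per_kwh

air = annual_energy_cost(1000, pue=1.6)  # assumed air-cooled facility
dlc = annual_energy_cost(1000, pue=1.1)  # assumed hot-water DLC facility

savings_pct = (air - dlc) / air * 100    # ~31% from infrastructure alone
print(f"air: ${air:,.0f}/yr  dlc: ${dlc:,.0f}/yr  saving: {savings_pct:.1f}%")
```

Note this captures only the facility-overhead side; the article's 30-50% range also reflects server fan power removed from the IT load itself, which a PUE-only comparison does not show.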