CyrusOne Inc.

07/18/2024 | News release

Liquid Cooling in Hyperscale Data Centers: Innovation for the Future (Part 1 of 3)

Summary

  • AI, machine learning, and high-performance computing are creating cooling challenges for data center designers and operators.
  • With the rise in rack densities and temperatures, liquid cooling is gaining traction in the world of HPC data centers, especially for the needs of hyperscalers.
  • In response to these challenges, CyrusOne offers Intelliscale™, a state-of-the-art, AI workload-specific data center solution developed to address the rapidly growing needs of AI applications and services.

The advent of generative AI and machine learning (ML) has led to an unprecedented demand for high-performance computing (HPC), and its impact on the data center industry cannot be overstated. Processing the large datasets, complex algorithms, and real-time data analytics integral to AI and ML operations requires a more robust computing infrastructure than in the past. As these powerful systems run complex workloads, they generate significant heat, necessitating efficient cooling solutions. One such solution is liquid cooling, a technique that is gaining traction in the world of HPC data centers, especially for the needs of hyperscalers.

"For CyrusOne, I am confident that we are keeping pace and well prepared to support customer requirements for liquid cooling," said CyrusOne CEO Eric Schwartz in a recent interview with Data Center Frontier. "The industry as a whole is pursuing multiple approaches and designs for liquid cooling, and our Intelliscale design and capabilities and well-suited to support many different topologies at high levels of performance."

Network World recently reported that 22% of data centers are using liquid cooling, according to IDC analyst Sean Graham. The global data center liquid cooling market was estimated at $2 billion in 2022 and is expected to grow at a compound annual growth rate of 15% between 2023 and 2032, according to analysis by Global Market Insights.
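
As a rough illustration of what that growth rate implies, here is a minimal sketch in Python; it simply compounds the cited $2 billion 2022 estimate at the reported 15% CAGR, and the year range and smooth annual growth are assumptions, not a forecast.

    # Illustrative only: compounds the cited $2B (2022) market estimate at a 15% CAGR.
    # Assumes smooth annual growth through 2032; not a market forecast.
    base_year, base_size_usd_billions = 2022, 2.0
    cagr = 0.15

    for year in range(2023, 2033):
        projected = base_size_usd_billions * (1 + cagr) ** (year - base_year)
        print(f"{year}: ~${projected:.1f}B")

    # The final line printed is roughly "2032: ~$8.1B" under these assumptions.

Under those assumptions, the market would roughly quadruple over the decade.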

Liquid cooling uses a liquid coolant to absorb and dissipate heat from data center components. Unlike air, liquids can carry away heat far more efficiently because of their much higher heat capacity. This makes liquid cooling more effective and allows components to be packed more closely, reducing space requirements and enabling better performance from existing data center infrastructure.
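
To put the heat-capacity difference in concrete terms, the following back-of-the-envelope sketch compares how much heat a liter of water versus a liter of air can absorb for the same temperature rise. The property values are textbook approximations, and the 10 °C temperature rise is an illustrative assumption, not a design figure.

    # Back-of-the-envelope: heat absorbed per liter of coolant for the same temperature rise.
    # Property values are textbook approximations; the temperature rise is an assumption.

    # Volumetric heat capacity, J/(L*K) = specific heat J/(kg*K) * density kg/L
    water_cp_per_liter = 4186 * 1.0    # water: ~4186 J/(kg*K), ~1.0 kg/L
    air_cp_per_liter = 1005 * 0.0012   # air:   ~1005 J/(kg*K), ~1.2 g/L

    delta_t_kelvin = 10.0  # assumed coolant temperature rise across the IT load

    heat_water_joules = water_cp_per_liter * delta_t_kelvin  # ~42,000 J per liter
    heat_air_joules = air_cp_per_liter * delta_t_kelvin      # ~12 J per liter

    print(f"Water: ~{heat_water_joules/1000:.0f} kJ per liter")
    print(f"Air:   ~{heat_air_joules:.0f} J per liter")
    print(f"Ratio: ~{heat_water_joules/heat_air_joules:,.0f}x per unit volume")

On these rough numbers, a liter of water absorbs a few thousand times more heat than a liter of air for the same temperature rise, which is why liquid loops can serve much denser racks while moving far less coolant volume.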

There are four primary methods of liquid cooling utilized in the data center environment: immersion cooling, in-rack liquid cooling, cold-plate cooling, and direct-to-chip cooling.

  • Immersion Cooling: Liquid immersion cooling involves submerging IT components, such as servers or entire racks, in a dielectric (inert, non-conductive) fluid. This fluid efficiently absorbs and dissipates heat generated by the components, providing a highly effective cooling technique that makes it ideal for high-performance computing environments.
  • In-Rack Liquid Cooling: In-rack liquid cooling solutions bring the cooling directly to the rack level, ensuring targeted cooling of high-density deployments.
  • Cold-Plate Cooling: Cold-plate cooling involves attaching a cold plate directly to the component or processor. This method is particularly effective for cooling high-performance processors.
  • Direct-to-Chip Cooling: Direct-to-chip cooling involves circulating coolant directly to the heat-generating components. This method offers unparalleled efficiency and cooling effectiveness.

Liquid cooling offers several advantages over traditional methods. Because liquid coolants have a higher heat capacity than air, liquid cooling can remove heat efficiently, keeping systems at optimal temperatures. Liquid cooling systems are also more compact, allowing for better space utilization within the data center. There are maintenance and return-on-investment benefits as well, since maintaining optimal temperatures can improve the performance and lifespan of the equipment. Liquid cooling is also more energy-efficient, leading to potential energy savings and a reduced carbon footprint that can support ESG initiatives.

Despite its benefits, implementing liquid cooling comes with its own set of challenges, especially related to retrofitting the infrastructure of existing data centers. Liquid cooling requires significant changes in infrastructure design, which can be costly and time-consuming.

Implementing liquid cooling in a data center requires careful consideration of space, layout, and maintenance requirements. Each factor must be balanced to achieve the optimal cooling solution for the data center's needs. Beyond facility-level considerations, rack and server design and deployment can also be more intricate, and the quality of the implementation is critical.

At CyrusOne, we're committed to innovative cooling solutions. We've prioritized liquid cooling capabilities in all our new data center designs and, when cost-effective, in retrofitting older facilities. We are a leader in driving the innovation and design needed to overcome these challenges, particularly when it comes to cooling and the needs of our customers.

While traditional cooling methods will continue in the data center market, the transition to liquid cooling will be more rapid than previous infrastructure shifts because of its significant advantages and the industry's focus on environmental impact, energy efficiency, and sustainability. This makes liquid cooling, especially for applications like generative AI, the future of cooling environments. Demand will also drive innovation in the technology itself, reducing the cost and complexity of deploying liquid cooling solutions. The benefits of increased efficiency, space optimization, enhanced performance, and reduced environmental impact will continue to make liquid cooling a worthwhile investment.

As technology continues to advance, so too will our cooling solutions, and CyrusOne plans to lead the way in ensuring that our data centers can keep up with the ever-increasing demands of high-performance computing.