High density data center solutions for AI & HPC

AI and high-performance computing (HPC) demand more power per rack. As a result, the need for high-density data centers is growing. Yet many existing facilities were designed for lower loads. What if your environment is still low-density and a customer suddenly needs high-density? In this article, we explain what “high-density” really means, how cooling and power need to evolve, and how you can scale up an existing site without turning everything upside down. We close with a real-world example from our data center near Basel.

 

What do we mean by a high-density data center?

A high-density data center is one where power per rack sits well above the usual range. Most data halls run at about 3-12 kW per rack. AI and high-performance computing (HPC) often push this to tens of kilowatts, with training peaks approaching or exceeding 100 kW. This shift affects everything: power distribution, cooling, and daily operations.

In short:

  • Density shift: 3 to 12 kW per rack in conventional data halls; as high as 100 kW per rack for AI/HPC (see the sketch after this list).
  • More heat, more often: higher continuous thermal load plus peak events.
  • Implication: air-only cooling is frequently not enough.
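
To put those densities in perspective, here is a minimal Python sketch that converts per-rack figures into the heat load a single row produces. The ten-rack row and the per-rack values are illustrative assumptions, not measurements from any specific hall.

```python
# Illustrative: hall-level heat load at different rack densities.
# The 10-rack row and per-rack figures are assumptions, not site data.

RACKS_PER_ROW = 10

densities_kw = {
    "conventional, low end": 3,
    "conventional, high end": 12,
    "AI/HPC training rack": 100,
}

for label, kw_per_rack in densities_kw.items():
    row_load_kw = RACKS_PER_ROW * kw_per_rack
    # Virtually all electrical power drawn by IT gear ends up as heat
    # that the cooling system must remove.
    print(f"{label}: {kw_per_rack} kW/rack -> {row_load_kw} kW per {RACKS_PER_ROW}-rack row")
```

A single AI/HPC row can therefore produce as much heat as an entire conventional data hall, which is why cooling and power distribution have to be rethought together.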

 

Why AI and HPC change the cooling equation

As models and datasets grow, accelerators (GPUs/TPUs) draw more power and generate more heat for longer periods. To keep performance stable and hardware reliable, most organisations need to move beyond air-only to liquid-assisted cooling.

  • Higher thermal loads: AI/HPC racks run hotter than traditional servers.
  • Rising energy use: accelerator-rich nodes lift overall facility demand.
  • Sustained peaks: training and simulations mean prolonged high-intensity windows.

 

A practical path to high-density

A straightforward step for existing data halls is to install rear-door heat exchangers (RDHx). A heat exchanger at the back of the rack captures server exhaust and cools it with water before the air returns to the room. Because water can absorb roughly 3,000 times more heat than air per unit volume, you can achieve significant efficiency gains without redesigning the entire hall. As densities increase, RDHx can also be combined with direct-to-chip liquid cooling for the hottest components.
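
To see where that factor comes from, here is a minimal back-of-the-envelope sketch in Python. It uses standard textbook fluid properties; the 50 kW rack load and 10 °C coolant temperature rise are illustrative assumptions.

```python
# Back-of-the-envelope: coolant flow needed to remove one rack's heat
# with air versus water. Fluid properties are standard textbook values;
# the 50 kW load and 10 K temperature rise are illustrative assumptions.

RACK_LOAD_W = 50_000   # assumed rack heat load (50 kW)
DELTA_T_K = 10.0       # assumed coolant temperature rise across the rack

# Volumetric heat capacity = density * specific heat, in J/(m^3*K)
air_j_per_m3k = 1.2 * 1005      # ~1.2 kJ/(m^3*K) at room conditions
water_j_per_m3k = 998 * 4186    # ~4.2 MJ/(m^3*K)

# From Q = V_dot * c_vol * delta_T  ->  V_dot = Q / (c_vol * delta_T)
air_flow = RACK_LOAD_W / (air_j_per_m3k * DELTA_T_K)      # m^3/s
water_flow = RACK_LOAD_W / (water_j_per_m3k * DELTA_T_K)  # m^3/s

print(f"Air:   {air_flow:.2f} m^3/s  (~{air_flow * 3600:,.0f} m^3/h)")
print(f"Water: {water_flow * 1000:.2f} L/s  (~{water_flow * 60_000:.0f} L/min)")
print(f"Water carries ~{water_j_per_m3k / air_j_per_m3k:.0f}x more heat per unit volume")
```

Moving roughly 4 m³ of air per second through a single rack is impractical, while about 70 litres of water per minute is a modest pipe. That asymmetry is the entire case for water-side cooling at high densities.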

 

Designing for high-density: power, operations, compliance

  • Power delivery: plan for sustained high loads and peak events; revisit distribution, redundancy and protection (see the sketch after this list).
  • Operations: liquid-assisted cooling shifts routines from “keep water out of the data hall” to managing controlled water circuits at rack or component level. Rollout is typically phased and technology-dependent, so updated processes and staff training are essential.
  • Compliance & data residency: upgrades should maintain compliance with national and EU requirements while preserving the low latency that AI workloads need.
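
As a rough illustration of the power-delivery point above, the Python sketch below checks whether a single feeder sized for a conventional row could still carry a high-density row on its own under 2N redundancy. The feeder rating, rack count, loads and peak factor are all illustrative assumptions.

```python
# Rough feeder-capacity check for a row upgraded to high-density racks.
# Feeder rating, rack count, loads and peak factor are illustrative assumptions.

FEEDER_RATING_KW = 400        # assumed rating of one feed (A or B) to the row
RACKS = 8
SUSTAINED_KW_PER_RACK = 80    # assumed sustained AI training load per rack
PEAK_FACTOR = 1.2             # assumed short-term peak above sustained load

sustained_kw = RACKS * SUSTAINED_KW_PER_RACK
peak_kw = sustained_kw * PEAK_FACTOR

# With 2N (A+B) power feeds, each feeder must be able to carry the whole
# row by itself if the other side fails.
for label, load_kw in [("sustained", sustained_kw), ("peak", peak_kw)]:
    verdict = "within" if load_kw <= FEEDER_RATING_KW else "EXCEEDS"
    print(f"{label}: {load_kw:.0f} kW on one feeder {verdict} the "
          f"{FEEDER_RATING_KW} kW rating")
```

In this made-up example the upgraded row draws 640 kW sustained against a 400 kW feeder, so distribution, not floor space, is the first thing that breaks.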

NorthC and Legrand: high-density upgrade in Münchenstein

NorthC and Legrand turned our Münchenstein (Basel) site into a high-density data center for AI and HPC in six months.

  • Challenge: Low-density room, tight timeline
  • Solution: Rear-door heat exchangers (RDHx), phased rollout
  • Result: Lower cooling energy and a better PUE (a worked example follows below)
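
PUE (Power Usage Effectiveness) is total facility energy divided by IT energy, so cutting cooling energy lowers it directly. The short Python sketch below illustrates the mechanism; all figures are illustrative assumptions, not measurements from the Münchenstein site.

```python
# PUE = total facility energy / IT equipment energy (dimensionless, >= 1.0).
# All kWh figures below are illustrative assumptions, not site measurements.

IT_ENERGY_KWH = 1_000_000          # assumed annual IT load
OTHER_OVERHEAD_KWH = 100_000       # assumed lighting, UPS losses, etc.

scenarios = {
    "air-only cooling": 450_000,                 # assumed annual cooling energy
    "with rear-door heat exchangers": 250_000,   # assumed annual cooling energy
}

for label, cooling_kwh in scenarios.items():
    total_kwh = IT_ENERGY_KWH + cooling_kwh + OTHER_OVERHEAD_KWH
    print(f"{label}: PUE = {total_kwh / IT_ENERGY_KWH:.2f}")
```

With these assumed numbers, halving the cooling energy moves PUE from 1.55 to 1.35 while the IT load stays the same.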


Frequently asked questions about high-density computing in our data centers

High-density computing concentrates more power and heat per rack than conventional setups. This FAQ explains what it is, why it matters for AI and HPC, and how to plan cooling, power, and operations in new or existing facilities.

What is high-density computing?

High-density computing packs more compute into fewer racks by clustering GPUs and other accelerators. It increases thermal density and continuous load, changing how you plan power, cooling, and daily operations.

At what power per rack does a deployment count as high-density?

A common threshold is 15-20 kW per rack. For AI and high-performance computing, plan for about 100 kW per rack today. With liquid cooling, some deployments already exceed 100 kW per rack, and densities are expected to keep rising.

Why is demand for high-density computing growing?

AI and HPC workloads are growing rapidly and run on GPUs and other accelerators that draw more power. Larger models and longer training cycles increase heat and continuous load, making high-density designs necessary.

What does high-density mean for data center design and operations?

Cooling and power delivery become the primary constraints, and operations need to adapt. The old idea of keeping water out of the data hall no longer fits many environments. To prepare existing sites, use modular designs and phased retrofits with upgraded power distribution and liquid-assisted cooling.

Which cooling solutions work for high-density racks?

Rear-door heat exchangers (RDHx) provide indirect liquid cooling at the rack level and are a practical retrofit for existing data rooms. For very high densities, add direct-to-chip liquid cooling on the hottest components.

There are also immersion cooling solutions for even higher densities and specific use cases. Want to know more about cooling options for AI and high-performance computing? Read our blog on Immersion Cooling and Liquid Cooling.
