Data Centres

Data centres could rely less on chillers, but heat rejection remains essential

22 January 2026
Nvidia CEO Jensen Huang made comments about Rubin chips not requiring water chillers, sparking conversations over where the data centre cooling market could move next.

If chips are engineered so that they no longer require chillers, the data centre market could evolve to become more zero-water focused. However, heat rejection remains critical to the industry, given continued demand for AI technology.

Nvidia CEO Jensen Huang made comments at the start of the month suggesting that the company’s next-generation Rubin chips can be cooled using water at temperatures that do not require traditional water chillers. The remarks sparked concern in the data centre cooling market, and several cooling companies saw their share prices fall.

In the wake of such comments, Capacity spoke with several data centre executives about what a zero-water industry could look like and how cooling markets could evolve in 2026.

What zero-water data centres could look like

Reporting on Nvidia’s Rubin platform suggests racks can run with 45°C water, which could ultimately reduce reliance on traditional water chillers in many data centre designs. Indeed, Huang’s comments caused Trane Technologies’ and Johnson Controls’ shares to drop sharply.

Ozgur Duzgunoglu, design and engineering director at Telehouse, explained that, while these designs can lower energy use and simplify cooling, high-density compute still needs a reliable path to reject heat in peak conditions.

“Zero or near zero on site water use becomes a design outcome, using dry heat rejection, air economisers and sealed loops that avoid evaporation, with liquid cooling supporting dense AI,” he said. “Next comes wider adoption, more testing of liquid cooling options and more heat reuse, supported by transparent tracking of PUE and WUE.”
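The PUE and WUE metrics Duzgunoglu points to can be tracked with simple ratios. The sketch below uses hypothetical annual figures for illustration; they are not Telehouse data.

```python
# Illustrative PUE/WUE calculation (all figures are hypothetical examples).
# PUE = total facility energy / IT equipment energy (dimensionless, >= 1.0)
# WUE = annual site water use / IT equipment energy (litres per kWh)

def pue(total_facility_kwh: float, it_kwh: float) -> float:
    """Power Usage Effectiveness: 1.0 is the ideal lower bound."""
    return total_facility_kwh / it_kwh

def wue(water_litres: float, it_kwh: float) -> float:
    """Water Usage Effectiveness: 0.0 for a zero-water design."""
    return water_litres / it_kwh

IT_KWH = 10_000_000  # assumed annual IT energy for both designs

# A chiller-based site vs. a dry-cooled, sealed-loop site (example numbers).
print(f"Chiller site: PUE={pue(14_000_000, IT_KWH):.2f}, "
      f"WUE={wue(18_000_000, IT_KWH):.2f} L/kWh")
print(f"Dry-cooled:   PUE={pue(11_000_000, IT_KWH):.2f}, "
      f"WUE={wue(0, IT_KWH):.2f} L/kWh")
```

In this framing, "zero or near-zero on-site water use" shows up directly as a WUE approaching 0.0, while dry heat rejection and sealed loops keep the PUE penalty modest.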

The ‘hot-water cooling’ concept marks a notable shift, given that data centres currently use a significant amount of power for chiller systems that provide refrigerated coolant to the servers. Hot-water cooling of a server can be enabled by advanced cold plates, which reduce thermal resistance between the chip and the coolant, and by intelligent cooling distribution units (CDUs), which coordinate coolant flow with the power infrastructure – leading to greater efficiency and better reliability.
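The role of cold-plate thermal resistance can be sketched with the lumped steady-state relation T_chip ≈ T_coolant + R_th × P: the lower the resistance, the warmer the supply water can be while the silicon stays within its limit. The temperature limit, chip power and resistance values below are illustrative assumptions, not vendor specifications.

```python
# Why lower cold-plate thermal resistance permits warmer coolant.
# Lumped steady-state approximation: T_chip ≈ T_coolant + R_th * P
# All figures are illustrative assumptions, not vendor specifications.

def max_coolant_temp(t_chip_max_c: float, r_th_c_per_w: float, power_w: float) -> float:
    """Highest coolant supply temperature that keeps the die below its limit."""
    return t_chip_max_c - r_th_c_per_w * power_w

T_CHIP_MAX = 90.0  # assumed die temperature limit, degrees C
POWER = 1000.0     # assumed chip power, W

print(max_coolant_temp(T_CHIP_MAX, 0.060, POWER))  # conventional plate: 30.0 C supply
print(max_coolant_temp(T_CHIP_MAX, 0.040, POWER))  # advanced plate:     50.0 C supply
```

With these assumed numbers, shaving 0.02 °C/W off the plate resistance moves the allowable supply temperature from 30°C to 50°C – comfortably above the 45°C water reported for Rubin racks, and warm enough to reject to ambient air without refrigeration in many climates.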

“Hot-water cooling will reduce the energy needed for chiller systems greatly and enable reallocation of this energy to more AI systems, cooling distribution units (CDUs) and electrical infrastructure,” explained Dr. Peter de Bock, vice president, data center energy and cooling technology at Eaton. “This is an exciting time for innovation in data centre design and operation. This approach is estimated to have the potential for up to 33% more AI factory output per grid connection – substantially more efficient.”
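De Bock’s “up to 33% more AI factory output per grid connection” is consistent with a simple PUE argument: for a fixed grid feed, the IT power available is the grid power divided by the PUE, so removing chiller overhead frees capacity for compute. The PUE values below are assumptions chosen for illustration, not figures from Eaton or Nvidia.

```python
# How eliminating chiller energy could raise AI output per grid connection.
# IT power available = grid connection / PUE. The PUE values are assumed
# for illustration only; they are not figures from Eaton or Nvidia.

GRID_MW = 100.0       # fixed grid connection, MW
PUE_CHILLED = 1.40    # assumed: mechanical chillers plus distribution losses
PUE_HOT_WATER = 1.05  # assumed: hot-water cooling with dry heat rejection

it_chilled = GRID_MW / PUE_CHILLED
it_hot_water = GRID_MW / PUE_HOT_WATER
gain = (it_hot_water / it_chilled - 1) * 100

print(f"IT power, chilled:   {it_chilled:.1f} MW")
print(f"IT power, hot-water: {it_hot_water:.1f} MW")
print(f"Gain per grid connection: {gain:.0f}%")  # ~33% with these assumptions
```

The headline figure therefore depends entirely on the PUE spread assumed between the two designs; a smaller chiller overhead at the baseline would yield a smaller gain.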

Using fewer chillers could be a positive step in a more sustainable direction for power and water, particularly as the industry moves towards warm-water direct liquid cooling. However, it doesn’t mean ‘no cooling’ or ‘zero water’ altogether.

“Heat must still be rejected and the local climate will govern whether systems can operate year-round without evaporative backup,” said Bruno Berti, senior vice president of global product management at NTT Global Data Centers.

Aligning ‘economics with sustainability’

The AI boom has inevitably ushered in a new era of liquid cooling for data centres, reshaping power and cooling strategies across the industry.

Speaking with Capacity earlier in the week, Motivair by Schneider Electric CEO Rich Whitmore explained that the company’s solutions are designed to keep pace with chip and silicon evolution and deliver next-generation performance.

“Data centre success now hinges on delivering scalable, reliable, efficient infrastructure solutions that match the next generation of AI Factory deployments,” he said, telling Capacity exclusively that Motivair is working “in the background to help enable these chips and technology.”

He added: “The current conversation of power and cooling is absolutely critical to enable AI infrastructure worldwide. We’re meeting that moment with proven liquid cooling solutions that scale with our customers’ needs.”

With AI continuing to shape the industry, data centre companies could find new opportunities in advancing efficiency – not just at the chip, but also at the critical power and cooling infrastructure level.

“The most exciting part about leaps in efficiency improvements is that it aligns economics with sustainability. It means that AI factories can do more with the power they have,” de Bock added.

Berti said: “The trajectory is clear: higher-temperature liquid cooling and smarter heat rejection will keep pushing facilities toward low- or no-water operation where conditions allow.

“As an industry, we must replace rumour with rigour. It’s critical that we continue to educate communities about water use – what is actually used, what is re-used and how we work to protect shared resources.”

 

Some executives featured in this article have contributed to a data centre feature in an upcoming Capacity Magazine edition. The edition will celebrate the 25th anniversary of our MetroConnect USA event. Find out more information below.

 


Metro Connect USA 2026

23 February 2026

Metro Connect USA is the largest executive-level digital infrastructure event in the U.S. The only one of its kind, this 25-year-strong gathering is where decision makers come together to make deals happen.