

A Decade of Resilient, Future-Proof Cooling at Cambridge Powered by ColdLogik and Legrand
| Project at a Glance | |
| --- | --- |
| Location | West Cambridge, United Kingdom |
| End User | University of Cambridge |
| Application | High-Performance Computing (HPC), Research & Teaching IT Infrastructure |
| Legrand Solution | ColdLogik Rear Door Heat Exchangers (RDHx) with Direct Liquid Cooling (DLC) |
| Key Results | From 900kW to 1.8MW of IT load in ten years, achieved with seamless upgrades, higher rack densities, and best-in-class energy efficiency. |
Where Cambridge’s Journey Began
The West Cambridge Data Centre (WCDC) was built in 2013 to support the University's business operations, research, teaching, and learning.
From the outset, the University adopted a hybrid cooling strategy based on a 'warm' chilled water system and free cooling principles. This early commitment to energy efficiency significantly reduced power consumption and carbon emissions, while paving the way for future adaptability. More than a decade later, the site remains a testament to forward-thinking design and engineering, and the WCDC continues to evolve, with USystems playing a central role in upgrades that maintain performance, increase capacity, and prepare the facility for the future. At the heart of the system is ColdLogik from USystems: through a long-term, collaborative relationship, USystems has helped the University optimise its infrastructure, support evolving IT loads, and maintain best-in-class energy efficiency.

Cooling & Power
From the beginning, the University made a bold design decision to operate within the ASHRAE A2 temperature range, allowing room temperatures of up to 27°C. This enabled a shift away from traditional cooling approaches toward a more innovative, flexible model.
While multiple solutions were explored, including all-air indirect evaporative cooling, the final design used four hybrid dry coolers configured for N+1 resilience. These supported a chilled water system that delivers 100% free cooling while significantly reducing power consumption and emissions. To handle the demands of High Performance Computing (HPC), a rear-door heat exchanger (RDHx) solution from USystems was adopted following a successful trial at another university site. The ColdLogik RDHx system enabled cabinet densities to increase from 30kW to 44kW, and data hall capacity to rise from 900kW to 1.2MW, all while preserving cooling efficiency. Close collaboration with USystems allowed the University team to fine-tune system parameters and achieve peak operational performance.
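
As a rough illustration of the water-side arithmetic behind those density figures, the sketch below estimates the chilled-water flow a single rear door would need at the old and new rack densities. The 21°C supply temperature is quoted later in this case study; the 27°C return, the specific heat constant, and the helper function are generic simplifications for illustration, not USystems specifications.

```python
# Rough water-side sanity check for a rear-door heat exchanger (RDHx),
# using Q = m_dot * c_p * dT. The 21 C supply figure is quoted in the
# case study; the 27 C return and the helper itself are assumptions.

C_P_WATER = 4.186  # specific heat of water, kJ/(kg*K)

def rdhx_flow_lpm(rack_kw: float, supply_c: float, return_c: float) -> float:
    """Litres per minute of water needed to absorb rack_kw of heat."""
    m_dot = rack_kw / (C_P_WATER * (return_c - supply_c))  # kg/s, as kW = kJ/s
    return m_dot * 60  # ~1 litre per kg of water

for density in (30, 44):  # kW per cabinet, before and after the upgrade
    lpm = rdhx_flow_lpm(density, supply_c=21.0, return_c=27.0)
    print(f"{density} kW rack at 21->27 C: ~{lpm:.0f} L/min per door")
```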
Electrical Infrastructure
Power is supplied via 11kV feeds from UK Power Networks (UKPN), feeding two separate substations and a 3,150kVA transformer. Initial capacity is 2,200kVA, with headroom for 3,000kVA as availability from UKPN increases.
Backup power is provided by three 1,100kVA generator sets configured N+1, so only two are required to maintain operation, with fuel reserves for 72 hours of runtime. Uninterruptible power is delivered through three 1,000kVA modular UPS systems (N+1), each consisting of five 200kVA modules with intelligent controls; the UPS systems operate at 98% efficiency. Power is distributed through dual A and B feeds using a Starline busbar system for flexibility, and each rack is powered and metered by intelligent Raritan cabinet PDUs, ensuring both resilience and real-time monitoring.
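
The redundancy figures above can be sanity-checked in a few lines. This is a minimal sketch: the unit ratings are taken from the text, while the helper function is illustrative only and not site software.

```python
# Sanity-check of the N+1 arithmetic described above. Unit ratings come
# from the case study; the helper is a generic illustration.

def n_plus_1_capacity_kva(units: int, unit_kva: int) -> int:
    """Usable capacity with one unit always held in reserve (N+1)."""
    return (units - 1) * unit_kva

# Three 1,100 kVA generators, N+1: two must be able to carry the site.
print(n_plus_1_capacity_kva(3, 1_100))  # 2200, matching the 2,200 kVA intake

# Three 1,000 kVA UPS frames (N+1), each built from five 200 kVA modules.
assert 5 * 200 == 1_000                 # module maths within one frame
print(n_plus_1_capacity_kva(3, 1_000))  # 2000 kVA of protected power
```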
Upgrades & DLC Deployment
2020: DLC PILOT DEPLOYMENT
In late 2020, USystems partnered with Dell, CoolIT, and the University on a project to explore direct liquid cooling (DLC) within the existing HPC facility, home to over 100 ColdLogik-cooled racks at the time. The pilot introduced five rack-mounted 4U CDUs, each rated at 200kW and serving two server racks via internal manifolds. Each rack operated at roughly 60kW (approx. 40kW covered by DLC and 18–20kW by conventionally cooled IT loads). To accelerate deployment, the CDUs were supplied from the return water line of the existing ColdLogik infrastructure, delivering water at 27–30°C (5–7°C below the CDU's preferred supply temperature, but still within workable range). Each CDU included dual pumps, dual power supplies, and primary/secondary water circuits. To accommodate the CDUs, ColdLogik cooler flow control valves were adjusted to a 55% minimum opening, slightly overcooling the system but without adverse effects, as the adiabatic free cooling source supplies water at a steady 21°C. Over the following two years, the setup expanded with three more rack pairs added, demonstrating the approach's effectiveness and adaptability.
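
The per-rack heat split and CDU sizing described above reduce to simple arithmetic, restated in the sketch below using the figures from this section (60kW racks, roughly 40kW to liquid, 200kW CDUs each serving a rack pair). The function names are illustrative, not part of any ColdLogik or CoolIT tooling.

```python
# Heat budget for one pilot rack and the resulting CDU utilisation,
# restating the figures above; names are illustrative only.

def rack_heat_split(total_kw: float, dlc_kw: float) -> tuple[float, float]:
    """Return (liquid-cooled kW, air/RDHx-cooled kW) for one rack."""
    return dlc_kw, total_kw - dlc_kw

liquid, air = rack_heat_split(total_kw=60.0, dlc_kw=40.0)
print(f"per rack: {liquid} kW to DLC, {air} kW to the rear door")

# Each 200 kW CDU serves a pair of such racks, leaving ample headroom.
utilisation = (2 * liquid) / 200
print(f"CDU utilisation: {utilisation:.0%}")  # 40%
```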
2023: DAWN SUPERCOMPUTER AND SECONDARY FLUID NETWORK
In 2023, the University received funding to deploy a new Supercomputer, DAWN. The system required a fully redundant Secondary Fluid Network (SFN), designed with future expansion in mind. USystems supported the design and installation of the SFN, which included:
- Four large rack-mounted CDUs to serve the entire facility
- A fully plumbed, redundant secondary water loop
- Continuous operation throughout installation, with no downtime to core systems
The SFN went live in November 2023, powering 16 new high-density Dell DLC racks. Each rack now operates at approximately 80kW, with 60kW handled by DLC and the remaining 20kW managed by ColdLogik RDHx units. All previously installed DLC servers were transferred to the SFN, and extensive pipework and CDU filtration upgrades were completed—again, without interrupting live operations.
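
Summing the per-rack figures gives a sense of the load the SFN absorbed at go-live; the sketch below simply totals the numbers quoted in this section.

```python
# Go-live load on the SFN, totalling the per-rack figures above
# (16 racks at ~80 kW each: ~60 kW to DLC, ~20 kW to the RDHx units).

RACKS = 16
DLC_KW, RDHX_KW = 60, 20

print(f"DLC load:  {RACKS * DLC_KW} kW")   # 960 kW on the new fluid network
print(f"RDHx load: {RACKS * RDHX_KW} kW")  # 320 kW on the existing rear doors
print(f"Total IT:  {RACKS * (DLC_KW + RDHX_KW)} kW")  # 1,280 kW at go-live
```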
Conclusion: A Decade of Evolution
Ten years on, the West Cambridge Data Centre remains one of the UK’s most energy-conscious and future-ready facilities.
USystems’ ColdLogik solutions have consistently delivered the flexibility and performance needed to support both high-density IT growth and the adoption of emerging technologies like DLC. With upgrades underway to support up to 1.8MW of IT load, the next chapter is already in motion.
Together, these solutions demonstrate how a foundation built in 2013 remains relevant in 2023, and with Legrand we continue to go further for the future of HPC and AI-ready infrastructure.

