TalkTalk’s data centre cooling system consumes 87% less power



Facilities manager Mark Jacobs wanted to reduce the environmental impact of the company’s data centres across the board as well as curb energy costs. He needed a product that could achieve both objectives whilst maintaining the critical functionality of the facility.

The TalkTalk data centres had traditionally favoured DX (direct expansion) refrigeration-based cooling systems using CRAC (computer room air conditioning) units. These have a relatively low capital cost but were proving inefficient to run, and they do not incorporate the more advanced principle of free cooling. For these reasons TalkTalk felt a fresh look at their cooling systems was required.

Challenges & Actions

After assessing the most energy-efficient systems available, a new system incorporating free cooling was decided upon. This uses cold outside air to cool the data centre for as long as possible, only bringing in the refrigeration system when the outside air becomes too warm. This approach dramatically decreases both cost and carbon emissions, as the refrigeration system runs for much shorter periods.

Two options were considered. The first, a DX system with free cooling, would allow the system to switch off the power-hungry compressor and use a simple, efficient, air-cooled circuit to provide a reasonably good coefficient of performance (CoP) of 3. Last year in London, the ambient air temperature was below 14°C for approximately 70% of the time; in theory, this represents the proportion of time when free cooling can be used.
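The CoP figure translates directly into electrical demand, and the 70% figure into free-cooling hours. A minimal sketch of that arithmetic, using only the numbers quoted in the text (illustrative only, not EcoCooling or TalkTalk code):

```python
def electrical_input_kw(cooling_load_kw: float, cop: float) -> float:
    """Electrical power drawn to deliver a cooling load at a given CoP."""
    return cooling_load_kw / cop

# DX circuit at CoP 3 carrying an assumed 100 kW heat load:
print(f"{electrical_input_kw(100, 3):.1f} kW")   # -> 33.3 kW

# Hours per year below the 14 degC free-cooling threshold (70% of 8760 h):
print(f"{0.70 * 8760:.0f} h")                    # -> 6132 h
```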

However, because condenser units are usually placed in groups on rooftops or in outside plant areas, the only source of air into the inlet is often a mixture of ambient air with hot air from adjacent condensers.


Higher overall inlet temperatures result in a reduced free-cooling band, and a relatively small amount of warm-air recirculation can make a big difference: a 5°C rise would halve the time when free cooling could be used, dramatically reducing the performance achieved against specification. As a result, TalkTalk began looking at other solutions which avoided the possibility of recirculation and increased the amount of time free cooling could be utilised.

The second option was the EcoCooling CREC (Computer Room Evaporative Cooling) solution, which incorporates evaporative cooling into a ventilation system. Evaporative cooling is a very simple method of cooling air without using refrigerants.

A direct evaporative cooler, using wetted filter pads, cools outside air by bringing it into contact with water.

Water evaporates into the air stream and cools it. The amount of cooling depends upon the temperature and relative humidity of the incoming air. In the UK, the maximum theoretical air-off temperature is 22°C; in practice, the temperatures achieved using evaporative coolers in UK data centres can approach 24°C on the very warmest days. This allows full compliance with ASHRAE temperature standards.
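The amount of cooling available is bounded by the wet-bulb temperature of the incoming air. A rough illustration, using Stull’s (2011) empirical wet-bulb approximation and an assumed pad saturation effectiveness of 90% — both are illustrative assumptions, not EcoCooling’s design figures:

```python
import math

def wet_bulb_c(t_dry_c: float, rh_pct: float) -> float:
    """Stull's (2011) empirical wet-bulb approximation
    (valid roughly for 5-99% RH and -20 to 50 degC dry-bulb)."""
    return (t_dry_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
            + math.atan(t_dry_c + rh_pct)
            - math.atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
            - 4.686035)

def supply_air_c(t_dry_c: float, rh_pct: float, effectiveness: float = 0.9) -> float:
    """Direct evaporative cooler: the air-off temperature sits between
    dry-bulb and wet-bulb, set by the pad effectiveness (assumed 0.9)."""
    t_wet = wet_bulb_c(t_dry_c, rh_pct)
    return t_dry_c - effectiveness * (t_dry_c - t_wet)

# A mild UK day, 20 degC at 50% relative humidity:
print(round(wet_bulb_c(20, 50), 1))    # ~13.7 degC wet-bulb limit
print(round(supply_air_c(20, 50), 1))  # ~14.3 degC off the cooler
```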

Lessons & Results

For most of the time, the UK ambient air temperature is colder than that required in the data centre, so a simple ventilation system can maintain compliant conditions. This can remove the need for refrigeration for up to 95% of the time in the UK and similar climates. The EcoCooling CREC system uses electronically commutated (EC) axial fans, which offer some of the lowest-energy ventilation on the market. A 100kW data centre like TalkTalk’s in Ireland could be ventilated using EC fans consuming less than 5kW, meaning the ventilation system would add a maximum of only 0.05 to the PUE of the data centre. In addition, TalkTalk looked at minimising fan speeds, which greatly reduces energy use.
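The 0.05 figure follows directly from the definition of PUE (total facility power divided by IT power). A quick check, treating the fans as the only overhead — an assumption for illustration:

```python
def pue(it_kw: float, overhead_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT power."""
    return (it_kw + overhead_kw) / it_kw

# 100 kW of IT load, fans drawing at most 5 kW, no other overheads counted:
print(pue(100, 5))  # 1.05 -> the fans add at most 0.05 to PUE
```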

As a rule of thumb, a fan running at half speed uses only 12.5% of full-speed power. When data centres are partly populated, variable-speed EC fans can exploit this principle.
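That rule of thumb is the fan affinity (cube) law: absorbed power scales with the cube of fan speed. A one-line sketch:

```python
def fan_power_fraction(speed_fraction: float) -> float:
    """Fan affinity law: absorbed power scales with the cube of fan speed."""
    return speed_fraction ** 3

print(fan_power_fraction(0.5))  # 0.125 -> half speed uses 12.5% of full power
```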

The combination of a ventilation system using EC fans and evaporative cooling would provide TalkTalk with a cooling solution with a CoP of over 20, keeping the data centre within acceptable ASHRAE conditions for 98% of the time at a fraction of the energy cost. It also provided an energy-efficient way to run a partially populated data centre without the drawback of a high initial PUE.

Their test laboratory data centre at Irlam, near Manchester, was chosen for the first project, as the existing cooling system was both underperforming and consuming a lot of energy. Two EcoCooling CRECs were installed on the roof of the building as part of a fresh-air ventilation system to provide 100kW of cooling to the main server room. Each system has two coolers which bring air into the building and distribute it through vents directly into the centre of the server room (which has no containment). Highly efficient EC fans are used to supply and extract air from the building.

The existing DX system has been retained to provide back-up in the unlikely event it should be required. The new system has increased the cooling capacity and reduced the energy consumption from ~16kVA to ~2kVA (~87% power reduction).
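The quoted 87% follows from the before-and-after figures (it is a rounding of 87.5%):

```python
before_kva, after_kva = 16, 2  # approximate figures quoted in the case study
reduction = (before_kva - after_kva) / before_kva
print(f"{reduction:.1%}")  # 87.5%
```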


The graph above shows the instantaneous electricity use of the cooling system at Irlam; the reduction achieved when the CREC system was introduced is clearly visible.

Two of the three planned sites are now complete, and TalkTalk are in the process of finishing the installation at the third.


“We are extremely happy with the performance of our EcoCoolers, the system consumes 87% less power than the previous DX system did and our data hall is now cooler than it ever was before. We were expecting to have to utilise the secondary system (DX) occasionally, but the EcoCooling system has performed very well and the secondary system has remained off. We are now installing EcoCooling CRECs in 2 further data centres.” Mark Jacobs, Energy & Facilities Manager.

More info:

Come and meet EcoCooling at EMEX, being held at ExCeL London on 11–12 November 2015 – register for FREE now.

