Damien Spillane

15 October 2013 — Data centres can be made more energy efficient if they take advantage of cool outside air for cooling, argues Damien Spillane of Digital Realty Australia. The following is an extract from a white paper he authored.

According to a report by the Climate Group, the global information technology sector is responsible for two per cent of global greenhouse gas emissions. Given the widespread adoption of technology throughout society today, reliance on the data centres that support the IT operations of enterprises as well as the everyday demand of consumers around the world is only going to increase.

In fact, according to the results of studies commissioned by Digital Realty and independently conducted by Campos Research & Analysis in January 2013, demand for data centres is on the rise in North America, Europe and the Asia-Pacific region. Seventy-eight per cent of respondents in the Australian study, for example, expect to expand data centre space by the end of the year. The primary drivers of demand are consistent regardless of geography: big data, the cloud, data centre consolidation, mobile computing and, consequently, the need for greater energy efficiency.

Making data centres more efficient

A growing number of data centre owner/operators around the world are keen to limit their carbon footprints as much as possible by developing long-term strategies focused on sustainable operations and products. According to a report by Microsoft: “Both cloud computing and sustainability are emerging as transformative trends in business and society. Most consumers (whether they are aware of it or not) are already heavy users of cloud-enabled services, including email, social media, online gaming and mobile applications. The business community has begun to embrace cloud computing as a viable option to reduce costs and to improve IT and business agility.”

There have been efforts to improve the efficiency of the electrical and mechanical systems used in data centres. A failure within electrical distribution systems, for example, presents the greatest risk of downtime to the data centre infrastructure, so improvements in energy efficiency of electrical systems need to be carefully balanced against this risk. Designing and building a lower-resilience electrical system can improve efficiencies (ie, by reducing the redundant paths and components, and operating the electrical string at capacity), but also increase the risk of downtime.

The efficiency deficits typically experienced in electrical systems are not significant, with minimal losses from the distribution cabling and distribution boards. Electrical shortfalls are mainly attributed to transformers and uninterruptible power supply (UPS) systems. Newer UPS units, which take a more modular approach and employ internal bypass and instant switching, present a potential improvement in efficiency in this area.
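
To put the UPS gains in perspective (the figures here are hypothetical, for illustration only): at a steady 1 MW IT load, a UPS operating at 94 per cent efficiency draws about 1,064 kW, while one at 97 per cent draws about 1,031 kW. That difference of roughly 33 kW, running continuously, amounts to around 290 MWh a year before any cooling savings are counted.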

Clearly, energy consumption within data centres has received a high level of attention at both a commercial and corporate level. Of the remaining factors that can influence data centre efficiency and reduce [power usage effectiveness (PUE)], energy expended in cooling is the most significant contributor. According to a report by Google, the single “biggest opportunity for efficiency is through free cooling – greater than all other areas combined.” Traditionally, the internal temperatures at such facilities have been moderated using precision cooling topologies that, quite simply, expend a lot of energy. Hence, the focus has shifted profoundly toward the machines and systems charged with monitoring and maintaining the environments (temperatures and humidity levels) within data centres.
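
For reference, PUE is the ratio of a facility’s total energy consumption to the energy consumed by the IT equipment alone:

PUE = total facility energy / IT equipment energy

As an illustrative (hypothetical) example, a facility drawing 1.5 MW in total to support a 1.0 MW IT load operates at a PUE of 1.5. Because cooling sits entirely in the non-IT overhead, every kilowatt saved on cooling reduces PUE directly, which is why free cooling dominates the efficiency opportunity.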

Opening the window

Using free air to cool data centres was not a viable option prior to 2011, when the American Society of Heating, Refrigerating and Air-Conditioning Engineers Technical Committee 9.9 expanded the environmental range for data centres. ASHRAE TC 9.9’s overall goal is to offer guidance to data centre owner/operators on maintaining high reliability and operating their data centres in the most energy-efficient manner.

The internal temperature and humidity of data centres prior to ASHRAE TC 9.9 had been maintained within narrow windows (between 20 and 25 degrees Celsius, and relative humidity between 40 and 55 per cent), which meant that only precision cooling systems could be deployed in sealed computer rooms. Invariably, these systems involved direct expansion cooling with computer room air conditioning units that rejected the heat via compressor-driven refrigerant circuits, or chilled water computer room air handling units that rejected the heat via a chiller/cooling tower.

Data centres were cooled with direct-expansion compressor-driven circuits at all times, with little consideration given to using low ambient conditions to reduce energy consumption. The tight window of environmental control was driven by IT equipment requirements, whereby these conditions were necessary to maintain the reliability of the rack-mounted equipment.

The increase in the temperature and humidity windows, done with the consensus of the major data centre IT equipment manufacturers, was a milestone change in the approach of a generally risk-averse industry. Key attributes of the change include expanded options around the type of equipment deployed (classified A1 to A4), as well as the definition of the control point as the server inlet (ie, supply or cold aisle). This change has been an inflection point for the industry and, although some legacy facilities and conservative operators have been slower to adopt the new ASHRAE recommendations, there has been widespread acceptance of them. This, in turn, has enabled the deployment of new, more efficient cooling topologies (including free-air cooling) and, fundamentally, has helped lower the energy consumption and overall cost of data centre operations.
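
As a rough illustration of how those classes are used in practice (the ranges below are the commonly cited 2011 allowable dry-bulb windows; the authoritative values, including the dew-point-based humidity limits, should be taken from the ASHRAE guidelines themselves), a monitoring system might check server-inlet temperatures against the envelope for the deployed equipment class:

```python
# Illustrative sketch only: approximate allowable dry-bulb ranges (deg C) for
# server-inlet air under the 2011 ASHRAE TC 9.9 equipment classes. Humidity
# limits are dew-point based and omitted here for simplicity.
ALLOWABLE_DRY_BULB_C = {
    "A1": (15.0, 32.0),
    "A2": (10.0, 35.0),
    "A3": (5.0, 40.0),
    "A4": (5.0, 45.0),
}
RECOMMENDED_DRY_BULB_C = (18.0, 27.0)  # recommended envelope, all classes

def inlet_within_envelope(temp_c: float, equipment_class: str = "A2") -> bool:
    """Check a measured server-inlet (cold-aisle) temperature against the
    allowable envelope for the given equipment class."""
    low, high = ALLOWABLE_DRY_BULB_C[equipment_class]
    return low <= temp_c <= high

print(inlet_within_envelope(33.0, "A1"))  # False: above the A1 ceiling
print(inlet_within_envelope(33.0, "A3"))  # True: within the A3 envelope
```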

The relaxation of humidity control and the expansion of the overall control window have had a very specific impact on the use of free-air cooling systems. Maintaining a narrow humidity band in a data centre, as required under traditional control regimes, meant that a fully sealed data centre shell was required. With the wider humidity window, data centres can now be coupled directly with outside air – albeit heavily filtered outside air.

According to the ASHRAE report: “For a majority of US and European cities, and even some Asian cities, it is possible to build data centres that rely almost entirely on the local climate for their cooling needs. The use of air-side economisation (and water-side economisation with a cooling tower) versus dry-cooler type water-side economisation also increases the number of available locales for chiller-less facilities.”

Deploying and using free-air cooling

When looking to deploy free-air cooling in a data centre, the key starting point is a comprehensive climate analysis of the proposed site to ensure the environment is suitable. Initially, the prevailing weather conditions are assessed at a macro level to determine whether the locale’s temperature and humidity are conducive to free-air cooling. This is a multifaceted analysis that looks at long-term averages and annual hourly conditions from a statistical psychrometric perspective.
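
A minimal sketch of that kind of hourly screening follows; the temperature and humidity ceilings, the CSV layout and the file name are hypothetical illustrations rather than Digital Realty’s actual criteria:

```python
# Screen a typical-year hourly weather file (one row per hour with dry-bulb
# temperature and relative humidity) for hours when outside air alone could
# cool the data hall. Thresholds and column names are assumptions.
import csv

MAX_DRY_BULB_C = 27.0   # assumed supply-air ceiling
MAX_RH_PERCENT = 80.0   # assumed humidity ceiling for direct outside air

def free_cooling_fraction(path: str) -> float:
    """Return the fraction of hours in the file that fall inside the
    assumed free-air cooling envelope."""
    usable = total = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            total += 1
            temp = float(row["dry_bulb_c"])
            rh = float(row["relative_humidity_pct"])
            if temp <= MAX_DRY_BULB_C and rh <= MAX_RH_PERCENT:
                usable += 1
    return usable / total if total else 0.0

# e.g. print(f"{free_cooling_fraction('sydney_tmy.csv'):.0%} of hours usable")
```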

For example, in locations such as Singapore and Hong Kong, humidity levels (with extended periods at or near saturation) are such that a direct free-air cooling system would be impractical due to the high dehumidification load, whereas in drier climates with similar temperature conditions (such as Dallas, Texas), free-air cooling systems can be and already have been successfully deployed.

The long-term, commercial availability of water is also a consideration when choosing a free-air cooling strategy, particularly when looking at adiabatic (ie, evaporative) cooling. Australia is presently categorised as a high-risk area for water scarcity, as the country has limited water resources as a result of low rainfall, global warming and environmental stress.

“Drought, climate change and water scarcity make water reform and improved water management more necessary than ever… it is in the interests of all Australians that we make the most of our precious water resources and plan for sustainable water use.”

Based on a life cycle analysis, using water for heat rejection in Australia proved not to be a viable solution when considered against the already-significant advantages of free-air cooling.

Next, a site’s microclimate is assessed, which can be significantly impacted by proximity to the sea, other large bodies of water or undulating terrain. When Digital Realty was assessing site viability in Australia, for instance, the company undertook detailed microclimate analysis of potential locales in Sydney and Melbourne. This helped ensure the success of the free-air cooling deployment and ruled out any site-proximity risks. The facilities were located inland, as sites closer to the ocean can be subject to more humid conditions, reducing the number of free-air cooling hours available.

Using outside air to cool data centres is a relatively straightforward process, but it is something that needs to be planned from the outset and designed into the master plan of a facility. This is because the cooling medium is air delivered solely by large air-handling units with integrated economisers, which must sit in close physical proximity to the data centre space. These units can be mounted on top of or alongside the data halls and, although their overall spatial requirement is lower than that of legacy cooling systems, the proximity requirement remains. Because of these constraints, retrofitting free-air cooling technology to a legacy site can be prohibitively expensive and complicated.

There are a number of mechanisms for delivering free-air cooling to data centres, including economisers, indirect free air, thermal wheels and plate-heat exchangers. Each mechanism has different characteristics, but all are based on the principle of using air as the sole cooling medium. Following analysis of all options, the direct system presents a compelling case due to its efficiency gains (ie, there are no heat-exchange losses through heat exchangers or thermal wheels, and no additional fan pressure drops) and its simplicity (ie, fewer moving parts means there is less that can go wrong).

Rooftop air-handling units are equipped with standard mechanical components to enable the delivery of cooled air, which greatly simplifies the operating principles of the units. Air is supplied to the data hall via the supply-air fan and returned via the extract fans. The units use ambient air whenever they can, and the economiser cycle operates in the most efficient mode possible, ensuring an optimum PUE.
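
A simplified sketch of that economiser decision follows; the set points and mode names are hypothetical and stand in for what would, in practice, be a far richer control strategy:

```python
# Hypothetical economiser mode selection for a rooftop air-handling unit,
# compared against outside-air conditions each control cycle.
SUPPLY_TARGET_C = 24.0   # assumed supply-air target
MAX_OUTSIDE_RH = 80.0    # assumed humidity ceiling for direct outside air

def select_mode(outside_temp_c: float, outside_rh_pct: float,
                return_temp_c: float) -> str:
    if outside_rh_pct > MAX_OUTSIDE_RH:
        return "mechanical"            # too humid for direct outside air
    if outside_temp_c <= SUPPLY_TARGET_C:
        return "free_cooling"          # outside air alone meets the target
    if outside_temp_c < return_temp_c:
        return "partial_economiser"    # outside air helps; mechanical trims
    return "mechanical"                # recirculate and cool mechanically

print(select_mode(18.0, 55.0, 32.0))   # 'free_cooling'
print(select_mode(30.0, 50.0, 34.0))   # 'partial_economiser'
```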

Key items to focus on when deploying this technology are:

  • Efficiency of the fans: Given that the fans are constantly running and essentially replace the direct-expansion (DX) cycle as the operational base load, ensuring they operate efficiently across all ranges and use variable-flow technology is vital.
  • Comprehensive filtration: With this technology, designing and maintaining the filtration on the air-handling units is vital. ASHRAE and other professional bodies publish minimum recommendations that should, at the very least, be met and ideally exceeded.
  • Air-flow effectiveness: It is important to ensure that the air flow from the roof-mounted units is delivered to the white space with as little pressure drop as possible, but also in a way that enables the flexibility to deploy high-density racks where required.
  • Monitoring and sensing: A comprehensive system of decentralised controls, unit cycling, multiple redundant sensing locations (including weather stations) and air-quality analysis ensures that the system is autonomous and can operate at maximum efficiency in all conditions (see the sketch after this list).
  • Alignment of service-level agreement (SLA) conditions and the system operation: This is a fundamental requirement in the successful utilisation of free-air cooling and one that is often inadequately resolved.
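
As a sketch of the redundant-sensing idea mentioned above (the fault threshold is a hypothetical value, not a Digital Realty set point), taking the median of several sensor readings stops a single failed or drifting sensor from pushing the plant into the wrong mode:

```python
# Consolidate redundant temperature sensors via median voting; flag
# implausible disagreement for maintenance attention.
from statistics import median

FAULT_SPREAD_C = 5.0  # assumed max plausible disagreement between sensors

def consolidated_reading(readings: list[float]) -> float:
    """Return a trusted temperature from redundant sensors."""
    if max(readings) - min(readings) > FAULT_SPREAD_C:
        print("warning: sensor disagreement exceeds threshold")
    return median(readings)

print(consolidated_reading([24.1, 24.3, 31.8]))  # median 24.3 masks the outlier
```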

Damien Spillane is the head of sales engineering for Australia at Digital Realty

Read the full white paper.
