Reducing Electrical Consumption in Existing and New Data Centres
Growing IT energy consumption is attracting increasing attention from bill payers across all industries. At a political level, data centre efficiency has been brought to the fore as one of the five main focuses of the new Smart Economy. In addition, Sustainable Energy Ireland has established data centre and ICT energy efficiency working groups, which are currently assisting both the public and private sectors to tackle the problem of increased power usage by IT departments.
Alignment of Management Goals
In most organisations energy efficiency has become an important topic, but until now the approach has tended to be company-wide, with no particular focus on the IT department. This stems from a reluctance to interfere in the workings of the IT department, which may affect the performance and reliability of the service it offers. In simple terms, the goals of the department creating the IT power bill and those paying it are not aligned. In the IT world energy efficiency will never trump reliability and performance; however, it has become the next most important factor when choosing a product. This message is being heard by manufacturers, who are now producing processors and servers that do more with less power and can operate at higher temperatures. This technology now offers a real opportunity to reduce the IT power bill.
Fig 1. Data Centre energy reduction methodology – Start at the core
Step 1: Reduce Core IT Power Usage
Energy use in a data centre may be broken down into core IT power usage and ancillary power usage. Core IT power usage is the electrical energy needed to keep the IT equipment running. Ancillary power usage is the energy needed to operate the cooling, lighting, monitoring, etc.; it is secondary to the core IT process but, as in the case of cooling, is proportional to it. Hence, if we reduce the core IT power usage we also reduce the ancillary power demand, which is the goal of our first step. Measures that reduce the core IT power usage are listed below.
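As a rough illustration (not taken from any measured site), the relationship between core and ancillary power can be sketched as follows; the cooling ratio and fixed overhead figures are illustrative assumptions only.

```python
# Sketch: a cut in core IT power also cuts ancillary power when cooling
# scales with the IT load. The 0.5 W of cooling per IT watt and the fixed
# 10 kW overhead (lighting, monitoring) are illustrative assumptions.

def total_power(it_kw, cooling_ratio=0.5, fixed_ancillary_kw=10.0):
    """Total draw = IT load + proportional cooling + fixed overhead."""
    return it_kw + it_kw * cooling_ratio + fixed_ancillary_kw

before = total_power(200.0)   # 200 kW of IT load -> 310 kW total
after = total_power(150.0)    # 50 kW saved at the IT level -> 235 kW total
saving = before - after       # 75 kW: the IT saving plus the avoided cooling
```

The point of the sketch is that every kilowatt removed from the IT load removes more than a kilowatt from the total bill, because the proportional cooling load falls with it.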
Opportunities | Description | Potential energy saving |
Virtualisation | Physical to virtual servers | 50% |
Hardware refresh | Use servers with a high SPECpower ratio. | 20% |
Server power management | Power-down of idle servers. | 10% |
Storage power management | Spin-down, de-duplication, etc. | 10% |
Table 1: Opportunities for reduction in IT power usage
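A caution worth noting when reading Table 1: the percentages compound rather than add, since each measure acts on the load left after the previous one. A short sketch (the figures are simply the table's own entries):

```python
# Sketch: Table 1 savings applied in sequence compound multiplicatively,
# so the combined reduction is less than the simple sum of percentages.

savings = {
    "virtualisation": 0.50,
    "hardware refresh": 0.20,
    "server power management": 0.10,
    "storage power management": 0.10,
}

remaining = 1.0
for fraction in savings.values():
    remaining *= 1.0 - fraction   # each measure acts on what is left

combined_saving = 1.0 - remaining  # about 68%, not the 90% a simple sum suggests
```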
Note: Sustainable Energy Ireland, working with Byrne Dixon Associates, has developed minimum criteria for defining efficient server and storage equipment.
Step 2: Control of Airflow in the Data Centre
Following a reduction in the core IT power usage, further savings can be made by reducing the power used to cool the IT equipment. In many cases cooling units can be powered off or returned to idle mode. We can achieve these savings by altering the room layout so that our cooling equipment cools the IT equipment and not the IT room. Through the establishment of hot and cold aisles and the reduction of bypass and recirculation air, we ensure that our cooling units deliver cold air to the IT equipment only and that the exhaust air returns to the cooling units as hot as possible, resulting in an improved Coefficient of Performance (COP). This aim is further enhanced by containing the hot and cold aisles using a proprietary or plastic-curtain containment system. Additional savings are achieved by reducing air loss through the floor or through the racks. The use of cable arms at the rear of racks increases the air resistance across the server, with a resultant increase in fan power; the use of Velcro cable ties instead allows for ease of server movement in and out of the rack. Table 2 shows a list of measures aimed at controlling airflow in the data centre.
Opportunities | How | Potential energy saving |
Create hot and cold aisles. | Relocate racks and/or cooling units. | 10% |
Reduce cold air loss. | Install blanking plates and air brushes. | 5% |
Contain hot/cold aisles. | Install a proprietary containment system or a cheaper plastic curtain system. Utilise the ceiling void as a return plenum for hot air. | 15% |
Remove server cable arms. | Install Velcro cable ties to allow server movement. | 3% |
Table 2: Opportunities for control of airflow in the data centre
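The COP improvement from a hotter return can be put in rough numbers. The COP values below are illustrative assumptions, not measurements from any cooling unit:

```python
# Sketch: effect of warmer return air on cooling-unit efficiency.
# COP = heat removed / electrical power drawn by the cooling unit.
# The COP figures here are illustrative assumptions only.

def cooling_power_kw(heat_load_kw, cop):
    """Electrical input needed to remove a given heat load."""
    return heat_load_kw / cop

mixed_return = cooling_power_kw(100.0, cop=3.0)  # bypass air chills the return
hot_return = cooling_power_kw(100.0, cop=4.0)    # contained aisles, hotter return
saving_kw = mixed_return - hot_return            # roughly 8 kW on a 100 kW load
```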
Figure 2: Benefits of a Hot aisle/ cold aisle arrangement.
Figure 3: The importance of unhindered air supply and return.
For example, in this server room design we see a clear and unhindered airflow path on the left and a restricted airflow path on the right, which will cause inefficiency through air mixing and hot air recirculation.
It is important to consider the location of the cooling units relative to the server intake when laying out a server room. An effort should be made to create an unhindered cold air supply path to the server intake and an unhindered hot air return path back to the cooling unit.
Step 3: Raise the Temperature Set-Point
It is the aim of the cooling system within the IT room to ensure that the IT equipment is kept within the manufacturer's operating parameters and the recommended industry guidelines.
ASHRAE Technical Committee 9.9, which focuses specifically on data centre design, recently released its “2008 ASHRAE Environmental Guidelines for Datacom Equipment”. This guideline expands the recommended operational bands from the earlier 2004 guidelines, in recognition of the proven ability of IT equipment to operate in higher-temperature environments and the importance of power reduction:
Condition | 2004 Version | 2008 Version |
Temperature | 20°C to 25°C | 18°C to 27°C |
Humidity | 40% RH to 55% RH | 5.5°C DP to 60% RH & 15°C DP |
These new temperature recommendations allow the cold supply temperature to be raised to much higher levels than before, which in turn allows for greater energy savings. Higher temperature set-points increase the number of free-cooling hours available to the outside condenser units, resulting in drastically reduced energy bills.
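The link between set-point and free-cooling hours can be sketched with a simple count over a year of hourly ambient temperatures. The temperature data, the climate parameters and the 4°C approach temperature below are all illustrative assumptions:

```python
# Sketch: raising the supply set-point widens the band of outdoor
# temperatures in which free cooling is possible. The hourly ambient
# temperatures are synthetic (roughly a mild maritime climate).

import random

random.seed(1)
ambient = [random.gauss(10.0, 6.0) for _ in range(8760)]  # one year, hourly, °C

def free_cooling_hours(supply_setpoint_c, approach_c=4.0):
    """Hours when ambient air is cold enough to meet the set-point,
    allowing an assumed approach temperature across the heat exchanger."""
    return sum(1 for t in ambient if t <= supply_setpoint_c - approach_c)

low = free_cooling_hours(20.0)   # older-style supply set-point
high = free_cooling_hours(27.0)  # upper end of the 2008 ASHRAE band
# high comfortably exceeds low: more of the year qualifies for free cooling
```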
The aim of the cooling unit in the server room is to cool the IT equipment and not the IT room; therefore, cooling unit operation should be based on temperature readings at the server intake. Room temperature should be measured at the front of the racks in the cold aisle.
Step 4: Specification of New Equipment
Sustainable Energy Ireland, working with Byrne Dixon Associates, has set out energy efficiency criteria for all classes of power and cooling equipment within the data centre. These criteria may be reviewed under the heading for Information and Communications Technology (ICT) at http://www.sei.ie/Your_Business/Accelerated_Capital_Allowance/ACA_Categories_and_Criteria
It is now possible to stipulate as part of a tender that proposed equipment should be included in the list of ACA-approved equipment. By purchasing equipment which is listed as ACA-approved, private sector organisations can offset the full purchase price of the equipment against tax in the year of purchase, saving the company 12.5%.
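As a worked example of the relief described above (the purchase price is hypothetical, and the 12.5% rate is the figure the article itself quotes):

```python
# Sketch: year-one cash effect of buying ACA-listed equipment,
# using the 12.5% rate quoted in the article. Purchase price is hypothetical.

def aca_year_one_relief(purchase_price, tax_rate=0.125):
    """Full purchase price offset against tax in the year of purchase."""
    return purchase_price * tax_rate

relief = aca_year_one_relief(40_000.0)  # EUR 40k of listed equipment -> EUR 5,000 relief
```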
Many forms of power and cooling equipment come with energy efficient options which have short payback periods, and in many cases the efficiency savings can fund the purchase of the new equipment. Previous generations of UPS operated most efficiently at full load, although most of their operating life would be spent below 50% capacity. Recent developments in modular and transformerless technology have allowed UPS units to be expanded in line with increased demand, ensuring consistently high efficiency. All-on, all-off cooling equipment has hampered cooling efficiency with its inability to track the cyclical nature of IT loads. The development of electronically commutated (EC) motors and digital scroll compressors allows the cooling output to track demand more closely while reducing power consumption.
Equipment type | Feature | Payback period |
UPS | Modular capacity, transformerless | 12 months |
CRAC/CRAH units | EC motors, digital scroll compressors | 3 months |
Table 3: Energy efficient equipment options and typical payback periods
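The partial-load penalty that modular UPS capacity avoids can be sketched numerically. The efficiency curve and the kW figures below are illustrative assumptions, not vendor data:

```python
# Sketch: why sizing UPS capacity to the load improves efficiency.
# The linear efficiency curve is an illustrative assumption only.

def ups_efficiency(load_fraction):
    """Simple rising efficiency curve: poor when lightly loaded."""
    return 0.80 + 0.14 * min(load_fraction, 1.0)

it_load_kw = 40.0
monolithic = ups_efficiency(it_load_kw / 200.0)  # one 200 kW frame, 20% loaded
modular = ups_efficiency(it_load_kw / 50.0)      # 50 kW of modules, 80% loaded

loss_monolithic = it_load_kw / monolithic - it_load_kw  # kW lost in the UPS
loss_modular = it_load_kw / modular - it_load_kw        # markedly smaller losses
```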
Step 5: Rethinking Critical and Essential Systems
Many companies are reviewing their classification of essential and critical systems within the IT process with the aim of reducing the UPS load and the associated power losses. Not all IT processes within a data centre are critical; those that are not do not need to be on UPS load and can instead be on generator power. Additionally, how much redundancy is necessary? For example, can an N+1 power system reduce to N during the 1% of the year when it hits peak load? By reviewing and implementing a revised UPS design, organisations can vastly reduce their Capex and Opex costs.
Step 6: Utilise Free Cooling
Free cooling can be used to reduce the cooling load of a data centre through air-side or water-side economising. Air-side economising offers a greater ROI, as its temperature bands are wider than those of water-side economising, but it has its drawbacks. Fresh air at very low or high temperatures, or at very low or high humidity levels, requires treatment before it can be allowed into an IT environment, and this treatment requires centralised plant. In new large-scale data centres this investment is justified, and the services space to house the plant equipment and the large air delivery ductwork can be easily designed into a new building layout. However, the delivery of large volumes of fresh air throughout the data centre requires large fans with large power loads, which has a negative effect on the energy saving opportunity; the business case tends to work best for lower power density data centres. The use of heat wheels can overcome the problem of humidity ingress, but they are more difficult to incorporate into traditional data centre layouts.
For small server rooms, office air can be diverted into the cold aisle, and hot return air can be reused to preheat ventilation air or dumped to atmosphere during the cooling season.
In existing data centres retrofitting ductwork is seldom an option and in such cases waterside economising is favoured through the use of cooling towers or dry coolers. Free-cooling chillers are easily retrofitted and can be specified with heat exchangers which will produce an amount of hot water for domestic use in office environments.
Step 7: Use of CFD Modelling
Computational fluid dynamics (CFD) is used to assess the suitability of design layouts and cooling technologies to best suit the client's needs. It allows a virtual 3D room to be created in which different power-per-rack loads and different cooling designs can be tested. Performance of the cooling system can be compared against floor depth, hot or cold containment, above- or below-floor cable management, in-row versus in-rack cooling and many other elements. This process offers a no-risk method of testing room layouts and cooling solutions before any capital investment is made.
Synopsis
The first step in any data centre energy reduction exercise is to reduce the core IT process load; this in turn reduces the cooling load. The cooling load can be further reduced by applying recommended airflow management techniques within the data centre space. The measures discussed are summarised below:
1 Virtualisation
2 Apply server and storage power management
3 Refresh IT hardware
4 Remodel room to maximise savings to cooling load
5 Establish hot aisle/ cold aisle arrangement
6 Install hot/cold aisle containment
7 Turn off cooling units where possible
8 Raise server intake temperature
9 Remove cable arms
10 Install temperature monitoring in cold aisle
11 Utilise ceiling void as a return plenum
12 Install blanking panels and air brushes
13 Retrofit variable speed EC fans to CRAC units
14 Specify equipment from the ACA equipment list
15 Utilise free cooling
16 Reuse the exhaust heat.
17 Carry out a CFD analysis of possible options to ensure performance
Byrne Dixon Associates are a data centre consultancy based in Dublin and specialise in the design and optimisation of data centres. Byrne Dixon Associates have recently been appointed to the Sustainable Energy Ireland working group on data centre efficiency. They have completed projects on over 70 data centres in Ireland and abroad, and run data centre efficiency workshops for clients and consultants in Dublin and Cork, offering CFD modelling at these workshops. For details of the next data centre workshop contact Vincent Byrne, by email at vincent@byrnedixon.com or by mobile at 086 8196868.