Data Center Environment and Energy Thresholds

If you're running a data center, there are certain environmental and energy thresholds you need to stay in line with. If you don't maintain the proper operating environment, you not only drive your operating costs through the roof; you could very well cause irreparable damage to your servers. Further, if you're using too much energy, that's a sure sign your data center is suffering from some glaring inefficiency or operations issue, which will ultimately hurt your bottom line.

If you’re a smart operator, these aren’t really things you can ignore.

Environmental Thresholds

So what environmental thresholds should you shoot for? What's the 'sweet spot' for power consumption? How are the two tied together?

The answer’s not actually as simple as you might think.

While it’s true that every data center should be both cool and reasonably dry, there’s a wide range of additional factors that come into play when you’re trying to determine the ideal operating conditions for your center:

  • How old is your equipment? Older equipment is more sensitive, and tends to run hotter.
  • What sort of server density is your data center running? As a general rule, more density means more heat.
  • How many hours per year do you run your servers? Believe it or not, not every data center in the world is "always on."
  • How intensive are the tasks you assign to your servers? Computers generate more heat when handling more intense tasks, after all.

Given all the above factors, a lot of operators err on the side of caution and run their centers at around 60 degrees Fahrenheit. Believe it or not, they don't actually need to run that cold (unless they're still using legacy hardware). Many operators are warming things up, raising temperatures to somewhere between 65 and 75 degrees. That's the sweet spot, and if your data center is running in that range, you're very likely in the clear. If it climbs above 85 degrees, you may have a problem.

Don't take caution too far in the other direction, either: running well below the threshold can end up costing you a pretty penny in climate control, and there's a good chance that's money you don't really need to waste.

As far as humidity is concerned, too dry is almost as bad as too wet. Ideally, you want to shoot for somewhere between 45 and 55% relative humidity. Too damp, and it's pretty obvious what will happen: computers and water don't play nice with each other. Too dry, and there's a good chance your gear will fry itself through electrostatic discharge.
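If you want to turn those ranges into alerts, the rules are simple enough to script. Here's a minimal sketch using the 65 to 75 degree and 45 to 55% ranges above; the readings passed in are placeholders for whatever sensors or monitoring system your facility actually uses:

    # Minimal sketch of environmental threshold rules using the ranges above.
    # The temperature and humidity readings are placeholders; wire them up to
    # whatever monitoring system you actually run.

    TEMP_SWEET_SPOT = (65, 75)   # degrees Fahrenheit
    TEMP_ALARM = 85              # above this, you may have a problem
    HUMIDITY_RANGE = (45, 55)    # percent relative humidity

    def check_environment(temp_f, humidity_pct):
        warnings = []
        if temp_f > TEMP_ALARM:
            warnings.append(f"ALARM: {temp_f}F is above {TEMP_ALARM}F")
        elif not TEMP_SWEET_SPOT[0] <= temp_f <= TEMP_SWEET_SPOT[1]:
            warnings.append(f"Temperature {temp_f}F is outside the sweet spot")
        if not HUMIDITY_RANGE[0] <= humidity_pct <= HUMIDITY_RANGE[1]:
            warnings.append(f"Humidity {humidity_pct}% is outside the 45-55% range")
        return warnings

    print(check_environment(temp_f=88, humidity_pct=40))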

Energy Thresholds

Power consumption’s a little trickier, but at the end of the day, it’s tied directly to efficiency. Elements which affect power consumption include:

  • Climate control: choose an inefficient or less-than-ideal means of controlling the environment in your data center, and you're pretty much guaranteed to have your energy consumption shoot through the roof, well above the threshold.
  • The efficiency of your hardware: less efficient hardware will, naturally, use more power to accomplish less.
  • The uptime of your facility: this one goes without saying. The longer your uptime, the more power you'll be using.

As far as power thresholds are concerned, we're going to swing back to my previous article and look at the metrics behind power consumption. This is, quite simply, the only way to really establish an energy threshold. One metric in particular is of interest to us: power usage effectiveness (PUE), the ratio of total facility power to the power drawn by the IT equipment itself. The closer your PUE is to 1, the more efficient your center is, and the closer you are to the ideal threshold for power consumption.
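As a quick worked example, here's a minimal sketch of the calculation; the kilowatt figures are hypothetical:

    # Minimal sketch of a PUE calculation; the figures are hypothetical.
    def pue(total_facility_kw, it_equipment_kw):
        """Power usage effectiveness: total facility power / IT equipment power."""
        return total_facility_kw / it_equipment_kw

    # 1,000 kW drawn by the whole facility, 625 kW of it by the IT gear.
    print(round(pue(1000, 625), 2))   # 1.6 -- every watt of IT load carries 0.6 W of overhead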

You should also keep an eye on energy provisioning. You want to be certain you aren't provisioning too much power or too little; either way, it's likely going to cost you money. Make sure you know how much power your center actually uses, and provision accordingly.

In Closing

Environmental and energy thresholds are both incredibly important factors in the smooth operation of any data center. Run above either threshold and you not only waste money; you risk causing potentially irreparable damage to your hardware, leading to downtime that could end up costing you millions. Thankfully, if you plan ahead, both are fairly easy to keep track of: implement some means of monitoring them, and work out a set of threshold rules.

Questions? Comments? Concerns? Swing by our forum to have a chat.


ASHRAE Mandates Use of Economizer in a Data Center

An economizer is a device used to reduce energy consumption in a data center. Be it a water-side economizer or an air-side economizer, with the astronomical electricity bills data centers have to deal with, one can see why these devices are popular.

Types Of Economizer

An air-side economizer takes in air from the atmosphere whenever the external conditions are suitable. The cool outside air is introduced directly into the data center without having to pass through the refrigeration cycle, thus saving energy. This is also known as free cooling. Water in a data center is used in the chiller systems, and a heat exchanger uses air to cool the water in those systems. The water-side economizer pre-cools this water so that the chiller does not have to work as hard to achieve the desired cooling levels, again leading to lower energy consumption.
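To make the free-cooling decision a bit more concrete, here is a minimal sketch of the kind of control logic an air-side economizer follows. The setpoints are hypothetical, and real controllers also weigh enthalpy, dew point and air-quality limits before opening the dampers:

    # Hypothetical sketch of air-side economizer ("free cooling") control logic.
    # Setpoints are illustrative only, not from any particular product.

    SUPPLY_SETPOINT_F = 70      # target supply-air temperature
    MAX_OUTSIDE_RH = 60         # percent relative humidity; above this, fall back

    def cooling_mode(outside_temp_f, outside_rh_pct):
        if outside_temp_f < SUPPLY_SETPOINT_F and outside_rh_pct <= MAX_OUTSIDE_RH:
            return "free cooling: open dampers, bypass compressor-based cooling"
        return "mechanical cooling: close dampers, run the compressors"

    print(cooling_mode(outside_temp_f=55, outside_rh_pct=40))   # free cooling
    print(cooling_mode(outside_temp_f=85, outside_rh_pct=70))   # mechanical cooling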

Why Economizer?

It is understandable that no compromise can be made on the cooling needs of a data center. But the fact remains that most data centers operate at a much lower temperature than necessary. With cooling drawing almost 45 percent of a data center's power, concerns are being raised among business owners and government regulatory bodies alike.

ASHRAE recently updated its standard on data center efficiency and has mandated the use of economizers whenever and wherever possible, especially in regions where the climate is cool enough to draw air in directly, completely bypassing compressor-based cooling. The Institute of Energy has asked manufacturers to come up with chiller-less designs. This not only brings down operating costs, it also removes the hefty investment you would otherwise have to make in the chiller itself.

Is an Economizer Suitable for My Data Center?

One has to keep in mind a few drawbacks of the economizer. Humidity extremes can force the economizer to switch off and fall back on compressor-based cooling. The temperature of the air entering the data center may also be only a few degrees higher than the external temperature. Strict steps must be taken to restrict the level of contamination while using an economizer.

But the question remains how effective or justified ASHRAE is in mandating the use of economizers in data centers. One cannot depend on the economizer one hundred percent; the system is effective only when the environmental conditions outside the data center are favorable. During the summer, or in hot climates, the system is not very helpful to begin with.

Air-side economizers are more or less like automated windows that can be opened or closed at will. The dust and pollutants the device brings into the data center cannot be overlooked; dust accumulation brings down the efficiency of the whole data center. Even though dust and pollutants can be contained by installing filters, the filters have to be checked and cleaned or replaced regularly.

Although installing an economizer has few downsides in itself, the energy it ends up saving in an extreme climate is certainly debatable. There is more than one way to bring down cooling costs, and some may be more effective than an economizer; engineers are even coming up with effective designs for chiller-less water-side economizers. Explore your options before you zero in on the energy-efficient cooling system that suits your data center best.

For more updates on data centers, visit Data Center Talk.


Data Center Cooling Solutions - An Overview

Data center cooling is the biggest challenge faced by data center specialists today. There is no 'one size fits all' mantra when it comes to data center cooling. It depends on a number of factors such as external temperature, the number of operating devices in the data center, and whether or not the systems are virtualized. Cooling solutions have to be custom made for every data center in order to run it efficiently and effectively.

With the government even offering tax deductions for the data centers that promote energy conservation, DCs have a lot to gain. The right cooling strategy not only cuts down on electricity bills, but also improves data center operation and increases the life of the hardware. With so many advantages, one can understand the level of importance specialists are attaching to a good cooling system.

In order to select a plan that is best suited for a data center, one needs to be aware of the types of cooling designs that are now available in the market. Given below is a list of cooling solutions one can look into before making the call. They mainly fall under two broad categories.

  • Aisle Containment Solutions
  • Rack Containment Solutions

Before discussing each of them in depth, I would like to digress slightly to explain how data center cooling essentially works.

In a data center, the servers or storage devices are placed in racks. Each rack has the capacity to hold 10-40 servers, and the racks are arranged like bookshelves in a library. Servers obviously draw power to operate. Assuming there are at least 200 servers in the data center, we would be looking at a massive rise in temperature near the servers. This rise in temperature could even damage the servers, resulting in data center downtime. Installing an AC to cool the entire room is not even an option worth considering, as the heat rise differs across different parts of the data center, and honestly, it is a sheer waste of energy. The need of the hour is localized cooling devices.
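To put a rough number on that rise, practically every watt a server draws ends up as heat that has to be removed. Here's a back-of-the-envelope sketch; the 400 W per server is a hypothetical figure:

    # Back-of-the-envelope heat load estimate; 400 W per server is hypothetical.
    SERVERS = 200
    WATTS_PER_SERVER = 400
    BTU_PER_WATT_HOUR = 3.412    # 1 W of electrical load becomes ~3.412 BTU/hr of heat

    heat_kw = SERVERS * WATTS_PER_SERVER / 1000          # 80 kW of heat
    heat_btu_hr = heat_kw * 1000 * BTU_PER_WATT_HOUR     # ~273,000 BTU/hr
    cooling_tons = heat_btu_hr / 12_000                  # 1 ton of cooling = 12,000 BTU/hr
    print(f"{heat_kw:.0f} kW of heat, roughly {cooling_tons:.0f} tons of cooling needed")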

In Hot Aisle Containment (HAC) and Cold Aisle Containment (CAC), the racks are arranged so that the hot air expelled by the servers is contained in one aisle and the cool air supplied to the servers is contained in another, with hot and cold aisles alternating. The idea is to ensure that the hot air and cool air do not mix. Why? It's simple physics. When two glasses of water at different temperatures are poured into a single container, the temperature of the resulting mixture is the average of the two. The same goes for air: when hot air and cold air mix, the result is warm air, and the ACs have to pump in more cool air to bring the temperature around the servers back down. Hence, there is more energy consumption.
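Here's that averaging effect in numbers, as a quick sketch; the temperatures are purely illustrative:

    # Illustrative mixing calculation: equal volumes of supply and exhaust air.
    supply_air_f = 60     # cold-aisle supply temperature (hypothetical)
    exhaust_air_f = 95    # hot-aisle exhaust temperature (hypothetical)

    mixed_f = (supply_air_f + exhaust_air_f) / 2    # simple average for equal volumes
    print(f"Server inlets see {mixed_f:.0f}F instead of {supply_air_f}F")
    # The air conditioners now have to supply colder (or more) air to compensate,
    # which is exactly the waste that containment is designed to avoid.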

The servers are arranged so that the front of the server, which takes in cold air, faces the cold aisle, and the back of the server, which expels hot air, faces the hot aisle. For instance, consider a data center with four aisles: it will have two cold aisles and two hot aisles. Note that we do not have to supply cooled air to the hot aisles, so the DC can cut its cooling costs roughly in half!

In Rack Level Containment, fans are fitted in modules to the rear of the racks; hot air is drawn out of the servers and either removed from the data center through chimneys or passed on to the CRAC (computer room air conditioning) units.

Aisle Containment Solutions

The industry offers a variety of designs for aisle containment. Eaton, for instance, offers the following solutions:

1. End of Row Doors

End of Row Doors make aisle cooling more effective, as they trap the cool or hot air by blocking its escape route, thus preventing the mixing of air. This lets the data center operator set a higher temperature within the data center, thus saving energy.

2. Horizontal Ceiling System

In this system, similar to End of Row Doors, air mixing is prevented by blocking the top of the aisle with clear panels. These panels can be easily mounted onto the racks. The panel system is modular and scalable to accommodate differences in rack heights and row spacing.

3. Aisle Duct

The Aisle Duct is an extension of the Horizontal Ceiling System. Ducts on the roof carry air either from the air conditioning supply, in the case of cold aisles, or to the exhaust, in the case of hot aisles. The duct design is modular and scalable, and can be altered to suit the data center's requirements.

4. Vertical Wall System

In this system, the horizontal ceiling system acts as a supporting structure for mounting vertical walls that connect from the top of the enclosures to the data center ceiling. There is greater isolation of cold and hot air.

5. End of Row Curtains

The solutions mentioned above may be a little expensive for a data center working on a limited budget. In such cases, one might consider End of Row Curtains, which partially contain the air within the aisle. Depending on the requirement, End of Row Curtains can be installed at the rack level, with or without an Aisle Containment Ceiling, or from floor to room ceiling.

 

Rack Containment Solutions

Rack containment solutions consist of the following design structures:

1. Heat Containment System (HCS)

The HCS captures the hot air from the data center's IT equipment and directs it through a chimney to the existing CRAC units, via a plenum ceiling or high air returns.

2. Active Thermal Management System (ATMS)

Operating all the fans at a fixed speed is a waste of energy. The speed has to be varied with respect to the temperature rise. These systems automatically adjust the power to the fans based on a set operating temperature.
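As a rough illustration of that idea (not Eaton's actual control algorithm), here is a minimal sketch of temperature-driven fan speed control; the setpoint, limits and gain are all hypothetical:

    # Hypothetical sketch of temperature-driven fan speed control.
    # A real ATMS controller is more sophisticated; this only shows the idea
    # of scaling fan power with the measured exhaust temperature.

    SETPOINT_F = 77      # desired exhaust temperature
    MIN_SPEED = 0.30     # never stop the fans entirely
    MAX_SPEED = 1.00
    GAIN = 0.05          # extra fan speed per degree above setpoint

    def fan_speed(exhaust_temp_f):
        speed = MIN_SPEED + GAIN * max(0.0, exhaust_temp_f - SETPOINT_F)
        return min(MAX_SPEED, speed)

    for t in (75, 80, 90):
        print(f"{t}F -> fan at {fan_speed(t) * 100:.0f}% speed")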

3. Active Airflow Manager

Bypass airflow and air mixing lead to a complete breakdown of green data center principles. Controlling airflow across locations with varying densities, varying building infrastructure and sporadic hot spots is challenging, but it can be solved by allocating the correct amount of airflow at known intake locations.

 

Additionally, one might consider investing in blanking panels. When a server is taken out of a rack for maintenance or replacement, it leaves an open space that allows hot exhaust air to recirculate back to the equipment inlets, which can cause overheating. Blanking panels provide a quick, easy and cost-effective way to prevent this: they act like closed windows in a room, preventing air transfer from one aisle to another. They are made of either steel or plastic.

Finally, it is important to note that while these systems are quite efficient on their own, a combination of cooling solutions results in more effective and efficient cooling.


Rittal Discusses Data Center Cooling

This video talks of new cooling technologies that are available in the market including cold aisle containment, free cooling, liquid cooling and other next generation CRAC systems. It also talks of ways in which optimized cooling can save your annual expenditure and improve data center efficiency.

 

For more videos on data center technologies, visit Data Center Talk.


Why Is Data Center Thermal Management Important

With the large amounts of data every company has to store, it is essential for them to keep their data safe and within reach. The problem with great amounts of data is the fact that not all the data is equally important; some data is more frequently used than other data, for example. What happens is that data that is not needed for a long period of time easily gets lost, and when it is needed again, it is impossible to track it down. Therefore, it can be seen why company owners had no other option but to find an efficient solution when it comes to keeping their precious data safe. In that respect, data centers have become widespread, as they have proven to be an excellent method for storing data and keeping it safe.

 

First off, it needs to be said that data centers are facilities commonly referred to as computer rooms, where a company keeps all of their computer equipment. Here is where all the storage and security devices are located, which is why there are certain rules in regard to maintaining data centers properly. One of the most important rules to follow regarding data centers is data center cooling.

Data Center Cooling

Data centers need to maintain a certain temperature in order for all the devices in them to be safe. Since the servers in a data center have to run around the clock, they draw a lot of power, which heats up the systems, so their temperature has to be regulated constantly. This is best done with air conditioning, as it makes it possible to control both the temperature and the humidity of the air in the room. Humidity control is important to ensure that water does not condense on internal circuits in high-humidity conditions and that static discharge does not occur in low-humidity conditions.

 

According to common guidelines, the temperature in a data center should not exceed 24°C (75°F) and should be kept above 16°C (61°F), which is why data center cooling is extremely important. Temperatures in data centers can rise easily, and this can damage the devices located there. Therefore, the issue of data center cooling should not be neglected, as attending to it helps data centers run properly and, more importantly, without malfunctions.

Data Center Talk has more to say about data center management. Read on.


Proposed “Data Furnaces” Could Use Server Heat to Warm Homes

As winter approaches, could a warm server take out the chill as opposed to a radiator or fireplace?

A new paper from Microsoft Research and the University of Virginia makes the case that servers can be placed in homes and office buildings and used as a heat source. These household data centers, which Microsoft calls "Data Furnaces", have three main advantages over traditional data centers: a smaller carbon footprint, reduced total cost of ownership per server, and closer proximity to users.

US Environmental Protection Agency

According to figures from the US Environmental Protection Agency, the nation's servers and data centers consumed around 61 billion kWh in 2006, or 1.5 percent of the country's total electricity consumption. And with data centers among the fastest growing sectors in the US, it was estimated that national energy consumption by servers and data centers could nearly double, exceeding 100 billion kWh, by the end of the year.

Exhaust Air Temperature

“The temperature of the exhaust air (usually around 40-50°C) is too low to regenerate electricity efficiently, but is perfect for heating purposes, including home/building space heating, clothes dryers, water heaters, and agriculture,” the study states.

While it’s most likely that early adopters will be office buildings and apartment complexes with mid-sized data centers heating them, micro-datacenters on the order of 40 to 400 CPUs could serve as the primary heat source for a single-family home. These Data Furnaces would be connected to a broader cloud via broadband, and connect to the home heating system just like any conventional electric furnace.

Microsoft is far from the only company looking to combine the cost of powering servers and heating buildings.

Reusing Data Center Heat in Offices

In 2007, for instance, Intel released a study on reusing data center heat in offices, and in 2010 it opened Israel's first LEED-certified green building, which featured a 700-square-meter (about 7,500-square-foot) server room where heat is recycled for hot water and winter heating.

In another interesting take on reusing data center heat, the Swiss Federal Institute of Technology Zurich and IBM built a water-cooled supercomputer that provides warmth to university buildings. Dubbed "Aquasar", the system consumes as much as 40 percent less energy than a comparable air-cooled machine, IBM reckons.

Data Furnace ideas

Microsoft identifies some challenges to its Data Furnace idea, such as how to monitor and react to local changes in power and broadband usage, physical security, and the lack of dedicated system operators in a home. What the report does not discuss is how servers would be cooled in warmer months, the risk of fire from overheating, and the potential noise that could come from so many servers.

While there's still work to be done, the idea that electricity demand could be curbed by harnessing the heat from data centers and putting it to good use is exciting, and one we'll be following intently.

 

To keep yourself updated on the latest happenings in the data center industry, please visit us at Data Center Talk.


Data Center Cooling

Cooling the data center effectively and efficiently has always been a hot topic among data center owners. Excess heat can make your servers fail!

Power and Cooling Management System

In this brutal new economy, many data center managers are adopting an integrated approach to cutting costs by getting power and cooling right. Attaining the right cooling management system is crucial as heat fluxes continue to rise, and as the technology improves, more people are adopting products and services that can enhance a data center's cooling capability. So, if you are looking for the right way to handle cooling-related energy costs, you need to calculate the cooling requirements of your present and future business needs. With these numbers, the IT department can avoid cost overruns. As the next step, you need to estimate the overall data center cooling cost and work out a strategy.
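As a very rough sketch of that estimate, here is the basic arithmetic; every input (IT load, cooling overhead, electricity rate) is hypothetical and should be replaced with your own numbers:

    # Rough sketch of an annual cooling-cost estimate; all inputs are hypothetical.
    it_load_kw = 250           # average IT equipment load
    cooling_overhead = 0.45    # cooling power as a fraction of IT power
    hours_per_year = 8760
    price_per_kwh = 0.10       # dollars per kWh

    cooling_kw = it_load_kw * cooling_overhead
    annual_cost = cooling_kw * hours_per_year * price_per_kwh
    print(f"~{cooling_kw:.0f} kW of cooling load, about ${annual_cost:,.0f} per year")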

Green Revolution of cooling.

Here I present a few tips that may reduce the energy required for cooling:

  • Identify the highest-temperature racks: You need to identify the server racks that need direct, targeted cooling rather than general space conditioning.
  • Design of the data center: The data center should look cool! The design of the layout, rack densities, and temperature controls can save you time, energy and money.
  • Humidification systems: Air conditioners consume a lot of power.
  • Air-side economizers: These provide outside air to cool the building, rather than using refrigeration equipment to cool the return air.
  • Effective ways to block heat: You can try insulation, reflective barriers and shading. Trees, vines and shrubs can be planted around the data center to provide shade; just be careful that they do not hinder the airflow.
  • Keep the air filters clean: Dirty filters force air to go around the filtration sections and can hinder the cooling process, and it takes far more effort to clean a dirty evaporator coil than to replace filters. So it is advisable to routinely change filters based on the pressure drop across the filter, a calendar schedule, or visual inspection.
  • Don't cool unused space: Sometimes data center owners install cooling equipment in every corner of the facility. Cooling equipment in unused space consumes power that simply goes to waste.
  • Set the thermostat for cooling, if applicable to your data center.
  • Lowering the condenser water temperature decreases chiller energy consumption but can increase cooling tower fan energy consumption, so balance the two.

 

You can also keep up to date with current trends and technology by visiting Data Center Talk where we keep you informed on important changes as they occur.
