I read an article stating that if we increase the server room base temperature to about 80 degrees, we can save a lot on annual energy costs. Microsoft tried this by increasing their server room temperature 4-5 degrees. I don't know if this is actually a good thing to do. I believe this could prove risky, and if the room is safe and secure at 80, why did we initially run it cooler? What are your comments on this?
These days it is generally more useful to manage intake temperatures at the zone or rack level, instead of at the room level, since local hot spots can occur even in over-cooled rooms.
This makes sense, but you have to understand that the room itself is going to get really hot too, and that is not a good environment for people to work in. Have you ever thought about cycling outside air in to cool the room when the outside temperature is below 70 degrees or so, and reverting to powered cooling when it rises above that?
So I was doing some reading and came across an article about Intel using outside air to supply a data center, a technique called "air-side economization". They ran a 10-month experiment with 900 servers in New Mexico without any climate control, fancy intake filters, humidifiers, or dehumidifiers.
The results are quite interesting: they did not see a significant rise in hardware failure rates, which is impressive given that humidity varied from 4% to over 90%, temperatures ranged from 64 to 92 degrees, and the servers collected a layer of dust.
Interesting. While I think we could handle the dust issue with a simple HEPA filter, I'm not so sure I would want my servers running at 92 degrees 24 hours a day. What I was referring to was a solution that kicks in when the outside air temperature is in the 70s or lower, to maintain a constant 68-72 degrees. As for humidity, unless it is really off the scale it's not that important, since the servers' heat will keep them reasonably dry.
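The economizer logic described above (outside air when it's cool enough, mechanical cooling otherwise, aiming for a 68-72 degree band) is simple enough to sketch. This is just an illustration, not a real BMS controller; the thresholds and function names are hypothetical:

```python
# Minimal sketch of the air-side economizer logic discussed above.
# All thresholds and names are hypothetical, for illustration only.

ECONOMIZER_MAX_OUTSIDE_F = 70.0   # only use outside air below this
TARGET_HIGH_F = 72.0              # top of the desired 68-72 F band

def choose_cooling_mode(outside_temp_f, room_temp_f):
    """Pick which cooling mode to run for this control cycle."""
    if outside_temp_f < ECONOMIZER_MAX_OUTSIDE_F:
        # Outside air is cool enough: free cooling via dampers and fans.
        return "economizer"
    if room_temp_f > TARGET_HIGH_F:
        # Room too warm and outside air can't help: powered cooling.
        return "mechanical"
    # Room is within the target band: just recirculate.
    return "recirculate"
```

In practice you'd also want hysteresis around the thresholds so the system doesn't flap between modes when the outside temperature hovers near 70.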
That's good to know. I imagine it wouldn't be very comfortable to work in that environment though.
Though I do know that keeping the ambient room temperature around 80 works great even in large-scale, high-density data centers. In a 140,000 sq ft data center I worked in, we kept the temperature around 80 for a time and experienced very low failure rates.
I think the catch was that pressurization was kept even, so we didn't run into problems with dense heat zones. We had elaborate duct work under the floor that more or less kept air pressure uniform across the floor. Some areas had less pressure, but it wasn't too bad.
I understand the need to be cost effective, but I'm not willing to raise the room temperature in my data center to address this. That may be fine in other areas, but not in my server room, where both the equipment and, often, the people need to stay cooler.