
#1 | 10-24-2008, 02:14 AM
Neoeclectic (Member, joined Oct 2008, 85 posts)
Rack configuration and cooling

Currently I leave a 1U space between my servers, because the airflow from the raised floor has been very weak. Does anyone know whether there's actually any benefit to racking servers with a 1U space between them for cooling purposes? And if so, is there any official documentation I can read on this?

I'm contemplating using 1U spacer blanks or re-racking in a pizza-box arrangement. These cabinets are a mixed density of Dell, IBM, HP, and Sun servers. I could really use some insight. Thanks!
#2 | 10-24-2008, 06:11 PM
KenB (Administrator, Pittsburgh, PA, joined Jan 2006, 468 posts)

My suggestions:
- Increase airflow to the server racks: seal floor penetrations, remove all perforated tiles except those in front of server racks, remove subfloor obstacles, seal off subfloor areas not used by servers, check CRAC units for proper operation, increase CRAC fan speed if possible, or add additional CRACs (a rough airflow sizing sketch follows below).
- Segregate supply and return air: arrange racks in hot/cold aisles, create contiguous rows (no gaps), use blanking panels in all unused rack positions, and install return-air ducts to prevent recirculation of exhaust air into server intakes.
- Install servers as close to the floor as possible, and don't leave spaces between them.
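To put rough numbers on "increase airflow": a common rule of thumb is that a rack needs roughly CFM = 3.16 x watts / delta-T (degrees F), where delta-T is the temperature rise from server intake to exhaust. The Python sketch below is a minimal illustration only; the rack wattages, 20-degree rise, and per-tile airflow are assumptions, not figures from this thread.

# Rough airflow sizing sketch (assumed numbers, not measurements from this thread).
# Rule of thumb: CFM ~= 3.16 * watts / delta_t_f, where delta_t_f is the
# temperature rise across the servers (intake to exhaust), in degrees F.

def required_cfm(rack_watts, delta_t_f=20.0):
    # Approximate cold-air volume a rack needs, in cubic feet per minute.
    return 3.16 * rack_watts / delta_t_f

def tiles_needed(rack_watts, cfm_per_tile=500.0, delta_t_f=20.0):
    # Perforated tiles needed in front of the rack, assuming each tile
    # delivers roughly cfm_per_tile (25%-open tiles typically pass a few
    # hundred CFM, depending on underfloor pressure).
    cfm = required_cfm(rack_watts, delta_t_f)
    return cfm, max(1, round(cfm / cfm_per_tile))

if __name__ == "__main__":
    for watts in (2000, 4000, 8000):  # hypothetical rack loads in watts
        cfm, tiles = tiles_needed(watts)
        print(f"{watts} W rack -> ~{cfm:.0f} CFM -> ~{tiles} perforated tile(s)")

With those assumed numbers, a 4 kW rack wants on the order of 600+ CFM, which a single perforated tile can struggle to deliver; that is why sealing leaks and building underfloor pressure matters.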

Ken
#3 | 10-25-2008, 10:47 AM
Keith (Administrator, Washington DC Metro Area, joined Aug 2006, 225 posts)

Wow Ken, that is quite the list!

Here is another long post from me... Please bear with me!

Neo,

There are a couple of critical items that Ken mentioned that I stress to every datacenter manager I come across. I've kept my section a bit general, since not all of it is aimed directly at you; I'm sure you don't need as much detail as I'm providing, since a lot of it is common sense.

1) Perforated tiles... I do not know why people perforate the entire floor! Use only as many tiles as are actually needed; airflow will suffer unless pressure can build under the floor. I also notice that customers of mine tend to put perforated tiles in the hot aisle... They are called hot aisles for a reason: putting positive pressure on a row full of hot air only pushes that hot air toward places of lower pressure. Also, it won't kill you to pay less attention to cooling the "ambient" spaces of the datacenter. By this I mean take out perforated tiles that sit in common areas and/or large open spaces not occupied by running equipment. That lets you force more air through the racks that need it, which lowers the servers' exhaust temperature. (A rough tile-budget sketch follows after point 2.)

2) Spacer blanks... Should you use them? Absolutely! I am torn on whether to put a space between all of the servers (with a blanking panel in each gap) or to stack the servers together and blank off the remaining open space above and below them. Here's my internal debate: if you stack the servers on top of each other with no spacing, you have the possibility of thermal transfer between adjacent chassis. If you stack them with a 1U gap and a 1U blanking panel, air will still accumulate (but not recirculate) in those voids. So, the difference between the two: thermal transfer can certainly be a concern, but my bigger concern is the rebreathing of exhaust air that is common across all datacenters.
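To make the "use only the amount that is needed" point from item 1 concrete, here is a minimal tile-budget sketch. All numbers (CRAC count, per-unit airflow, per-tile airflow) are made up for illustration; the idea is simply that the CRACs can only supply so much air, and every tile beyond that budget just bleeds off underfloor pressure.

# Perforated-tile budget sketch (illustrative numbers only).
# The CRACs can only move so much air: once the open tiles can pass more
# air than the units supply, underfloor pressure (and per-tile flow) drops.

def tile_budget(total_supply_cfm, cfm_per_tile=500.0):
    # Maximum number of perforated tiles the supply can usefully feed.
    return int(total_supply_cfm // cfm_per_tile)

if __name__ == "__main__":
    cracs = 4                # hypothetical: four CRAC units on the floor
    cfm_per_crac = 12000.0   # hypothetical nameplate airflow per unit
    supply = cracs * cfm_per_crac
    print(f"Total supply: {supply:.0f} CFM")
    print(f"Keep at most ~{tile_budget(supply)} perforated tiles on the floor; "
          f"extra tiles mostly just bleed off underfloor pressure")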


Now, a few additions of my own.

1) Cable management... One thing I will admit to is being anal retentive when it comes to cabling and the general maintenance of the back of server racks. How is your cabling? Is it managed? Are there large bundles of wire that may restrict some exhaust? Air can't go in unless air can come out first.

2) Cabinet doors... While I can admit that putting doors on cabinets sure is perty, I am a firm believer in not using them if you have airflow issues. A cabinet's airflow resistance increases significantly with the doors closed, especially at the back. Like cable bundles, a rear door may keep hot air from being fully expelled from the cabinet into the hot aisle, leaving it to recirculate instead.

3) Side panels... My belief is that cabinets come with side panels for a reason. While leaving them off may be convenient for running cables between racks (without using a tray of some sort), it opens up a lot of potential for the ever-so-famous rebreathing problem. Many newer cabinets have cutouts in the side panels for routing cables through them instead.

4) Properly manage your space plan. It's a balancing act, one that I quite enjoy playing at times. Take inventory of your space plan and move servers/racks around the datacenter until the load is balanced. One datacenter I worked in had a large customer move out; they were extremely high density (80 servers per rack). When I was in their old space setting it up for a new customer, I set my build sheet down on my ladder, and the second I let go it went flying; the airflow from the ceiling vents was that ridiculously strong. On the other side of the building we had warm spots. We cranked that jet of air back down and the hot spots went away.

5) (Holy crap, this is a lot of typing) CRAC maintenance... Check your filters! Clogged filters = crap airflow. If only Dyson made CRAC units, they wouldn't lose suction because of clogged filters. Have your CRAC units properly inspected and maintained, make sure the fans are functioning properly, and make sure the system is adequately charged with refrigerant, glycol, or whatever it uses... Oh, and for the love of god, clean up whatever is under the floor! Resistance is your worst nightmare.
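As a companion to the CRAC maintenance point, here is a minimal capacity sanity check. The unit count, tonnage, and IT load are assumptions for illustration, not figures from this thread; the point is to compare installed cooling against the IT load both with all units running and with one down for maintenance.

# CRAC capacity sanity check (illustrative numbers only).
# 1 ton of refrigeration ~= 3.517 kW of heat removal.

TON_KW = 3.517

def cooling_capacity_kw(units, tons_per_unit, units_down=0):
    # Usable cooling capacity with some units out for maintenance.
    return max(0, units - units_down) * tons_per_unit * TON_KW

if __name__ == "__main__":
    it_load_kw = 240.0    # hypothetical total IT load on the floor
    units, tons = 4, 20   # hypothetical: four 20-ton CRAC units
    for down in (0, 1):
        cap = cooling_capacity_kw(units, tons, down)
        status = "OK" if cap >= it_load_kw else "SHORT"
        print(f"{units - down} of {units} CRACs running: "
              f"{cap:.0f} kW capacity vs {it_load_kw:.0f} kW load -> {status}")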
#4 | 10-26-2008, 01:09 PM
Neoeclectic (Member, joined Oct 2008, 85 posts)

Quote:
Originally Posted by Keith
Wow Ken, that is quite the list!
2) Spacer blanks... Should you use them? Absolutely! I am torn on whether to put a space between all of the servers (with a blanking panel in each gap) or to stack the servers together and blank off the remaining open space above and below them. Here's my internal debate: if you stack the servers on top of each other with no spacing, you have the possibility of thermal transfer between adjacent chassis. If you stack them with a 1U gap and a 1U blanking panel, air will still accumulate (but not recirculate) in those voids. So, the difference between the two: thermal transfer can certainly be a concern, but my bigger concern is the rebreathing of exhaust air that is common across all datacenters.
That's the only thing I'm still debating. I'm looking for best-practices documentation that says pizza-boxing is better than spacing the servers 1U apart.

Thus far I've found two different sites that say a 1U space, with or without a blanking panel, is a no-no, and that the correct way to configure a rack is to stack all the servers on top of one another like pancakes. That's an issue for me because none of the three data centers I've worked in has ever done that; they have always used 1U spaces between servers.

The problem with the documentation I found is that it isn't a written standard like TIA-942. I personally believed that leaving a 1U space would help with heat more than anything else: it reduces the rack density and prevents thermal transfer between servers. I also use fully vented mesh doors, meaning there may as well not be doors on the cabinets at all, there's that much ventilation.

I switched to those when inside-cabinet temperatures were reaching 90 degrees. I swapped the mildly ventilated doors for full mesh doors, and the average temperature is now 75 degrees, which helps a ton by giving the heat somewhere to go.

So I'm just looking for justification to either stack or space the servers in a rack.
#5 | 10-26-2008, 01:56 PM
KenB (Administrator, Pittsburgh, PA, joined Jan 2006, 468 posts)

I'm not aware of thermal transfer between adjacent servers being a problem, but that doesn't mean it can't be. When there is no "right" answer, a good thing to do is list the pros and cons of each scheme and pick the one that works best for your site. For example, we often leave a gap between 1U servers to avoid high cable density, but we currently have ample space and cooling. So, although these are not issues for us, they may be for someone else.

Here is a wealth of information about servers and cooling that should help inform your decision:
APC - Product Information

Ken
#6 | 10-27-2008, 09:43 PM
Neoeclectic (Member, joined Oct 2008, 85 posts)

Just a follow-up here.

APC has white papers, and they basically say pizza-boxing is the way to go. I found a couple of other sources that say the same. Even if constraints mean you can only put five 1U servers into a rack, that's still the way to do it.

If you ask me, that's a complete waste, since they expect us to fill all the empty slots with blanking panels. I also find it contradicts some manufacturers who say to leave a 1U space because some 1U servers aren't "true 1U" devices.
#7 | 10-30-2008, 02:20 PM
dcrelocation (Member, joined Feb 2008, 38 posts)

There is usually no reason to leave a space between servers for airflow: all hardware made in the last few years is designed to draw cold air in at the front and expel hot air out the back.

You want to use spacer blanks (blanking panels) in the rack to keep the cold air in the cold aisle and the hot air in the hot aisle, which makes your cooling more efficient.
#8 | 10-30-2008, 04:14 PM
attagirl (Senior Member, joined Oct 2008, 117 posts)

Glad that I found this site. I was concerned with cooling on server racks and this information has provided me with what I need to make sure that I get enough air flow between the servers. Thank you very much.
#9 | 10-31-2008, 06:08 PM
Neoeclectic (Member, joined Oct 2008, 85 posts)

Quote:
Originally Posted by dcrelocation
There is usually no reason to leave a space between servers for airflow: all hardware made in the last few years is designed to draw cold air in at the front and expel hot air out the back.

You want to use spacer blanks (blanking panels) in the rack to keep the cold air in the cold aisle and the hot air in the hot aisle, which makes your cooling more efficient.
There are certain presumptions being made here as well. If we presumed there were no problems with cable management, or that the data center was basically TIA-942 compliant, then it would make complete sense.

But I'm running a legacy data center that doesn't have the luxury of being 942-compliant, which creates some interesting challenges. There is no hot/cold-aisle configuration here, and the warm exhaust already blasts into the intakes of the row behind, so we would gain very little anyway. Also, most of the racks run on two 120V circuits, meaning we can't put a whole lot into these racks for redundancy reasons (see the quick power-budget sketch below). So I'm not going to stack 12 IBM x336s and then fill in the rest of the rack with blanking panels; that's an unreasonable cost and effort, since I would need about 10,000 of them.
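For context on that two-circuit limit, here is a minimal power-budget sketch. The breaker size, derating, and per-server draw below are assumptions for illustration, not measured x336 numbers; with A/B redundant feeds, the rack has to be able to run entirely on one circuit at the usual 80% continuous-load derating.

# Redundant-circuit power budget sketch (illustrative numbers only).
# With A/B feeds, the rack must survive losing one circuit, so the usable
# budget is a single circuit at the usual 80% continuous-load derating.

def redundant_budget_watts(volts=120.0, breaker_amps=20.0, derating=0.8):
    # Watts available if one of the two feeds is lost.
    return volts * breaker_amps * derating

if __name__ == "__main__":
    per_server = 250.0   # hypothetical 1U server draw; not a measured x336 figure
    budget = redundant_budget_watts()
    print(f"Usable budget on one 120 V / 20 A feed: {budget:.0f} W")
    print(f"At ~{per_server:.0f} W per server, that's roughly "
          f"{int(budget // per_server)} servers per rack")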

We checked the operating temperatures of the servers from bottom to top, and there's only an 8-degree average difference, topping out at 90 degrees, with a less-than-5% average annual hardware failure rate, which is in line with industry averages. Not a terrible thing, I don't think. (A small sketch of that bottom-to-top check follows.)
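A minimal sketch of that bottom-to-top check, with made-up placeholder readings rather than the thread's actual data: given inlet temperatures by U position, report the bottom-to-top delta and the hottest reading.

# Bottom-to-top inlet temperature check (placeholder readings, not real data).

def rack_temp_summary(readings):
    # readings: list of (u_position, inlet_temp_f), any order.
    ordered = sorted(readings)          # sort by U position, bottom first
    temps = [t for _, t in ordered]
    return temps[-1] - temps[0], max(temps)

if __name__ == "__main__":
    sample = [(1, 68.0), (10, 72.0), (20, 74.0), (30, 75.0), (42, 76.0)]
    delta, hottest = rack_temp_summary(sample)
    print(f"Bottom-to-top delta: {delta:.1f} F, hottest inlet: {hottest:.1f} F")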