Data Center, Colocation, Cloud Computing, Storage, Dedicated Servers Forums (http://www.datacentertalk.com/forum/index.php)
-   Data Center Design, Development, Building Systems and Operations (http://www.datacentertalk.com/forum/forumdisplay.php?f=37)
-   -   Real power consumption of a server ? (http://www.datacentertalk.com/forum/showthread.php?t=18875)

darth vader 04-17-2009 07:57 AM

Real power consumption of a server ?
 
When I look at the datasheets of servers, they often express the power consumption in watts (e.g. 300 W).

The datasheet also gives an indication of the current drawn (e.g. 4 A).

4 A x 230 V gives me 920 W, which is much higher than the 300 W. :confused:


I suppose the 4 A is the maximum current that 'can' be drawn (during startup?) and that I should use the 300 W figure for sizing a UPS?

cernst 04-21-2009 05:15 PM

What piece of equipment are you looking at? The max current may be rated for a 120 V system, in which case your draw would be 480 W, much closer to 300 W.

There is also something called "power factor", which relates real power to apparent power. So if you have a power meter and it shows 2 amps at 120 V (240 VA) with a power factor of .9, then your true power consumption is really 240 x 0.9 = 216 watts...someone please correct me if I'm wrong. I believe my truth factor may be at .8 right about now! Most higher-end servers have a very high power factor; the highest I've seen in my data center was .98 on a rack of HP servers.

Post up your equipment and we may be able to help a bit more.
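
To illustrate the volt-amps versus watts relationship described above, here is a minimal Python sketch; the 120 V / 2 A / 0.9 figures are just example readings, not measurements from any particular server:

    # Apparent power (VA) is volts times amps; real power (W) is apparent
    # power scaled by the power factor.
    volts = 120.0          # measured line voltage (example value)
    amps = 2.0             # measured current draw (example value)
    power_factor = 0.9     # ratio of real power to apparent power

    apparent_power_va = volts * amps                  # 240 VA
    real_power_w = apparent_power_va * power_factor   # 216 W

    print(f"Apparent power: {apparent_power_va:.0f} VA")
    print(f"Real power:     {real_power_w:.0f} W")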

darth vader 04-22-2009 05:45 AM

I'm considering 230 V (Europe).

You have a point: watts are not simply volts x amps.

But if the datasheet gives me a consumption of 300 W, I would assume 300 / 0.8 = 375 VA, which is not a big difference.


This is a real example of a device we are going to install:
The datasheet gives me this info:

"Input voltage: 100 to 240VAC,
47 to 63 HZ, 3A
Output Power: 250W."


3 A x 240 VAC = 720 VA, which is much higher than the 250 W. :confused:

whitey 04-22-2009 12:15 PM

Think of power in watts; the amps drawn are a variable that depends on the voltage. If you want to know what the draw in amps will be, just use this formula: Watts / Volts = Amps.
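
Applied to the 250 W device quoted earlier, a rough Python sketch of that formula (it ignores power factor and power-supply efficiency, so the result is only a ballpark figure):

    # Watts / Volts = Amps: rough current draw for a 250 W load at 230 V.
    watts = 250.0
    volts = 230.0
    amps = watts / volts
    print(f"{watts:.0f} W at {volts:.0f} V is roughly {amps:.2f} A")  # ~1.09 A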

cernst 04-22-2009 06:01 PM

Quote:

Originally Posted by darth vader (Post 27001)
I'm considering 230 V (Europe).

You have a point: watts are not simply volts x amps.

But if the datasheet gives me a consumption of 300 W, I would assume 300 / 0.8 = 375 VA, which is not a big difference.


This is a real example of a device we are going to install:
The datasheet gives me this info:

"Input voltage: 100 to 240VAC,
47 to 63 HZ, 3A
Output Power: 250W."


3 A x 240 VAC = 720 VA, which is much higher than the 250 W. :confused:

Okay, so it'll pull 3 amps when running at 100 volts...about 300 VA. In the US, most commercial 200+ volt service is set up in a wye configuration, which generally comes out around 208 V. At 208 V, you'll be pulling about 1.4 amps.
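
A short Python sketch of that reasoning, using the figures from the datasheet quoted above; the assumption is that the 3 A nameplate rating applies at the bottom of the 100 to 240 V input range:

    # The nameplate current is specified at the lowest input voltage (100 V),
    # so it is the worst case across the 100-240 VAC range.
    rated_amps = 3.0
    min_input_volts = 100.0
    worst_case_va = rated_amps * min_input_volts   # about 300 VA

    # The same apparent power needs less current at a higher supply voltage.
    for volts in (208.0, 230.0, 240.0):
        amps = worst_case_va / volts
        print(f"At {volts:.0f} V: roughly {amps:.1f} A")   # ~1.4 A at 208 V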

If you are looking to size a UPS to a load, look at your watts. Most UPS manufacturers will post the load rating in volt-amps and/or watts to make it easier on the user. For example, in APC's 750 VA lineup we have the SUA750 and the SUA750RM2U. Both are 750 VA UPS units, and both use the exact same batteries (just in a different physical configuration). But looking at the tech specs, the SUA750 allows for 20 extra watts over the SUA750RM2U.
APC Smart-UPS 750VA USB & Serial 120V
APC Smart-UPS 750VA USB & Serial RM 2U 120V
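
For the UPS-sizing part, the check is simply that the load stays under both the VA rating and the watt rating of the unit. A minimal Python sketch, using placeholder ratings rather than numbers from any actual APC spec sheet:

    # Sanity check a load against a UPS: it has to fit under BOTH the VA
    # and the watt rating. The 750 VA / 500 W numbers are placeholders for
    # illustration, not figures from a particular spec sheet.
    ups_rating_va = 750.0
    ups_rating_w = 500.0

    load_w = 250.0          # real power of the device (from its datasheet)
    power_factor = 0.9      # assumed; high-end server PSUs are often higher
    load_va = load_w / power_factor

    fits = load_w <= ups_rating_w and load_va <= ups_rating_va
    print(f"Load: {load_w:.0f} W / {load_va:.0f} VA -> fits: {fits}")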

