
# Powertrust voltage.


Hello all,

I have a question about the new PowerTrust UPS. We have a problem with the power being delivered to us: it should be 230-240 V, but it sometimes drops below 200 V. I jumpered the PowerTrust to 230 V and it works fine, even when the input voltage drops below 200 V.

Because of the importance of the machines behind the UPS, I am not happy with this situation, and I am currently in a heavy discussion with the company that delivers the power. They say it is no problem at all, because the UPS is working. But I look a bit further than they do: this equipment will get older, and then what? So I would like to know the minimum input voltage for the PowerTrust jumpered at 230 V, with a load of approximately 65 to 70%. I have RTFM'd everything but cannot find an answer to this particular question.

Sincerely
AJ Hettema
Believe me! The secret of reaping the greatest fruitfulness and the greatest enjoyment of life is to live dangerously.

## Re: Powertrust voltage.

AJ,

You did not specify which PowerTrust model you are using. However, I believe that this information will be helpful to you. I am going to reference a PowerTrust II LR, P/N A1356A (3000VA/2100W).

Since you are on a 230 V service, the nominal input voltage is 230 VAC. On pg. 25 of the manual, the normal input voltage range is specified as 180 to 264 VAC. So, as long as you stay above 180 VAC input, the UPS shouldn't trip on low voltage.

What I would be concerned about is the amount of energy required to run the server. It is possible to trip the UPS on power limitations, particularly for highly reactive (low power-factor) loads, even though you are operating within voltage specs. I'll try to keep the electronics lesson to a minimum.

Consider these formulas: Watts(in) = Watts(out) (Pi = Po), and Watts = Volts * Amps (P = E*I). There are some other variables here, like power factor (needed to get the VA value), but we'll ignore them for now for simplicity.

Since you are running at 70% load, you are using: 0.7 * 2100W = 1470W. Here comes the kicker: since the output runs at 230V while the input has sagged to 200V, the input current is higher than the output current. This is because the power entering the UPS must equal (or, with losses, exceed) the UPS output power. Plugging in the numbers for the output current: 1470W / 230V = 6.4A. The input current we are concerned about is 1470W / 200V, which gives us 7.4A.
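As a quick sanity check, the power-balance arithmetic above can be sketched in a few lines of Python (the 2100 W rating and 70% load figure come from this thread; everything else is plain algebra, not vendor code):

```python
# Sketch of the power-balance arithmetic above (illustration only).
# Figures from the thread: 2100 W rated output, ~70% load.
RATED_WATTS = 2100
LOAD_FRACTION = 0.70

load_watts = LOAD_FRACTION * RATED_WATTS   # 1470 W drawn by the servers

# Output side runs at the jumpered 230 V; input side has sagged to 200 V.
output_current = load_watts / 230          # ~6.4 A out
input_current = load_watts / 200           # ~7.4 A in (conversion losses ignored)

print(f"output: {output_current:.2f} A, input: {input_current:.2f} A")
```

Dividing the same wattage by the lower input voltage is what makes the input current the larger of the two.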

The efficiency is 87% or better, so we'll add that in: 7.4A/.87=8.5A.

Not much of a problem here, as the circuits are probably L6-20 (230V, 20A); at a proper 230V input you would only be drawing 6.4A / 0.87 = 7.4A.

Since the server is not an entirely resistive load, there is some reactive component that increases the VA (literally Volts * Amps) value; several amps can be added, resulting in a VA figure significantly higher than the watts used. For a fully loaded UPS (perhaps during startup?), 3000VA / 200V = 15A. At the lower limit of 180 VAC, 3000VA requires 16.7A! That's pretty close to the 20A circuit capacity, and to the limit of the UPS. And don't forget, we haven't considered battery-charging current, etc.
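The worst-case numbers work the same way. A minimal sketch, assuming the 3000 VA rating quoted above and treating the full VA figure as the draw at each line voltage:

```python
# Sketch of full-load VA current draw at various line voltages (illustration only).
RATED_VA = 3000   # PowerTrust II LR rating quoted above

# Apparent-power current at nominal, sagged, and minimum-spec line voltage.
currents = {volts: RATED_VA / volts for volts in (230, 200, 180)}

for volts, amps in currents.items():
    print(f"{volts} V -> {amps:.1f} A")
# At 180 V the draw approaches 16.7 A, uncomfortably close to a 20 A circuit.
```

The takeaway is the same as in the text: the lower the line voltage, the closer a fully loaded UPS creeps toward the branch-circuit limit.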

Bottom line: tell the engineer to change the tap on his transformer and raise the input voltage to 230V. A UPS is better used to protect against power problems, not to compensate for chronically insufficient voltage.

## Re: Powertrust voltage.

I stumbled across this and realize that it is an old posting, but just in case this problem still exists:

You first need to ask: where is this voltage drop occurring? If the utility is sagging this low, they should be brought in to evaluate the situation further and provide a solid explanation of why. I have yet to run across a facility manager who was willing to tolerate a 15-20% voltage drop on a daily basis.

Just so you know, most utility companies that have emergency power generation usually set the dropout at 85% of nominal for general equipment and 85-90% for UPS equipment.