One thing I have always seen on Neowin is that most of the young people pour so much OVERKILL into anything they do with computers. They don't realize they don't need most (if any) of their "projected" extra measures.
I think part of the issue stems from things like processor "TDP" ratings. The box will say 150 watts, and people who don't know how things REALLY work think "oh, I must need 150 watts for my processor, plus 150-300 for a mid-level graphics card (going by the box specs), plus 10 for each drive..." etc. Pretty soon, for a processor + mobo + RAM + 2 hard drives and 1 video card, they're convinced they need a 600-1000 watt PSU.
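Here's the kind of back-of-the-napkin math I'm talking about (the component wattages are just illustrative box numbers, not measured draws):

```python
# The naive "add up every box spec" estimate people tend to make.
# These are label/box wattages, NOT actual simultaneous draw --
# real components rarely all peak at once.
box_watts = {
    "cpu": 150,          # TDP printed on the box
    "gpu": 300,          # high end of the "box spec" range for a mid-level card
    "motherboard": 50,   # rough guess
    "ram": 10,
    "hdd_1": 10,
    "hdd_2": 10,
}

naive_total = sum(box_watts.values())
print(naive_total)  # 530 -- and from there people round up to 600-1000 W
```

Actual wall draw for a system like that is usually far below the sum of the labels, which is the whole point.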
In this case, more is not better, as PSUs have somewhat of a bell-shaped efficiency curve: load it outside roughly the 40-60% range and it's less than optimally efficient. By how much is rather slim, usually only a few percent. While most desktops don't sit running 24x7, servers do, and even 2-3% of 100 watts is money you're wasting. Say you save 2% on a server requiring 150 watts. That's 3 watts. 3 W x 24 hours = 72 watt-hours a day / 1000 = 0.072 kWh. In my area, 1 kWh = $0.12, so that's $0.00864 a day. Not even a penny, right? However, this is a server, so it's on 24x7x365. Multiply $0.00864 x 365 and you get $3.15.
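The same math as a quick script, if you want to plug in your own numbers (the $0.12/kWh is my local rate; yours will differ):

```python
# Yearly cost of a small efficiency loss on an always-on server.
# All inputs are the example figures from the post above.
server_draw_w = 150        # watts the server actually needs
efficiency_gain = 0.02     # ~2% better efficiency in the PSU's sweet spot
rate_per_kwh = 0.12        # $ per kWh (my local rate -- substitute your own)

saved_w = server_draw_w * efficiency_gain        # 3 W
saved_kwh_per_day = saved_w * 24 / 1000          # 0.072 kWh/day
cost_per_day = saved_kwh_per_day * rate_per_kwh  # ~$0.00864/day
cost_per_year = cost_per_day * 365

print(round(cost_per_year, 2))  # 3.15
```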
So you save $3 a year. Not much, but why waste money if you don't have to? These numbers were generous just to make the point that for servers you should get exactly what you need, nothing more or less. Invest that in a mutual fund or something for 5 years and you'd multiply your $3 by 650% to about $19.50.
Ehh... enough math geekiness. I think the point is made.