
Computer Power Usage

Posted: Wed Dec 05, 2012 5:46 am
by CWood
Hi guys, for one of my A Level qualifications (one below a degree, for everyone not in the UK), I need to write a report, part of which entails the power efficiency of different computers.

If possible, could people please take power readings of their computers, in as many as possible of the following configurations:
  • Running Windows, not logged in, idle
  • Running Windows, logged in, idle
  • Running Windows, logged in, running the Linpack stress test algorithm
  • Running Linux, not logged in, idle
  • Running Linux, logged in to X server, idle
  • Running Linux, not running X, running Linpack
  • Running Linux, running X, running Linpack
I would also appreciate it if people were to post hardware models, and OS versions/distros, as well. I need to investigate the efficiency of different models of computer, different types (incl. discrete logic chips, FPGAs, micros, etc.), and the effects of software/different operating systems.

I have done some preliminary research, but I'm at college at the moment, and don't have it with me, so I'll post it up here when I get home. I intend to make all research public domain, so if you don't want your details going public, let me know, and I will only include the numbers, and no actual details.

Re: Computer Power Usage

Posted: Wed Dec 05, 2012 8:41 am
by Brendan
Hi,
CWood wrote:Hi guys, for one of my A Level qualifications (one below a degree, for everyone not in the UK), I need to write a report, part of which entails the power efficiency of different computers.
For electricity, the normal formula is "power = voltage * current" (or if you prefer, "watts = volts * amps"). This formula is wrong.

More specifically, the formula is right, but only if voltage and current don't fluctuate. Mains power is AC - it fluctuates in a sine wave.

Imagine if the voltage rises up to 240 volts then back down to zero volts (then back up to 240, etc, in a cycle), and the equipment only draws a burst of current when the voltage is at 1 volt. In this case it might draw 10 amps at 1 volt and consume 10 watts, not 2400 watts.

For most pieces of equipment that use AC, the correct formula is "power = voltage * current * correction_factor". The correction factor is properly called the power factor. Computers typically use switch mode power supplies, which create all kinds of "power factor" problems.
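
To make the formula concrete, here's a minimal Python sketch of the difference between apparent power ("volts * amps") and real power once the power factor is included. The 0.78 power factor and the 246 V / 1.1 A readings are illustrative values, not measurements:

```python
# Sketch: real AC power vs. apparent power.
# P = V * I * PF, where PF is the power factor (1.0 for a purely resistive load).

def real_power(volts, amps, power_factor):
    """Real power in watts for an AC load: P = V * I * PF."""
    return volts * amps * power_factor

apparent = 246.0 * 1.1                 # volt-amps (VA): what a voltmeter + ammeter suggest
real = real_power(246.0, 1.1, 0.78)    # watts actually consumed
print(apparent, real)                  # ~270.6 VA vs ~211.07 W
```

Note how a naive "volts * amps" reading overstates the real consumption by roughly 28% at this power factor.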

Now; you want people to measure power consumption. Some of the people here might have voltmeters and ammeters, but (ignoring the risk of electrocution when playing with mains power) those meters aren't adequate to calculate power. You'd either need an actual power meter, or some way of measuring the power factor (and voltage and current) to calculate the power correctly. I seriously doubt anyone else here has the equipment needed to measure power consumption properly (I don't, even though I do have some relatively expensive test equipment from when I was an electrical contractor/electrician).

Note: Mains voltage is typically rated at some specific nominal voltage. You can't use this nominal voltage rating in accurate calculations. For example, in Australia it's "240 volts", but in practice I've seen it anywhere between 190 volts up to 250 volts. There are reasons for this too (voltage drop caused by resistance in cables supplying power to the premises, where the actual voltage "at the wall" can depend on things like how much power all your neighbours are consuming).

There are 3 alternatives. The first alternative is to measure DC power coming out of the power supply (rather than AC power going into the power supply). This won't be total power consumption of the computer and will therefore be wrong/misleading, and would mean cutting several wires to insert an ammeter if you don't have a (more expensive and less common) clamp meter. This also means that you'd need several ammeters and/or need to do each of your 7 tests many times (e.g. once per "non-ground" wire coming out of the power supply). Most people won't have the equipment and/or won't like the idea of cutting all their power supply's wires and/or won't want to spend so much time (7 tests * 20 wires * 10 minutes per wire = 1400 minutes = almost 23.33 hours per computer).

Another alternative is to measure "kilowatt hours". This seems easy because (for most places) there's a "kilowatt hour" meter used for determining your electricity bill (which does measure actual power consumed and not "volts * amps", even though it might not be the most accurate test instrument you can get), and therefore lots of people will have the equipment needed. However; in this case you'd need to turn off everything else (to avoid getting wrong results because you left a fridge/freezer running); and because these meters are designed to measure bulk usage ("kilowatt hours" rather than "watt seconds") to get acceptable precision you'd probably need to run the computer for at least 2 days. For 7 different tests that might add up to 14 days without being able to make a cup of coffee.
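
As a sketch of the arithmetic involved, assuming two made-up meter readings taken 2 days apart:

```python
# Sketch: turning two "kilowatt hour" meter readings into an average wattage.
# The readings and the 48-hour window below are example values, not real data.

def average_watts(kwh_start, kwh_end, hours):
    """Average power in watts over the interval between two kWh meter readings."""
    return (kwh_end - kwh_start) * 1000.0 / hours

# e.g. the meter went from 15234.6 kWh to 15244.2 kWh over 2 days:
print(average_watts(15234.6, 15244.2, 48.0))  # 200.0 W average
```

The coarse resolution of a billing meter is exactly why the run has to be so long: over a short interval the reading barely changes, so the subtraction is dominated by rounding.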

The last alternative I can think of is to measure heat produced. That's not very practical either (thermometers are easy enough to get hold of, but you'd have to account for ambient temperature, volume, heat escaping, etc).

Of course there are other problems. If different people use different methods to "measure" (estimate) power consumption, then you won't know if these measurements are comparable. If 2 people use the exact same method, you won't know if their meters are calibrated correctly. Some of these tests may affect the results (for example, placing a cheap/simple ammeter in a circuit increases the resistance of that circuit and can reduce the power consumed).

To make sure results are at least slightly comparable; you'd want to choose one of those impractical methods of power measurement and ask everyone to use that same impractical method.


Cheers,

Brendan

Re: Computer Power Usage

Posted: Wed Dec 05, 2012 9:26 am
by CWood
WOW! Brendan, once again I'm floored at just how intelligent you actually are! With your permission, can I include some of that information in the report (with proper references, of course)?

And in actual fact, when I said I needed the power ratings, I only meant ballpark figures. For example, using my cheap energy monitor I got yesterday for exactly this purpose,
  • Linux (no X server) - 65-70W
  • Linux + X server - 85-90W
Yet to do Linpack, or the Windows testing, because I've stolen the power cable for something else right now (never enough of those in my house... really should buy more). I know these numbers aren't 100% accurate, but they should be accurate enough for my purposes, relative to the other results, to compare against at least (bear in mind, this isn't degree level, so it doesn't need to be 100%, so long as I can see a general trend).

Thanks for all the information though, massively insightful for me!

Re: Computer Power Usage

Posted: Wed Dec 05, 2012 12:17 pm
by Brendan
Hi,
CWood wrote:WOW! Brendan, once again I'm floored at just how intelligent you actually are! With your permission, can I include some of that information in the report (with proper references, of course)?
You can include that information, with or without references. :)
CWood wrote:And in actual fact, when I said I needed the power ratings, I only meant ballpark figures. For example, using my cheap energy monitor I got yesterday for exactly this purpose,
  • Linux (no X server) - 65-70W
  • Linux + X server - 85-90W
Ah - I didn't realise you only needed ballpark figures. In that case, I pulled some info from this computer's UPS.

Under almost no load (Apache, DHCPD, FTPD, etc doing nothing), at a bare command prompt I get:

Code: Select all

output.current: 1.10
output.frequency: 50.0
output.frequency.nominal: 50
output.powerfactor: 0.78
output.voltage: 246.0
That would work out to: 1.1 * 246 * 0.78 = 211.068 watts

Under almost no load (Apache, DHCPD, FTPD, etc doing nothing), but running X and KDE and a few other things (web browser, text editor, etc) I get:

Code: Select all

output.current: 1.30
output.frequency: 50.0
output.frequency.nominal: 50
output.powerfactor: 0.78
output.voltage: 246.0
That would work out to: 1.3 * 246 * 0.78 = 249.444 watts
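
For anyone wanting to script this, here's a rough Python sketch that parses "output.*" lines like those above (the variable names follow the upsc-style readouts shown; the values are the ones from this post) and applies the same "volts * amps * power factor" formula:

```python
# Sketch: compute real power from UPS status variables.
# Assumes each line is "name: value", as in the upsc-style output above.

def power_from_ups(report):
    """Parse "name: value" lines and return volts * amps * power factor."""
    values = dict(line.split(": ") for line in report.strip().splitlines())
    return (float(values["output.current"])
            * float(values["output.voltage"])
            * float(values["output.powerfactor"]))

idle_cli = "output.current: 1.10\noutput.voltage: 246.0\noutput.powerfactor: 0.78"
idle_gui = "output.current: 1.30\noutput.voltage: 246.0\noutput.powerfactor: 0.78"
print(power_from_ups(idle_cli))  # ~211.07 W
print(power_from_ups(idle_gui))  # ~249.44 W
```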

There's also a 24-port ethernet switch and a pair of KVMs (and keyboard/mouse) connected to the same UPS. As a rough guess, I'd assume they'd use about 8 watts. This means the actual figures for the computer alone would be:
  • about 203 watts, idle at bare command prompt
  • about 241 watts, idle in GUI
My monitors were not included (no idea what sort of power they'd consume - maybe another 100 watts each).

This computer is a server/workstation type thing running Gentoo Linux. Hardware is:
  • a pair of Xeon E5520 CPUs (TDP is 80W each)
  • a pair of ATI video cards
  • a pair of WD hard drives (probably running)
  • a third "green" hard drive (probably asleep)
  • 12 GiB of ECC RAM
  • 2 network cards (one 100 Mbit and one gigabit), both probably not doing much
  • Miscellanea (power supply, chipset, some sound cards I don't use, etc)
If you assume that LINPACK uses as much CPU power as possible; then you could maybe guess that running LINPACK would add the CPU's TDP to the figures (or, a bit less than the CPU's TDP plus a bit more for power supply efficiency loss). That would give:
  • about 203 watts, idle at bare command prompt
  • about 241 watts, idle in GUI
  • about 363 watts, running at max. TDP at bare command prompt
  • about 401 watts, running at max. TDP in GUI
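
The same guesswork, written out as arithmetic (all figures are from this post; the 8 W overhead is my rough guess for the switch and KVMs, and the TDP addition is an upper-bound assumption, not a measurement):

```python
# Sketch: TDP-based power estimates for this machine.
# Assumes LINPACK pushes both CPUs to their full TDP, which is optimistic.

TDP_PER_CPU = 80      # Xeon E5520, watts
NUM_CPUS = 2
OVERHEAD = 8          # rough guess: 24-port switch + KVMs on the same UPS

idle_cli_ups = 211    # measured at the UPS, watts (rounded)
idle_gui_ups = 249

idle_cli = idle_cli_ups - OVERHEAD             # computer alone, command prompt
idle_gui = idle_gui_ups - OVERHEAD             # computer alone, GUI
load_cli = idle_cli + TDP_PER_CPU * NUM_CPUS   # full TDP, command prompt
load_gui = idle_gui + TDP_PER_CPU * NUM_CPUS   # full TDP, GUI
print(idle_cli, idle_gui, load_cli, load_gui)  # 203 241 363 401
```
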
The main difference between "idle at command prompt" and "idle in GUI" is that for the former one video card isn't being used and the other is only doing 80*25 text mode, and for the latter both video cards are generating high resolution video (1920*1200 and 1600*1200 resolutions). From this you might be able to conclude that switching to a high resolution video mode costs about 15 watts (per video card). Of course 3D games (or maybe GPGPU usage?) would push video card power usage up a lot more.

For absolute minimum power consumption, you'd want to subtract a little for the 2 hard drives that probably weren't sleeping and a few other things. I'd estimate that power consumption as low as 175 watts is possible for this computer.

To get absolute maximum power consumption; you'd want both GPUs under load, plus all hard drives and both network cards (in addition to the CPUs themselves). I could imagine power consumption going up to maybe 650 watts in this case (still not including the monitors, keyboard, mouse, etc).


Cheers,

Brendan