Hi,
CWood wrote:Hi guys, for one of my A Level qualifications (one below a degree, for everyone not in the UK), I need to write a report, part of which entails the power efficiency of different computers.
For electricity, the normal formula is "power = voltage * current" (or if you prefer, "watts = volts * amps"). This formula is wrong.
More specifically, the formula is right, but only if voltage and current don't fluctuate. Mains power is AC - it fluctuates as a sine wave.
Imagine the voltage rising up to 240 volts and then falling back to zero volts (then back up to 240, etc. in a cycle), and the computer taking a burst of current while the voltage happens to be at 1 volt. In this case you might take 10 amps of current at 1 volt and consume 10 watts; you would not consume 2400 watts.
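To make the arithmetic concrete, here's a tiny sketch (my own numbers, matching the example above, not anything you'd hand in): instantaneous power is p = v * i, so the same 10 amp burst means very different power depending on where in the cycle it happens.

Code:
#include <stdio.h>

static double instant_power(double volts, double amps)
{
    return volts * amps;   /* p = v * i, at a single instant */
}

int main(void)
{
    double amps = 10.0;    /* hypothetical current burst */

    /* The same burst of current at two different points in the AC cycle */
    printf("Burst while voltage is at its 240 V peak:   %g W\n", instant_power(240.0, amps));
    printf("Burst while voltage is passing through 1 V: %g W\n", instant_power(1.0, amps));
    return 0;
}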
For most pieces of equipment that use AC, the correct formula is "power = voltage * current * correction_factor". The correction factor is actually called the power factor. Computers typically use switch mode power supplies, which create all kinds of "power factor" problems.
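As a rough illustration (the readings and the 0.65 power factor here are invented - roughly what you might see from an older switch mode supply without power factor correction):

Code:
#include <stdio.h>

int main(void)
{
    double volts_rms = 230.0;     /* measured RMS mains voltage (hypothetical) */
    double amps_rms = 2.0;        /* measured RMS current (hypothetical) */
    double power_factor = 0.65;   /* assumed; this is the hard part to measure */

    double apparent = volts_rms * amps_rms;   /* naive "volts * amps", in volt-amps */
    double real = apparent * power_factor;    /* watts actually consumed */

    printf("Naive 'volts * amps': %.0f VA\n", apparent);   /* 460 VA */
    printf("Real power:           %.0f W\n", real);        /* 299 W */
    return 0;
}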
Now; you want people to measure power consumption. Some of the people here might have voltmeters and ammeters, and (ignoring the risk of electrocution when playing with mains power) these meters aren't adequate to calculate power. You'd either need an actual power meter, or some way of measuring the power factor (and voltage and current) to calculate the power correctly. I severely doubt anyone else here has the equipment needed to measure power consumption properly (I don't, even though I do have some relatively expensive test equipment from when I was an electrical contractor/electrician).
Note: Mains voltage is typically rated at some specific nominal voltage. You can't use this nominal voltage rating in accurate calculations. For example, in Australia it's "240 volts", but in practice I've seen it anywhere between 190 volts up to 250 volts. There are reasons for this too (voltage drop caused by resistance in cables supplying power to the premises, where the actual voltage "at the wall" can depend on things like how much power all your neighbours are consuming).
There are 3 alternatives. The first alternative is to measure DC power coming out of the power supply (rather than AC power going into the power supply). This won't be the total power consumption of the computer and will therefore be wrong/misleading; and it would mean cutting several wires to insert an ammeter if you don't have a (more expensive and less common) clamp meter. This also means that you'd need several ammeters and/or need to do each of your 7 tests many times (e.g. once per "non-ground" wire coming out of the power supply). Most people won't have the equipment, and/or won't like the idea of cutting all their power supply's wires, and/or won't want to spend so much time (7 tests * 20 wires * 10 minutes per wire = 1400 minutes, or almost 23.33 hours per computer).
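For what it's worth, here's a sketch of what the per-rail arithmetic would look like if you did measure the DC side (the currents are invented, and a real ATX supply has far more individual wires than this):

Code:
#include <stdio.h>

struct rail { const char *name; double volts; double amps; };

int main(void)
{
    /* Hypothetical readings; for DC there's no power factor, so P = V * I per rail */
    struct rail rails[] = {
        { "+12V",  12.0, 8.5 },
        { "+5V",    5.0, 3.0 },
        { "+3.3V",  3.3, 4.0 },
        { "+5VSB",  5.0, 0.2 },
        { "-12V",  12.0, 0.1 },   /* magnitude only; the sign doesn't change the watts */
    };
    double total = 0.0;

    for (int i = 0; i < 5; i++) {
        double watts = rails[i].volts * rails[i].amps;
        printf("%-6s %6.2f W\n", rails[i].name, watts);
        total += watts;
    }
    printf("Total  %6.2f W (excludes losses inside the power supply itself)\n", total);
    return 0;
}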
Another alternative is to measure "kilowatt hours". This seems easy because (for most places) there's a "kilowatt hour" meter used for determining your electricity bill (which does measure actual power consumed and not "volts * amps", even though it might not be the most accurate test instrument you can get), and therefore lots of people will have the equipment needed. However, in this case you'd need to turn off everything else (to avoid getting wrong results because you left a fridge/freezer running); and because these meters are designed to measure bulk usage ("kilowatt hours" rather than "watt seconds"), to get acceptable precision you'd probably need to run the computer for at least 2 days. For 7 different tests that might add up to 14 days without being able to make a cup of coffee.
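The conversion itself is trivial; the problem is the meter's precision and the time involved. A sketch with invented readings:

Code:
#include <stdio.h>

int main(void)
{
    double kwh_start = 12345.6;   /* hypothetical meter reading at the start */
    double kwh_end = 12352.8;     /* hypothetical reading 48 hours later */
    double hours = 48.0;

    double kwh_used = kwh_end - kwh_start;          /* 7.2 kWh */
    double avg_watts = kwh_used * 1000.0 / hours;   /* 150 W average */

    printf("Energy used:   %.1f kWh\n", kwh_used);
    printf("Average power: %.0f W\n", avg_watts);
    return 0;
}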
The last alternative I can think of is to measure heat produced. That's not very practical either (thermometers are easy enough to get hold of, but you'd have to account for ambient temperature, volume, heat escaping, etc).
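If anyone's curious, the calculation would be crude calorimetry, something like this (all numbers invented, and it completely ignores heat escaping from the room, which is exactly why it's impractical):

Code:
#include <stdio.h>

int main(void)
{
    double air_mass_kg = 36.0;      /* roughly 30 cubic metres of air at about 1.2 kg per cubic metre */
    double specific_heat = 1005.0;  /* J/(kg*K), approximate value for air */
    double temp_rise_c = 2.0;       /* measured rise above ambient (hypothetical) */
    double seconds = 3600.0;        /* over one hour */

    double joules = air_mass_kg * specific_heat * temp_rise_c;
    double watts = joules / seconds;   /* ignores heat lost through walls, windows, etc. */

    printf("Estimated power: %.0f W\n", watts);
    return 0;
}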
Of course then there are other problems. If different people use different methods to "measure" (estimate) power consumption, then you won't know if these measurements are comparable. If 2 people use the exact same method, you won't know if their meters are calibrated correctly. Some of these tests may affect the results (for example, placing a cheap/simple ammeter in a circuit increases the resistance of that circuit and can reduce the power consumed).
To make sure results are at least somewhat comparable, you'd want to choose one of those impractical methods of power measurement and ask everyone to use that same impractical method.
Cheers,
Brendan