VOLTAGE REDUCTION PART 1: THE SHORT ANSWER
The following three guidelines apply to everything from heating and lighting to motive power:
1. If the equipment is regulated in any manner, don’t expect voltage reduction to save energy.
2. If it is unregulated and you don’t mind reduced output, voltage reduction will save energy.
3. If it is a thermal application used on an intermittent cycle, voltage reduction will have a perverse effect, increasing energy consumption.
VOLTAGE REDUCTION PART 2: A MORE DETAILED DISCUSSION
I will concede at the outset that if you have an electrical appliance with fixed resistance, then the power that it draws will vary with the square of the applied voltage, so that for example reducing the voltage by 5% will drop the power by nearly 10% (95% x 95% = 90.25%). But power is not the same as energy, and not every load is simply resistive.
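The square law for a fixed resistance is easy to check numerically. In the sketch below the 230 V supply and 2 kW rating are illustrative assumptions, not figures from the discussion above; the point is only that a 5% voltage cut drops power by close to 10%.

```python
# Power drawn by a fixed resistance varies with the square of the applied voltage.
V_nominal = 230.0                  # volts (illustrative nominal supply)
P_nominal = 2000.0                 # watts at nominal voltage (illustrative rating)
R = V_nominal**2 / P_nominal       # the fixed resistance implied by that rating

V_reduced = V_nominal * 0.95       # a 5% voltage reduction
P_reduced = V_reduced**2 / R       # power drawn at the reduced voltage

print(f"Power at reduced voltage: {P_reduced:.1f} W")            # 1805.0 W
print(f"Fraction of nominal power: {P_reduced / P_nominal:.4f}") # 0.9025
```

Note this tells you about instantaneous power only; as the rest of the discussion shows, whether it translates into an energy saving depends on how the load is regulated.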
Let’s take the distinction between power and energy first, and consider the case of an electric heater (a classic purely resistive device). If the heater is unregulated—running all the time regardless—then yes, reduced power consumption equates to reduced energy demand (fewer kilowatts multiplied by the same number of hours). But if the heater is thermostatically controlled, it will run for longer at lower power in order to deliver the heat output required to balance the heat demand. Result: no energy saving. More generally, whenever the output of the equipment in question is regulated, no savings should be expected because electrical energy input in such cases is dictated by the required energy output, be it heat from heaters or mechanical energy from motors. Conversely, if you do see a reduction in electrical energy input, it has to be because you have accepted reduced output.
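The heater argument comes down to simple arithmetic, sketched below with invented figures: a thermostatically controlled heater runs just long enough to meet the heat demand, so the electrical energy drawn is fixed by the demand, not by the power rating.

```python
heat_demand_kwh = 10.0   # heat the space needs over the period (illustrative)
fixed_hours = 5.0        # run time of an unregulated heater (illustrative)

def energy_used(power_kw, regulated):
    """Electrical energy drawn by a resistive heater at the given power."""
    if regulated:
        # The thermostat runs the heater exactly long enough to meet demand,
        # so energy in always equals heat_demand_kwh whatever the power.
        hours = heat_demand_kwh / power_kw
    else:
        # An unregulated heater simply runs for the fixed period.
        hours = fixed_hours
    return power_kw * hours

full, reduced = 2.0, 2.0 * 0.9025   # kW before and after a 5% voltage cut

# Regulated: identical energy either way (no saving).
print(energy_used(full, True), energy_used(reduced, True))
# Unregulated: energy falls, but only because delivered heat falls with it.
print(energy_used(full, False), energy_used(reduced, False))
```

The regulated case prints the same figure twice; the unregulated case shows a genuine reduction, bought at the price of reduced output.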
You will sometimes see claims that voltage reduction reduces the energy consumed in motors. Actually it works only with some motors: specifically, those whose output is unregulated. Toilet extract fans would be a good example: at lower voltage they slow down and draw less air through.

The same, however, is not true of a motor-driven system where a variable-speed drive (VSD) regulates mechanical power output. The VSD will increase the motor's supply frequency to keep its rotor turning at whatever speed is demanded by the driven equipment, delivering the required mechanical power. And to be fair, most voltage-reduction vendors recognise this. But VSDs are not the only way that motors can be regulated. Suppose you had a booster pump delivering water to the top of a tall building. The load on its motor is dictated by water demand, and as the motor does nothing more than convert electrical energy into mechanical energy (and a little heat), its electrical energy input must balance its mechanical energy output plus thermal losses. If it draws less power, it will deliver water at a lower rate and have to run longer to make up the shortfall. Refrigeration compressor? Air compressor? Same logic. Many motor-driven systems have regulated mechanical outputs (and hence defined electricity inputs), not just those with variable-speed drives.

So there is no saving in any of these cases, unless the losses in the motors themselves are reduced by running at lower voltage. But those losses are very small to begin with (limiting the impact of reducing them), and in any case they could well increase rather than decrease at lower voltage. This is because to deliver the same power at reduced voltage a motor must draw more current.
This will increase what are known as the 'copper' losses (resistive losses in the windings). Although that is counteracted by reduced 'iron' losses (eddy currents in the magnetic circuits), the truth is that the sum of all the losses is at a minimum at the motor's rated voltage, and operating away from nominal voltage, whether higher or lower, will marginally increase the electrical energy required to deliver the required mechanical energy output.
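The trade-off between the two kinds of loss can be sketched with a deliberately simplified model. All the figures below are invented, and the iron-loss coefficient is chosen so that the loss minimum falls exactly at rated voltage; the point is only the shape of the curve: copper losses scale with current squared (so rise when voltage falls at constant output), iron losses scale roughly with voltage squared, and the sum bottoms out near nominal.

```python
P_mech = 10_000.0   # mechanical power demanded by the load, watts (illustrative)
V_rated = 400.0     # motor's rated voltage, volts (illustrative)
K_CU = 0.5          # copper-loss coefficient (invented)
# Iron-loss coefficient picked so the minimum sits at rated voltage:
K_FE = K_CU * (P_mech / V_rated**2) ** 2

def total_losses(V):
    I = P_mech / V              # the same output power needs more current at lower V
    copper = K_CU * I**2        # I^2 R losses in the windings: rise as V falls
    iron = K_FE * V**2          # eddy-current losses: fall as V falls
    return copper + iron

for V in (360, 380, 400, 420, 440):
    print(V, round(total_losses(V), 1))
# Losses are lowest at 400 V and rise on either side of it.
```

A real motor's loss curve is flatter and messier than this, but the conclusion is the same: running below nominal voltage does not reliably cut losses, and can raise them.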
The other type of ‘regulated’ load is one which will tolerate a wide range of supply voltages. The notebook computer I am using works equally well at 100 or 240 volts because its power supply delivers the same output voltage regardless. It is therefore going to be relatively insensitive to variation in mains voltage, and the same is true of other kit like fluorescent lights with electronic control gear.
Which brings me to my final category: intermittent thermal loads, of which the domestic kettle is a fine (if trivial) example. When you put electrical energy into a kettle, part of it goes into raising the temperature of the water, and part is lost as heat from the kettle's surface. I'll ignore what happens after you reach boiling point because I am going to focus on the heating-up phase, during which the surface temperature of the kettle rises from, say, 10 to 100 degrees. Simplifying a lot, let's say the average temperature of the surface will be 55 degrees regardless of how long the process takes. That implies the same average heat flux from the surface, which in turn means the longer the heat-up cycle, the greater the aggregate heat loss and hence—oh dear—the greater the energy input. OK, kettles may not be a significant energy use, but if you have electric catering equipment, or a pottery kiln, or heat-treatment furnaces, or anything else whose warm-up time you would like to minimise, raising the voltage will be more energy-efficient than lowering it (hence my objection to the term "optimisation").
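The kettle argument also reduces to arithmetic, and the numbers in the sketch below are all invented: the useful energy to heat the water is fixed, while surface loss accrues at a constant average rate for as long as the heat-up lasts, so a weaker element draws more total energy, not less.

```python
useful_energy_wh = 200.0   # energy to take the water from 10 to 100 C (illustrative)
loss_rate_w = 50.0         # average surface heat loss at a ~55 C mean (illustrative)

def energy_in(element_power_w):
    """Total electrical energy drawn to reach boiling at a given element power."""
    net_power = element_power_w - loss_rate_w   # power actually heating the water
    hours = useful_energy_wh / net_power        # heat-up time stretches as power falls
    return element_power_w * hours              # total energy in, including losses

print(round(energy_in(2000), 1))   # element at full voltage
print(round(energy_in(1805), 1))   # after a 5% voltage cut (power down ~9.75%)
# The slower kettle draws MORE total energy: the loss term runs for longer.
```

Run it and the reduced-voltage case comes out higher, which is the perverse effect described above for intermittent thermal loads.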
For all these reasons it is essential to have a thorough load survey to determine what mix of responses you are likely to get (reduction, neutral or increase), always bearing in mind your future plans. Savings from yesterday’s voltage reduction could be slashed by tomorrow’s lighting project.