A small exploration of a circuit which went nowhere…
I’ve been looking for a low-cost way to detect whether an electrical appliance is drawing current. There would be several uses for that, from power-saving master/slave power strips, to helping to figure out what’s going on in the house by measuring the central meter, as I’m doing already.
Here’s my original (naïve) line of thought:
- optocouplers are a great way to stay out of AC mains electrical trouble
- many units will still work with currents down to well under 1 mA
- all I want is detection – is a device ON or OFF?
- the rest can be done with JeeNodes, wirelessly, as usual
Here’s the (pretty silly) first step, in series between power outlet and appliance:
A resistor to limit the current, and the opto-coupler to “light up” and produce a signal in a galvanically isolated manner. But don’t get your hopes up, this setup definitely won’t work!
- First of all, the appliance will stop working if the resistor is in the 100 kΩ range.
- With resistance values much lower, the resistor will become a heater and self-destruct.
- Besides, even a 100 W load will draw 0.5 A or so… way too much for the LED.
- There’s also a small problem with the voltage being AC, not DC, but that’s easily solved with a diode.
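The failure modes above are easy to check with a bit of arithmetic. Here’s a quick sketch, assuming 220 V mains and a purely resistive 100 W load (illustrative values only):

```python
# Rough numbers for a resistor placed in series with an appliance.
# Assumes 220 V mains and a purely resistive 100 W load.

MAINS_V = 220.0

def series_current(r_ohm, load_w=100.0):
    """Current with the resistor in series with the appliance."""
    load_r = MAINS_V ** 2 / load_w        # equivalent load resistance
    return MAINS_V / (r_ohm + load_r)

def resistor_heat(r_ohm, load_w=100.0):
    """Heat dissipated in the series resistor itself, P = I^2 * R."""
    i = series_current(r_ohm, load_w)
    return i * i * r_ohm
```

With 100 kΩ in series, the current drops to a couple of milliamps, so the appliance is effectively off. With 100 Ω, the appliance mostly works, but the resistor has to burn off well over 10 W – hence the self-destructing heater.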
Sooo… could this approach be made to work?
In this case, an indirect demonstration that it can’t possibly work is actually easier to argue: suppose there were a circuit with the opto-coupler’s LED in series between outlet and load. It only needs to cause a voltage drop of about 1.5V to get the IR LED inside to light up. Now suppose the load draws 1000 W, i.e. nearly 5 A. Big problem: due to P = E x I, the power consumed by the opto-coupler would be 7.5 W … an excessive amount of heat for such a tiny device (apart from the sheer waste!).
Even some form of magical “power reduction” would be tricky, since I do need to detect the power-OFF case, i.e. power levels down to perhaps 0.5 W of standby power consumption for modern appliances (some even less). For 220V, 0.5 W corresponds to an average current of only 2.2 mA.
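To put numbers on both ends of the problem – heat at full load, current at standby – here’s a tiny sketch (220 V mains and a 1.5 V LED forward drop assumed):

```python
# Both extremes of the detection problem, 220 V mains assumed.

MAINS_V = 220.0
LED_DROP_V = 1.5     # assumed forward drop of the opto-coupler's IR LED

def avg_current(load_w):
    """Average current drawn by a load, I = P / E."""
    return load_w / MAINS_V

def series_led_heat(load_w):
    """Heat in an LED carrying the full load current, P = E x I."""
    return LED_DROP_V * avg_current(load_w)
```

A 1000 W load gives about 6.8 W of LED heat (7.5 W if you round the current up to 5 A, as above), while 0.5 W of standby draw corresponds to only about 2.3 mA of average current.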
Maybe there’s a clever way to do this, but I haven’t found it. Something with a MOSFET, mostly conducting, but opening up just a little perhaps 1000x per second, to very briefly allow the LED in an opto-coupler to detect the current (with an inductor to throttle it). Heck, an ATtiny could be programmed to drive that MOSFET, but now it’s all becoming an active circuit, which needs power, several components, etc.
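For what it’s worth, the chopping idea does bring the dissipation down on paper – e.g. sampling for 10 µs out of every millisecond. All values here are made up for illustration:

```python
# Average LED dissipation if a MOSFET only diverts current through the
# opto-coupler's LED during brief sampling windows. Illustrative only.

LED_DROP_V = 1.5   # assumed forward drop of the opto-coupler's IR LED

def chopped_led_heat(load_current_a, on_s=10e-6, period_s=1e-3):
    """Time-averaged LED dissipation at a given sampling duty cycle."""
    duty = on_s / period_s                 # 1% duty cycle by default
    return LED_DROP_V * load_current_a * duty
```

At 5 A that’s only 75 mW averaged over time – but as noted, the extra active circuitry defeats the whole point of a simple passive sensor.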
The trouble with AC mains is those extremes – on the one hand you need decent sensitivity, on the other you need to deal with peak loads of perhaps 2000 W, with their hefty currents, and where heat can become a major issue. We’re talking about power levels ranging over more than three orders of magnitude!
Of course there is the current transformer. But those are a bit too expensive to use all over the place.
Another option would be to use a regular transformer in reverse, as in this article. With a little 220 => 6.3V transformer, the ratio would be 35:1. With 5A max, a 0.1 Ω resistor would dissipate 2.5 W of heat and create a 0.5V difference, stepped up to 17.5 V. For a 2.2 mA “off” current (0.5 W standby), that voltage would drop to 7.7 mV, which is probably too low to measure reliably with the ATmega ADC. Still… might be worth a try.
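The same figures in a sketch, assuming an ideal 220 => 6.3 V transformer and a 0.1 Ω shunt:

```python
# Shunt resistor + small transformer used in reverse as a step-up.
# Assumes an ideal 220 => 6.3 V transformer and a 0.1 ohm shunt.

TURNS_RATIO = 220.0 / 6.3   # ~35 : 1
SHUNT_R = 0.1               # ohms, in series with the appliance

def shunt_readings(load_current_a):
    """Return (shunt voltage, stepped-up output voltage, shunt heat)."""
    v_shunt = load_current_a * SHUNT_R
    return v_shunt, v_shunt * TURNS_RATIO, load_current_a ** 2 * SHUNT_R
```

At 5 A: 0.5 V across the shunt, about 17.5 V out, and 2.5 W of heat. At the 2.2 mA standby level: only about 7.7 mV out, right at the edge of what the ADC can make sense of.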
An interesting low-cost sensor is based on the Hall effect to detect magnetic fields: wherever there is alternating current, there is a magnetic field.
With Hall-effect sensors, the problem is again at the low end. Keep in mind that the goal is reliable detection of current, not measurement. Now, you could increase the sensitivity by increasing the intensity of the magnetic field – i.e. by creating a coil around the sensor, for example. Trouble is: since those wires also need to handle large power consumption levels, they are by necessity fairly thick. Thick wires make lousy coils, and are impossible to get close to the sensor.
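A rough feel for why the low end is so hard, using the textbook formula for the field around a long straight wire (the 5 mm distance and turn counts are made-up example values):

```python
# Magnetic field near a current-carrying wire, B = mu0 * I / (2 * pi * r),
# crudely multiplied by the turn count for a loose coil. Illustrative only.

from math import pi

MU0 = 4 * pi * 1e-7   # magnetic constant, in T*m/A

def field_tesla(current_a, distance_m=0.005, turns=1):
    """Approximate field (tesla) a sensor would see near the wire."""
    return turns * MU0 * current_a / (2 * pi * distance_m)
```

At 2.2 mA and 5 mm, even 10 turns produce well under a microtesla – orders of magnitude below what low-cost Hall sensors can resolve – whereas 5 A is comfortably detectable.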
There are other ways. Such as measuring light levels, in case of a lamp. Or heat in case of a TV (would be a bit sluggish). Or vibration in case of a fridge.
Maybe a hand-made 100-turn current transformer would work? That’s a matter of winding some thin enameled wire around a single mains-carrying insulated copper wire.
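Idealized numbers for such a 100-turn current transformer (the 100 Ω burden resistor is an arbitrary example value, not a tested choice):

```python
# Ideal current transformer: the secondary carries I / N, and a burden
# resistor turns that current into a measurable voltage.

def ct_output(load_current_a, turns=100, burden_r=100.0):
    """Return (secondary current, voltage across the burden resistor)."""
    i_sec = load_current_a / turns
    return i_sec, i_sec * burden_r
```

A 5 A load would yield 50 mA and a healthy 5 V across the burden, but the 2.2 mA standby case yields only 22 µA, i.e. 2.2 mV – so the low end remains the hard part here too.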
Hm… might give that a try.
The challenge is to find something reasonably universal, which does not require a direct galvanic connection to AC mains. It seems so silly: an ATmega can detect signals at nanowatt power levels, yet somehow there is no practical path from AC power ON/OFF detection to that ATmega?
PS. A related (and more difficult) problem is “stealing” some ambient magnetic energy to power a JeeNode. You don’t need much while it’s asleep, but transmitting even just one tiny packet is another story.