To find out the real deal, I decided to determine the current using the Sanwa CD771 multimeter. I placed a 10-ohm, 5%, 1-Watt resistor in series with the bulb (the lamp socket, that is, in my test rig) via a terminal block, into which I screwed the resistor leads and the circuit wires to make a good, safe connection. The ohmmeter measured the actual resistance to be 9.5 ohms.
I would've wanted a smaller resistor value so that it would be insignificant compared to the lamps' filament resistance. Unfortunately I don't have any resistor smaller than 10 ohms, nor a second 10-ohm resistor I could connect in parallel. So I had to make do with this one resistor, even though according to my calculations it would burn out if I used a 100-Watt lamp in the circuit: P = (100 W / 220 V)² × 10 Ω ≈ 2.1 W, well over its 1-Watt rating.
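Just to show the arithmetic, here's a minimal Python sketch of that worst-case check (the 220 V mains and the lamp wattages are the nominal values used above; the function name is my own, purely for illustration):

```python
# Worst-case dissipation in the series (shunt) resistor for a given lamp.
# Assumes the lamp draws its rated power at the nominal 220 V mains,
# so I = P_lamp / V_mains and P_resistor = I^2 * R.
def resistor_dissipation(lamp_watts, mains_volts=220.0, shunt_ohms=10.0):
    current = lamp_watts / mains_volts   # approximate lamp current (A)
    return current ** 2 * shunt_ohms     # power dissipated in the shunt (W)

for watts in (60, 100):
    print(f"{watts} W lamp -> {resistor_dissipation(watts):.1f} W in the resistor")
# 60 W lamp -> 0.7 W in the resistor
# 100 W lamp -> 2.1 W in the resistor
```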
What's the resistor for, anyway? I want to find out the actual current flowing through the circuit. To do that I measure the voltage drop across the resistor. From that it's trivial to compute the current through the resistor and, because this is a series circuit, that is the same current flowing through the lamp as well.
The results are in the table below. Measured voltage of the mains at the time of testing was 232 Volts. This is the value used for calculations in the last column.
| Lamp Type | Clamp meter reading without resistor (A) | Voltage drop across resistor (V) | Computed current (A) | Computed total power dissipation (W) |
|---|---|---|---|---|
| Philips Softone 60 Watts | 0.0 | 2.59 | 0.27 | 62.6 |
| Philips Spotline R80 60 Watts | 0.0 | 2.57 | 0.27 | 62.6 |
| Philips Superlux 100 Watts | 0.1 | 4.31 | 0.45 | 104.4 |
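As a sanity check, here's a small Python sketch that reproduces the last two columns from the measured voltage drops, using the 9.5-ohm measured shunt resistance and the 232 V mains reading above (the current is rounded to two decimal places before computing power, which is how the table values work out):

```python
# Compute lamp current and total power from the voltage drop across the shunt.
# Series circuit: the current through the resistor equals the current through the lamp.
SHUNT_OHMS = 9.5      # measured resistance of the nominal 10-ohm resistor
MAINS_VOLTS = 232.0   # measured mains voltage at the time of testing

measurements = {
    "Philips Softone 60 W": 2.59,
    "Philips Spotline R80 60 W": 2.57,
    "Philips Superlux 100 W": 4.31,
}

for lamp, v_drop in measurements.items():
    current = round(v_drop / SHUNT_OHMS, 2)   # I = V / R, rounded as in the table
    power = MAINS_VOLTS * current             # total power drawn from the mains
    print(f"{lamp}: {current:.2f} A, {power:.1f} W")
# Philips Softone 60 W: 0.27 A, 62.6 W
# Philips Spotline R80 60 W: 0.27 A, 62.6 W
# Philips Superlux 100 W: 0.45 A, 104.4 W
```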
As can be seen above, the lamps are in fact drawing very close to their rated power, and it was the clamp meter that was way, way off the mark. But why? I read the manual again, and it turns out that at 50-60 Hz the meter's accuracy is +/-(2% + 5 counts), and at 60-500 Hz it's +/-(2.9% + 5 counts). These figures apply to currents below 200 A.
Since the reading can be off by 5 digits (in this range that translates to +/-0.5 A), this clamp meter is absolutely useless for measuring currents less than 1 A. Even for a true current of 1 A this meter can read anywhere from roughly 0.5 to 1.5 A, so the reading can be off by about +/-50%. For 10 A the reading will fall between roughly 9.2 and 10.8 A, which is about +/-8%. Tolerable. The meter reaches its rated accuracy of +/-2 to 2.9% only when measuring currents approaching 200 A; above 100 A the +/-0.5 A (which is what the +/-5 counts means in this range) becomes negligible.
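To make that concrete, here's a rough Python sketch of the worst-case reading window implied by a +/-(x% + 5 counts) spec, assuming the 200 A range with 0.1 A resolution so that 5 counts = 0.5 A (the function and the choice of the 2.9% figure are mine, not lifted from the manual):

```python
# Worst-case reading window for a clamp meter spec of +/-(pct% of reading + counts),
# assuming the 200 A range with 0.1 A resolution (so 5 counts = 0.5 A).
def reading_range(true_amps, pct=2.9, counts=5, resolution=0.1):
    margin = true_amps * pct / 100 + counts * resolution
    return true_amps - margin, true_amps + margin

for amps in (1, 10, 100):
    lo, hi = reading_range(amps)
    print(f"{amps:>5.1f} A true -> reads {lo:.2f} to {hi:.2f} A "
          f"({100*(lo-amps)/amps:+.0f}% to {100*(hi-amps)/amps:+.0f}%)")
#   1.0 A true -> reads 0.47 to 1.53 A (-53% to +53%)
#  10.0 A true -> reads 9.21 to 10.79 A (-8% to +8%)
# 100.0 A true -> reads 96.60 to 103.40 A (-3% to +3%)
```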
So now I must remind myself that with this particular meter I have to take with a grain of salt any reading below 10 A. Given the 220 VAC mains voltage here, that means the current draw of any load under roughly 2 kW (10 A × 220 V ≈ 2.2 kW) simply isn't worth checking with this meter.