As instrumentation becomes more accurate at measuring voltages and currents, the claims salesmen use to promote these instruments have become grossly inaccurate. We investigate whether there is any truth behind the claim that 'True RMS' is the only valid measurement.

We need to begin by accepting that things have changed from the good ol' days of a power source that closely resembled a sinewave and a load that closely resembled a resistor. Older instrumentation (the moving coil meter is such a beast) would rectify the incoming signal and then scale the result (build in a factor) to represent the applied AC voltage or current; for a pure sinewave that factor is π/(2√2), roughly 1.11.

Nowadays it is not uncommon to find a source that is more like a square wave, and loads are a complex mixture of resistive, inductive, capacitive, and very non-linear elements. With such complex waveforms it is no longer possible to simply "rectify and apply a suitable factor", especially when the instrument does not know the factor involved! The sketch below shows just how far wrong that approach can go.

By processing the whole waveform from zero-crossing to zero-crossing (I'm not going to go into the processes now), a true-RMS instrument outputs a value that represents the equivalent DC (heating) value of the applied input. RMS, or Root of the Mean of the Squares (you seldom see RMS in its full, properly expanded form!), therefore undeniably has a place in the measurement of voltages and currents at both source and load. RMS is therefore also a valid means of determining the energy involved.

It is not uncommon to hear people refer to RMS power (watts) as the "equivalent heating effect" of an "equivalent DC source feeding an equivalent resistive load". This is completely true when dealing with simple loads (resistive, inductive, and capacitive) and a relatively clean source, i.e. one with very little harmonic content. Start involving hi-tech loads and the argument falls apart at high speed! You disregard the peak of the waveform at your peril. We deal with that now.
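First, though, here is the sketch promised above: a minimal Python comparison of a true-RMS computation against the old "rectify and apply a factor" approach. The idealised single-cycle waveforms and the sample count are illustrative assumptions only; nothing here mimics any particular instrument's internals.

```python
import math

def true_rms(samples):
    # Root of the Mean of the Squares: square each sample,
    # average over the whole cycle, then take the square root.
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def average_responding(samples):
    # What an older rectify-and-scale meter effectively does:
    # full-wave rectify (absolute value), average, then apply the
    # sinewave form factor of pi / (2 * sqrt(2)), about 1.1107.
    avg_rectified = sum(abs(s) for s in samples) / len(samples)
    return avg_rectified * math.pi / (2 * math.sqrt(2))

N = 10_000  # samples over one complete cycle (illustrative choice)
sine = [math.sin(2 * math.pi * n / N) for n in range(N)]
square = [1.0 if n < N // 2 else -1.0 for n in range(N)]

for name, wave in (("sine", sine), ("square", square)):
    print(f"{name:6s} true RMS = {true_rms(wave):.4f}   "
          f"average-responding meter = {average_responding(wave):.4f}")
```

On the sinewave both methods agree (0.7071 of the peak), but on the square wave the average-responding reading comes out around 11% high, because the built-in factor assumed a sinewave it never got.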
© 29.02.04