I’ve been reviewing a lot of subwoofers lately, and for almost all of them the manufacturer quotes two power specifications for the built-in amplifier: the continuous output power (sometimes erroneously labelled ‘RMS’), and a ‘dynamic peak’ figure or similar wording. The latter is typically double the former, or more. Some claim that this is a truer representation of what a subwoofer amplifier will deliver in the usual course of business. That would be a reasonable argument if we had some sense of how the figure was actually measured, and whether it is measured the same way across the industry.
Unfortunately, none of the manuals gave this information. Until tonight, that is: I’m looking at a JBL sub/sat system whose manual actually describes the measurement technique. I quote:
The Peak Dynamic Power is measured by recording the highest center-to-peak voltage measured across the output of a resistive load equal to the minimum impedance of the transducer, using a 50Hz sine wave burst, 3 cycles on, 17 cycles off.
A part of this is reasonable. The 3-on, 17-off bursts give the power supply capacitors a chance to charge up again and may well come close to something like real-world conditions.
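To put some numbers on that burst pattern, here is a quick sketch (my own arithmetic, not from the JBL manual) of the timing implied by 3 cycles on and 17 cycles off at 50Hz:

```python
# Timing of the burst signal described in the JBL manual:
# a 50 Hz sine wave, 3 cycles on, 17 cycles off.
FREQ_HZ = 50
CYCLES_ON = 3
CYCLES_OFF = 17

period_s = 1 / FREQ_HZ                              # 0.02 s per cycle
on_time_s = CYCLES_ON * period_s                    # burst duration
off_time_s = CYCLES_OFF * period_s                  # recovery time
duty_cycle = CYCLES_ON / (CYCLES_ON + CYCLES_OFF)   # fraction of time driven

print(f"Burst: {on_time_s * 1000:.0f} ms on, {off_time_s * 1000:.0f} ms off "
      f"({duty_cycle:.0%} duty cycle)")
# → Burst: 60 ms on, 340 ms off (15% duty cycle)
```

So the amplifier only has to deliver full output for 60 milliseconds out of every 400, which is why the power supply capacitors can keep up.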
But what’s this about ‘center-to-peak’? When you’re doing power output measurements, you do indeed measure the voltage, and the centre-to-peak value is the easiest to read off a CRO. You couldn’t really use an RMS voltmeter here because the off periods would be averaged in with the on periods, dragging the reading down.
But if you use a centre-to-peak measurement, you must then divide by the square root of two, or multiply by sin(45°), which is the same thing, because for a clean sine wave this gives you the RMS value of the voltage. Omitting this step means overstating the voltage by 41%, and since power goes as the square of the voltage, it means doubling the calculated power.
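The arithmetic is easy to check. Here is a sketch using hypothetical numbers of my own choosing: suppose the scope shows 40 volts centre-to-peak across a 4 ohm load.

```python
import math

# Hypothetical reading: 40 V centre-to-peak across a 4 ohm resistive load.
v_peak = 40.0
load_ohms = 4.0

v_rms = v_peak / math.sqrt(2)              # correct conversion for a clean sine wave
power_correct = v_rms ** 2 / load_ohms     # power from the RMS voltage
power_inflated = v_peak ** 2 / load_ohms   # power if the conversion is skipped

print(f"Using RMS voltage:  {power_correct:.0f} W")
print(f"Using peak voltage: {power_inflated:.0f} W")
# → Using RMS voltage:  200 W
# → Using peak voltage: 400 W
```

Skipping the square-root-of-two step turns a 200 watt measurement into a 400 watt claim.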
Then there is the matter of choosing the lowest impedance in the loudspeaker’s operating band (I assume they’re talking about the operating band!). To calculate power from voltage, you square the voltage and divide it by the load resistance (let’s forget about the impedance, which complicates matters). Say the average resistance of the driver across its operating frequencies is 8 ohms. It would not be unusual for the driver to drop to 4 ohms at some frequency or other. Using that 4 ohm value increases the power figure significantly. It doesn’t double it, because the maximum undistorted voltage an amplifier can deliver into a lower impedance is generally lower than into a higher one, but that voltage drop is not proportional to the reduction in impedance, so there is still a sizeable further increase.
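To illustrate with invented but plausible numbers: assume the amplifier can sustain 40 volts RMS into 8 ohms, but only 34 volts RMS into 4 ohms because its supply sags under the heavier load.

```python
# Hypothetical figures: the deliverable voltage sags into the lower
# impedance, but not in proportion to the halving of the load.
v_into_8 = 40.0   # V RMS the amplifier sustains into 8 ohms (assumed)
v_into_4 = 34.0   # V RMS it sustains into 4 ohms (assumed)

power_at_8 = v_into_8 ** 2 / 8.0   # quoted against the nominal impedance
power_at_4 = v_into_4 ** 2 / 4.0   # quoted against the minimum impedance

print(f"Quoted at 8 ohms: {power_at_8:.0f} W")
print(f"Quoted at 4 ohms: {power_at_4:.0f} W")
# → Quoted at 8 ohms: 200 W
# → Quoted at 4 ohms: 289 W
```

On these assumptions the minimum-impedance convention lifts the quoted figure by around 45%: well short of double, but a healthy boost all the same.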
I hasten to add that I am not chastising JBL here. In fact, I am congratulating it for its openness about this, and I wish that other companies would follow its example. But I do doubt that this measurement actually yields much more useful information than continuous power output.