I could not think of a more descriptive title, but I actually have a serious question about the power ratings of non-instrument amplifiers such as PA or home stereo systems.
In the very old days I didn't know much about the dynamic range of music, so I assumed that, just like the guitar amps I am a lot more familiar with, any amp could continuously supply its rated power.
Later I learned that most music (other than super-compressed top-10 pop) has transient peaks exceeding the average level by about 20 dB, i.e. a factor of 100 in power, and I assumed that this would explain the shady PMPO ratings that most consumer-grade audio equipment comes with. Wouldn't it make sense for a boom box to supply an average power of maybe 5 W and pump out transients at 500 W?
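Just to put numbers on that claim (the 5 W / 500 W figures below are only illustrative, my own assumptions):

import math

avg_power = 5.0     # assumed average listening power, W
peak_power = 500.0  # assumed transient peak power, W

crest_db = 10 * math.log10(peak_power / avg_power)
print(crest_db)     # prints 20.0 -> a 20 dB crest factor is a factor of 100 in power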
But then I came across the divine musings (sorry, Steve) of Rod Elliott, where I learned that PMPO is completely bogus and that the rated RMS power is not some thermally limited average power, but the maximum power that can be obtained in a pinch by connecting the supply rails to an 8 ohm resistor.
OK, so much for the short intro, now my actual question:
Wouldn't it make a lot of sense to build amplifiers with plenty of "dynamic headroom", i.e. amplifiers that can supply far more REAL power than their rated average power? I think about it this way: MOSFETs are usually thermally limited in peak current, and I don't see why it should be different for bipolar devices, so it should not be a problem to ask a 5 W amplifier to supply 500 W for a few milliseconds. The power supply would not be a problem either, since even a flimsy 10 VA wall-wart can deliver 100 A or more for a short burst, given a reasonably sized filter cap.
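As a sanity check on the wall-wart-plus-cap idea, here is a quick Python sketch (cap size, sag limit and burst length are my own assumptions):

# Energy stored in a reservoir cap vs. energy in one short transient
cap_farads = 0.01   # assumed 10,000 uF filter cap
rail_volts = 65.0   # the high rail discussed in the next paragraph
sag_volts = 55.0    # assume the rail may sag to 55 V during the burst

energy_available = 0.5 * cap_farads * (rail_volts**2 - sag_volts**2)  # ~6 J
transient_energy = 500.0 * 0.005                                      # 500 W for 5 ms = 2.5 J
print(energy_available, transient_energy)

So on these assumptions the cap alone covers a few such bursts before the wall-wart has to catch up.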
The only "problem" that I see with this scheme is that the rail voltages would have to be increased by an obscene amount. To pump 500W into an 8Ohm load, a 65V supply rail is needed, and that's a lot more than the 12V rails a common 5W boom-box will have. This voltage will have to be dropped in the active devices under normal listening conditions, so the total dissipation at 5W listening level (800mA into 8 Ohm) will go up from
12 V * 0.8 A ≈ 10 W (of which about 5 W are waste heat)
to something like
65 V * 0.8 A = 52 W (of which 47 W are waste heat)
That is a lot of power to sink, and I reluctantly agree that the average boom box will not have the capacity for it. Still, this means that an amp with a 100 VA supply and the heatsinking capacity of a 100 W amp could produce 500 W transients, something like a 7 dB headroom gain!
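For anyone who wants to check the arithmetic above, here it is spelled out (treating 500 W as instantaneous peak power and using the same crude rail-voltage-times-load-current estimate):

import math

load_ohms = 8.0

# Rail needed for a 500 W instantaneous peak into 8 ohms
v_peak = (500.0 * load_ohms) ** 0.5      # ~63 V, hence the ~65 V rail
print(v_peak)

# Crude dissipation estimate at a 5 W average listening level
i_rms = (5.0 / load_ohms) ** 0.5         # ~0.8 A
print(12 * i_rms, 65 * i_rms)            # ~9.5 W vs. ~51 W total draw

# Headroom gained by a 500 W-peak amp with ~100 W of continuous capability
print(10 * math.log10(500.0 / 100.0))    # ~7 dB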
Seriously, why would any "500 W amplifier" ever need a power supply of more than 100 VA, as long as it only reproduces program material with at least 15 dB of dynamic range and is never driven into clipping?
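Spelling that out (assuming a 15 dB crest factor and, for the moment, ignoring the class-AB standing dissipation discussed above):

peak_power = 500.0   # assumed transient peak, W
crest_db = 15.0      # assumed programme crest factor, dB

avg_power = peak_power / 10 ** (crest_db / 10)
print(avg_power)     # ~16 W of average output power behind 500 W peaks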
Oh, and the real killer, which I am not even getting into here, would be some kind of bi-amping / rail-switching / class-G arrangement in which the "500 W amplifier" is only ever fired up when a transient approaches. That way a 500 W transient from a 10 VA wall-wart could probably become reality.
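A minimal sketch of the rail-switching decision I have in mind, purely conceptual, with made-up thresholds and names:

# Conceptual rail selection for a class-G style output stage: stay on the low
# rail for everyday listening, engage the high rail only for transients.
LOW_RAIL = 12.0    # assumed everyday rail, V
HIGH_RAIL = 65.0   # assumed transient rail, V
MARGIN = 2.0       # assumed headroom before switching, V

def select_rail(required_output_volts: float) -> float:
    """Pick the rail the output stage should run from at this instant."""
    if abs(required_output_volts) > LOW_RAIL - MARGIN:
        return HIGH_RAIL   # transient approaching the low rail: switch up
    return LOW_RAIL        # normal listening: low rail, low dissipation

# ~6.4 V (0.8 A into 8 ohms) stays on the 12 V rail, a big transient does not
print(select_rail(6.4), select_rail(60.0))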