Originally posted by Mike Sulzer
Oh, and not to mention Peavey's idea was somewhat flawed from the start: the bandwidth reduction they discuss should apparently be a dynamic effect (since it is supposed to interact with signal voltage and frequency). But the old Peavey "Saturation" scheme simply introduced a -fixed- amount of bandwidth reduction, ganged to the gain control dial. Effectively the bandwidth is higher at lower gain settings and reduced at higher ones, but there is no dynamic interaction whatsoever. Peavey's designers could easily have made the effect dynamically variable, but for some reason they never did... Perhaps they found that the dynamic effects mattered little compared to the major effect of plain bandwidth reduction before distortion.
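Just to make the point concrete, here is a little Python sketch of how I read the scheme (my interpretation, not Peavey's actual circuit; all the numbers are made up): a low-pass corner that is set by the gain knob alone, sitting in front of the clipping stage. A "dynamic" version would let the signal envelope move that corner, which is exactly what the circuit never does.
[CODE]
# Sketch (my interpretation, not Peavey's actual circuit): a one-pole
# low-pass whose cutoff is ganged to the gain knob, placed before the
# clipping stage. The cutoff depends only on the knob position, never
# on the signal itself -- i.e. the bandwidth reduction is static.
import numpy as np

fs = 48000.0  # sample rate, Hz

def gain_ganged_cutoff(gain_knob):
    # gain_knob in 0..1; arbitrary mapping chosen for illustration:
    # full bandwidth (~10 kHz) at low gain, ~2 kHz at full gain.
    return 10000.0 - 8000.0 * gain_knob

def peavey_style_stage(x, gain_knob):
    fc = gain_ganged_cutoff(gain_knob)            # fixed for the whole signal
    a = np.exp(-2.0 * np.pi * fc / fs)            # one-pole low-pass coefficient
    y = np.zeros_like(x)
    state = 0.0
    for n in range(len(x)):
        state = (1.0 - a) * x[n] + a * state      # bandwidth reduction first...
        y[n] = np.tanh(10.0 * gain_knob * state)  # ...then the clipping stage
    return y

t = np.arange(0, 0.05, 1.0 / fs)
riff = np.sin(2 * np.pi * 196.0 * t) + 0.5 * np.sin(2 * np.pi * 1500.0 * t)
out = peavey_style_stage(riff, gain_knob=0.9)
[/CODE]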
Misunderstanding transformers... Well, I could partially agree on that. But Peavey's designers should know better, so I'm pretty sure they just chose a marketing approach and threw in one of those vaguely understood concepts that happens to sound "sexy"... WOoooo... "Transformer Saturation"... Yet the majority of people reading that article probably have no clue what it actually means in practice, nor have they ever encountered it.
Anyone who has actually viewed output transformer saturation on an oscilloscope knows damn well it's one of those things that should be avoided almost at all costs. As far as distortion harmonics are concerned, OT saturation is almost like very, very hard clipping, except that instead of chopping off the waveform tops it chops off random-looking parts of the waveform, and the signal ends up riddled with plenty of very high-order distortion, odd and even. It will sound about as bad as a poorly designed current limiter kicking in in a SS amp. Terrible! Avoid!
Luckily, avoiding it is rather easy. It's pretty hard to saturate generic OTs with typical signal voltages, especially at typical signal frequencies. As said, you truly want to avoid OT saturation, so this is no biggie, except maybe in bass amps that need to handle signals below about 80 Hz...
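If you want to see why frequency matters so much, plug some numbers into the standard transformer EMF equation, B_peak = V_rms / (4.44 * f * N * A). The figures below are made up but in the right ballpark for a ~50 W push-pull OT; they are not measurements of any real transformer.
[CODE]
# Back-of-envelope sketch of why frequency matters so much. The numbers
# (turns, core area, impedance) are assumed, roughly plausible for a
# ~50 W push-pull output transformer -- NOT measurements of a real OT.
import math

def peak_flux_density(v_rms, freq, turns, core_area_m2):
    # Transformer EMF equation: V_rms = 4.44 * f * N * A * B_peak
    return v_rms / (4.44 * freq * turns * core_area_m2)

P_watts = 50.0
Z_primary = 4000.0                      # plate-to-plate load, ohms (assumed)
v_rms = math.sqrt(P_watts * Z_primary)  # ~447 V across the whole primary
N = 2000                                # primary turns (assumed)
A = 10e-4                               # core cross-section, 10 cm^2 (assumed)

for f in (1000.0, 200.0, 80.0, 40.0, 25.0):
    B = peak_flux_density(v_rms, f, N, A)
    print(f"{f:6.0f} Hz -> B_peak ~ {B:4.2f} T")

# With ~1.5 T saturation flux for ordinary steel laminations, this
# hypothetical OT is nowhere near saturation at 1 kHz, marginal around
# 40 Hz, and well into saturation by 25 Hz -- at full rated drive.
[/CODE]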
Just for the sake of reference, a typical modern high-gain guitar amp may introduce a high-pass filter with a corner as high as 1 kHz (!!!!) just to avoid IMD. This pretty much explains the key point of Peavey's old "Saturation" scheme (bandwidth reduction basically does marvels for IMD), but it also makes the existence of output transformer saturation quite dubious, given the overall signal frequencies at the point of overdrive.
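A quick and dirty way to convince yourself of the IMD point (purely illustrative, not modelling any particular amp): push a two-tone signal into a hard clipper with and without a first-order high-pass at 1 kHz in front of it, and compare the sum/difference products that land around 1 kHz.
[CODE]
# Rough sketch of why a steep low-cut before the clipper tames IMD.
# Two tones (100 Hz + 1 kHz) into a hard clipper, with and without a
# first-order high-pass at ~1 kHz in front. Values are illustrative only.
import numpy as np

fs = 48000.0
t = np.arange(0, 1.0, 1.0 / fs)
x = np.sin(2 * np.pi * 100.0 * t) + np.sin(2 * np.pi * 1000.0 * t)

def one_pole_highpass(x, fc):
    a = np.exp(-2.0 * np.pi * fc / fs)
    y = np.zeros_like(x)
    prev_x = 0.0
    for n in range(len(x)):
        y[n] = a * (y[n - 1] + x[n] - prev_x) if n else x[n]
        prev_x = x[n]
    return y

def clip(x, drive=5.0):
    return np.clip(drive * x, -1.0, 1.0)

def level_at(signal, freq):
    spec = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    return spec[int(round(freq * len(signal) / fs))]

for label, sig in (("no filter ", clip(x)),
                   ("1 kHz HPF ", clip(one_pole_highpass(x, 1000.0)))):
    # 900 Hz and 1100 Hz are the strongest second-order IM products
    imd = (level_at(sig, 900.0) + level_at(sig, 1100.0)) / level_at(sig, 1000.0)
    print(label, "relative IMD around 1 kHz:", round(float(imd), 3))
[/CODE]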
Case in point:
If the amp can put out an 80 Hz sine wave with nearly the rated power, then it appears that the relative power at about 80 Hz in these signals is too small to cause significant saturation.
Above is an oscilloscope capture illustrating OT saturation with a 25 Hz signal at two different input levels. The bottom plot is the one with the higher-amplitude input signal, and the saturation is correspondingly more severe. Note that at 100 Hz the reproduction would still be "clean". Yes, those are supposed to be sinusoidal waveforms, but they happen to carry this supposedly euphonic distortion from transformer saturation. You can just picture the harmonic distortion of this mess. Just think of the harmonic overtones created, in comparison to generic soft clipping or even generic hard clipping. This will sound truly hideous, if not totally unusable in almost any musical context. And intermodulation will only make things much, much worse.
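For those who haven't seen it on a scope, a crude numerical sketch tells the same story. This is not a real transformer model, just "integrate the voltage to get the flux and collapse the output once the flux hits the knee", with a small DC imbalance thrown in (as an unbalanced push-pull stage would cause) so the two halves saturate at different points. Compare the harmonic series against plain tanh soft clipping:
[CODE]
# Crude numerical sketch of what that scope trace implies, compared with
# ordinary soft clipping. NOT a real transformer model -- all values are
# picked purely for illustration.
import numpy as np

fs = 48000.0
f0 = 25.0
t = np.arange(0, 1.0, 1.0 / fs)
x = np.sin(2 * np.pi * f0 * t)

def crude_ot_saturation(x, knee, dc_imbalance=0.02):
    y = np.zeros_like(x)
    flux = 0.0
    for n in range(len(x)):
        flux += (x[n] + dc_imbalance) / fs   # core flux ~ integral of voltage
        if abs(flux) < knee:
            y[n] = x[n]                      # core still transfers the signal
        else:
            flux = np.sign(flux) * knee      # saturated: flux clamps and the
            y[n] = 0.0                       # reflected output collapses
    return y

def soft_clip(x):
    return np.tanh(2.0 * x)

def harmonics(y, n_harm=15):
    spec = np.abs(np.fft.rfft(y * np.hanning(len(y))))
    bins = [int(round(k * f0 * len(y) / fs)) for k in range(1, n_harm + 1)]
    levels = spec[bins]
    return 20 * np.log10(levels / levels[0] + 1e-12)  # dB re. fundamental

print("OT saturation :", np.round(harmonics(crude_ot_saturation(x, knee=0.004)), 1))
print("soft clipping :", np.round(harmonics(soft_clip(x)), 1))
[/CODE]
The saturated case keeps throwing out strong harmonics way up the series, odd and even, while the tanh clipper's odd harmonics die off quickly.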
For obvious reasons you don't really see any trend towards using underrated, cr*ppy little output transformers... for that marvellous saturation tone. In fact, most people seem to praise transformers that are rather "transparent": low distortion, decent bandwidth and no saturation at typical signal frequencies and voltages. Take a vintage Gibson amp... People aren't hyping how great their pathetically tiny, cheap output transformers sound; more likely they'll be replacing the poor transformers with better ones. To improve tone.
There is also another explanation for bandwidth reduction during overdrive that has nothing to do with transformer saturation: quite simply, negative feedback increases bandwidth at the cost of gain. If the negative feedback loop becomes impaired because of clipping distortion, the bandwidth extension is impaired as well. Basically, the global NFB of the power amp stage can thus extend the bandwidth of the transformer coupling (and indeed of all kinds of circuits), but the effect is lost when the amplifier begins to clip and the global feedback loop is rendered ineffective. Unlike with true OT saturation, with this scheme the entire bandwidth (both lows and highs) actually is reduced, and the distortion is also more ear-pleasing, because it is not that of a saturating OT but that of the circuit clipping in the usual manner.
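The trade-off is the textbook one: with loop gain A*beta the closed-loop gain drops to A/(1+A*beta) and the corner frequency moves out by the same (1+A*beta) factor; once the stage clips, the incremental gain A collapses and takes the loop gain and the extra bandwidth with it. A one-pole sketch with made-up numbers:
[CODE]
# Textbook one-pole sketch of the gain-for-bandwidth trade with global NFB.
# The numbers are assumed, not taken from any particular amp.
A0 = 200.0      # open-loop gain of the power amp (assumed)
f_ol = 3000.0   # open-loop -3 dB corner in Hz, e.g. set by the OT (assumed)
beta = 0.05     # feedback fraction returned to the input (assumed)

loop_gain = A0 * beta                 # A*beta = 10
A_cl = A0 / (1.0 + loop_gain)         # closed-loop gain: ~18.2
f_cl = f_ol * (1.0 + loop_gain)       # closed-loop corner: ~33 kHz

print("closed-loop gain  :", round(A_cl, 1))
print("closed-loop corner:", round(f_cl / 1000.0, 1), "kHz")

# Once the stage clips, the incremental gain A drops toward zero, the loop
# gain goes with it, and the corner falls back toward the ~3 kHz open-loop
# value -- bandwidth reduction "for free" during overdrive, at both ends of
# the band if the open-loop response is band-pass.
[/CODE]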