I could find little info on slew induced distortion (SID) in tube amps.
It's typically discussed in the context of opamps.
SID tends to change the shape of a sine signal into a triangle above some frequency and voltage amplitude.
Lowering frequency or amplitude gradually restores the sine shape.
(Any change of a sine signal means non-linear distortion and thus added harmonics - as opposed to the effect of linear filters, which don't change the sine waveshape.)
That's exactly what I'm seeing with this amp: https://docs.google.com/viewer?a=v&p...ZmMjg4ZWExYjE3
At higher output and above around 3.5kHz a sine signal starts to morph into a triangle.
The reason seems to be the 2nF caps wired between the power tube grids and ground.
Assuming a PI source impedance of around 20k, each cap forms a low-pass filter with a corner frequency of about 4kHz.
This means the open-loop gain drops at 6dB/octave above 4kHz.
But NFB counteracts, so measured closed loop -3dB frequency is about 9kHz.
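For anyone who wants to check the corner frequency, here's a quick sketch in Python (the 20k source impedance is my estimate, as above):

```python
import math

R = 20e3   # assumed PI source impedance, ohms (estimate)
C = 2e-9   # grid-to-ground cap, farads

# first-order low-pass corner: fc = 1 / (2*pi*R*C)
fc = 1 / (2 * math.pi * R * C)
print(f"corner frequency: {fc:.0f} Hz")  # ~3979 Hz, i.e. about 4 kHz
```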
All this can't explain the observed sine distortion.
But there's another effect caused by these caps:
Changing the voltage across a cap requires a current. The faster the rate of change (dV/dt), the higher the required current.
The dV/dt of a sine is highest at the zero crossings.
The current has to be delivered by the PI, but by design there's a peak current limit.
A 3.5kHz sine voltage having a peak value of 35V across a 2nF cap has a max. dV/dt of about 0.8V/µs and requires a peak current of 1.5mA.
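The arithmetic behind those numbers, for anyone who wants to verify it:

```python
import math

f = 3.5e3   # sine frequency, Hz
Vp = 35.0   # peak voltage across the cap, volts
C = 2e-9    # cap value, farads

# peak slope of a sine V(t) = Vp*sin(2*pi*f*t) is 2*pi*f*Vp,
# occurring at the zero crossings
dvdt_max = 2 * math.pi * f * Vp   # V/s

# cap current: i = C * dV/dt, so the peak current is C * dvdt_max
i_peak = C * dvdt_max             # A

print(f"max dV/dt: {dvdt_max / 1e6:.2f} V/us")  # ~0.77 V/us
print(f"peak current: {i_peak * 1e3:.2f} mA")   # ~1.54 mA
```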
I estimate the max. peak current the PI can deliver to be around 2mA and a considerable part of that will be absorbed by the 47k grid leaks.
Current compression is likely to start at lower values.
The current limiting results in a max. possible dV/dt across the caps, and that is the slew rate limit. (As the slew rate scales with voltage level, the slew rate at the amp's output is lower than the slew rate at the power tube grids.)
Signals that exceed that slew rate get distorted.
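Using my rough 2mA estimate for the PI's peak current (and ignoring what the grid leaks absorb), the implied slew rate limit and the highest full-amplitude sine it allows work out like this:

```python
import math

I_max = 2e-3   # estimated peak current the PI can source, A (my assumption)
C = 2e-9       # grid-to-ground cap, F
Vp = 35.0      # peak grid voltage, V

# slew rate limit: the fastest the PI can charge the cap
sr_max = I_max / C                    # V/s -> 1 V/us

# highest frequency at which a full-amplitude sine still fits
# under that slope limit: f_max = SR / (2*pi*Vp)
f_max = sr_max / (2 * math.pi * Vp)
print(f"slew rate limit: {sr_max / 1e6:.1f} V/us")  # 1.0 V/us
print(f"max clean sine frequency: {f_max:.0f} Hz")  # ~4547 Hz
```

Given that the grid leaks eat part of that 2mA, the real limit lands below this, which fits the observed onset around 3.5kHz.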
As cap voltage and current are 90° out of phase, the distortion starts at the zero crossings of the voltage, where the current is at its maximum.
NFB cannot increase amp slew rate.
Slew rate limiting is not a frequency response effect.
It is not caused by a limited frequency response (which doesn't cause distortion) but rather by the inability of some internal stage to charge a load capacitance fast enough.
Lifting the caps considerably improves sine wave reproduction.
I think Fender used the caps as a brute force means to avoid any risk of stability issues.
There are certainly more elegant/less invasive methods to achieve that goal.
I'll see what it takes.