Originally posted by R.G.
Classic amps are full of WTF moments like these that would have "real" EEs beating their heads against a wall. In fact, even using vacuum tubes is a WTF moment for many of them. They're certainly noisier as a first stage than your average 50 cent op-amp or JFET.
Precisely why I pull all the cathode resistors out and use fixed bias. Then the preamp gain can be set at any level, since there is no noise floor to contend with.
Its value is low, and the Johnson noise voltage is proportional to the square root of the resistance, so a small cathode resistor contributes very little to begin with.
Besides, it's bypassed by a capacitor that shorts out the noise voltage, so the relevant resistance for the noise calculation is the capacitor's ESR. Or maybe the tube's internal cathode resistance of 1/gm; I wouldn't be surprised if that generated thermal noise just the same as a "real" resistor.
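To put rough numbers on it: the RMS Johnson noise voltage is sqrt(4kTRB), so at room temperature over a 20 kHz audio bandwidth the candidates compare something like this (the resistor values below are typical-looking assumptions, not measurements from any particular amp):

```python
import math

def johnson_noise_v(r_ohms, temp_k=300.0, bandwidth_hz=20e3):
    """RMS thermal (Johnson) noise voltage: sqrt(4 * k * T * R * B)."""
    k_b = 1.380649e-23  # Boltzmann constant, J/K
    return math.sqrt(4 * k_b * temp_k * r_ohms * bandwidth_hz)

# Assumed, typical-looking values -- not taken from any particular circuit.
candidates = {
    "1.5k cathode resistor, unbypassed": 1.5e3,
    "bypass cap ESR, ~0.5 ohm": 0.5,
    "1/gm of a 12AX7, ~625 ohm at gm = 1.6 mA/V": 625.0,
    "1M grid leak": 1e6,
}

for name, r in candidates.items():
    print(f"{name}: {johnson_noise_v(r) * 1e6:7.3f} uV rms")
```

The 1M grid leak dwarfs everything else; even the full unbypassed 1.5k works out to well under a microvolt, so quibbling over ESR versus 1/gm hardly changes anything.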
In high-gain amps I've built, it seemed that most of the thermal noise voltage came from the resistance of the guitar volume control. The hiss level would decrease as the volume was turned towards zero. Or maybe it was some internal noise voltage in the tube, feeding back out of its grid, that got shorted.
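For what it's worth, the usual volume-control wiring presents the parallel combination of the pot's two halves (plus the pickup feeding the top half) as the source resistance, so it peaks around mid-rotation and falls toward zero as you turn down. A sketch with assumed values, a 500k pot treated as linear and a pickup of roughly 6k DC resistance:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def noise_v(r_ohms, temp_k=300.0, bw_hz=20e3):
    return math.sqrt(4 * K_B * temp_k * r_ohms * bw_hz)

POT = 500e3     # assumed 500k volume pot, treated as linear for simplicity
PICKUP = 6e3    # rough stand-in for the pickup's DC resistance

for frac in (1.0, 0.75, 0.5, 0.25, 0.0):
    upper = POT * (1 - frac) + PICKUP  # pickup in series with the top half
    lower = POT * frac                 # bottom half of the pot to ground
    r_source = upper * lower / (upper + lower)  # Thevenin resistance at the wiper
    print(f"wiper at {frac:.0%}: source R ~= {r_source / 1e3:6.1f}k, "
          f"hiss ~= {noise_v(r_source) * 1e6:5.2f} uV rms")
```

This ignores the pickup's inductance and the tone circuit, but the trend matches what you hear: the thermal noise source resistance collapses as the pot approaches zero.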
This is one reason why amps have shorting input jacks: so you can't hear the noise voltage of the first stage grid leak resistor when nothing is plugged in.
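Put numbers on it and the grid sees the grid leak in parallel with whatever the jack supplies: nothing when open, the guitar's source impedance when plugged in, essentially a short when the shorting contact closes. The values here (1M grid leak, ~50k guitar impedance, ~0.1 ohm contact) are assumptions for illustration:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def noise_v(r, t=300.0, bw=20e3):
    return math.sqrt(4 * K_B * t * r * bw)

def par(a, b):
    """Parallel combination of two resistances."""
    return a * b / (a + b)

GRID_LEAK = 1e6   # assumed 1M grid leak
GUITAR_Z = 50e3   # rough stand-in for the guitar's source impedance
CONTACT_R = 0.1   # shorting contact, essentially a dead short

cases = {
    "open jack (grid leak alone)": GRID_LEAK,
    "guitar plugged in": par(GRID_LEAK, GUITAR_Z),
    "shorting contact closed": par(GRID_LEAK, CONTACT_R),
}

for name, r in cases.items():
    print(f"{name}: {noise_v(r) * 1e6:8.3f} uV rms at the grid")
```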
The absolute (Kelvin) temperature also appears in the equation for thermal noise voltage, so if you really want to get rid of noise, you could do a lot worse than dunking your first stage in liquid helium. People still do this in really high-performance preamps for radio telescopes and the like, but so far nobody has got Eric Johnson interested.
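Since the noise voltage goes as the square root of the absolute temperature, the cryogenic payoff is easy to estimate; liquid helium boils at about 4.2 K:

```python
import math

T_ROOM = 300.0  # K, roughly room temperature
T_LHE = 4.2     # K, boiling point of liquid helium at 1 atm

ratio = math.sqrt(T_ROOM / T_LHE)  # noise voltage scales as sqrt(T)
print(f"voltage noise drops by {ratio:.1f}x "
      f"({20 * math.log10(ratio):.1f} dB) for the same resistance")
```

Only the resistive part of the noise scales like this, and the cathode of a working tube is anything but cold, but it shows why the radio-astronomy people bother.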