I have constantly read, in discussions of cathode bias for the power amp, that the bias "should" (or perhaps I should say "can") be set hotter at idle, because under drive the bias shifts cooler. To avoid crossover distortion, the idle setting can supposedly be exempt from the so-called 70% rule and be set to more like 80% of maximum plate dissipation, or hotter.
What I don't understand is why this is NEVER the case in any cathode-biased amp I build! In every one, and I've built quite a few, I test as follows: I run the amp into a matched resistive dummy load to save my hearing, feed a 1 kHz signal into the input, and continuously measure the plate voltage and the cathode voltage as I turn the volume up from zero to max. When I run the numbers (roughly the calculation sketched below), EVERY TIME the calculated dissipation *increases*, despite the B+ sag that obviously occurs. At no point does the bias ever shift toward a lower percentage of dissipation. So what I usually do is set the idle bias based on the maximum rated dissipation of whatever tubes I'm using, so that when cranked to full bore they climb to max dissipation. Actually, max usually lands a touch under full bore on the volume.
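For clarity, here is roughly the arithmetic I mean, as a minimal sketch. The tube type, cathode resistor value, and voltage readings below are placeholders for illustration, not measurements from a specific build, and the cathode current includes screen current, so the true plate dissipation is actually a bit lower than this estimates:

```python
# Sketch of the dissipation arithmetic described above.
# Example assumptions: a push-pull pair of EL84s sharing one cathode resistor.
# The cathode current lumps in screen current, so this slightly overstates
# true plate dissipation.

R_K = 130.0     # shared cathode resistor, ohms (example value)
TUBES = 2       # tubes sharing that resistor
P_MAX = 12.0    # rated max plate dissipation per tube, watts (EL84)

def plate_dissipation(v_plate, v_cathode, r_k=R_K, tubes=TUBES):
    """Per-tube plate dissipation estimated from the two measured voltages."""
    i_per_tube = (v_cathode / r_k) / tubes       # cathode current, split per tube
    v_plate_to_cathode = v_plate - v_cathode     # voltage actually across the tube
    return v_plate_to_cathode * i_per_tube

# Hypothetical readings, idle vs. cranked (with B+ sag)
for label, vp, vk in [("idle", 310.0, 8.5), ("cranked", 290.0, 9.5)]:
    p = plate_dissipation(vp, vk)
    print(f"{label}: {p:.1f} W per tube = {100 * p / P_MAX:.0f}% of max")
```

The point is just that the cranked reading always comes out higher than the idle reading, never lower.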
Am I missing something here?