Cathode bias shift


  • Cathode bias shift

    I have constantly read, when referring to cathode bias of the power amp, that the biasing "should" (or perhaps I should say "can") be set hotter at idle, because when the amp is being driven the bias will shift cooler; so, to avoid crossover distortion, the idle setting can be exempt from the so-called 70% rule and be set more like 80% or hotter.

    What I don't understand is why this is NEVER the case in any cathode bias amp I build! In every one, and I've built quite a few, I test as follows: I run the amp into a matched resistive load to save my hearing, feed a 1 kHz signal into the input, and then continuously measure the plate voltage and the cathode voltage as I increase the volume from 0 to max. When running the numbers, EVERY TIME, the calculated dissipation *increases* despite the B+ sag that obviously occurs. At no point does the bias ever shift toward a lower percentage of dissipation. So what I usually do is set the idle bias based on the max dissipation of whatever tubes I'm using, so that when cranked to full bore they climb to max dissipation. Actually, max is usually a touch under full bore on the volume.

    Am I missing something here?

  • #2
    Are you deducting the power dissipated by the load from the input power to get the power dissipated by the tubes?
    WARNING! Musical Instrument amplifiers contain lethal voltages and can retain them even when unplugged. Refer service to qualified personnel.
    REMEMBER: Everybody knows that smokin' ain't allowed in school !



    • #3
      I'm simple minded - you lost me on that one!

      I measure plate voltage, cathode voltage, subtract and get working plate voltage. Then divide cathode voltage by cath. resistor value, multiply result by the working plate voltage and divide by two. I usually don't worry about the screens so the dissipation is prob. a touch high.
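The calculation described there can be sketched as a short script. A minimal sketch, assuming hypothetical readings (400V plate, 30V cathode, 250 ohm shared cathode resistor, none of which come from the thread); note it works from DC averages off a DVM, with screen current neglected:

```python
def plate_dissipation_dvm(v_plate, v_cathode, r_cathode):
    """Estimate per-tube plate dissipation from DVM (DC-average) readings,
    following the procedure above: two tubes sharing one cathode
    resistor, screens neglected."""
    v_working = v_plate - v_cathode       # working plate-to-cathode voltage
    i_cathode = v_cathode / r_cathode     # total current through the cathode resistor
    return v_working * i_cathode / 2      # split between the two tubes

# Hypothetical idle readings for a 6L6 pair (illustrative values only):
print(plate_dissipation_dvm(400.0, 30.0, 250.0))  # 22.2 W per tube
```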



      • #4
        Are you employing a cathode capacitor?
        With the cap, bias should remain relatively stable.
        Without it, it will fluctuate with the signal.
        Wiki Link: Cathode bias - Wikipedia, the free encyclopedia



        • #5
          Originally posted by EFK View Post
          I'm simple minded - you lost me on that one!

          I measure plate voltage, cathode voltage, subtract and get working plate voltage. Then divide cathode voltage by cath. resistor value, multiply result by the working plate voltage and divide by two. I usually don't worry about the screens so the dissipation is prob. a touch high.
          Let's wait for a second opinion, but I don't think you are getting a true measure of plate dissipation under signal conditions if you are measuring with a DVM. Some power is going to the load.



          • #6
            Yep, I always use cathode caps of varying values. Sometimes the popular AX84 trick (at least, that's where I think I read it) of a big honking 1000µF/100V. But usually around 50 to 100 µF with two 6L6s at 370 to 400 V plate voltage.

            Loudthud, I am using a DVM. I don't thus far get the implications of what you are indicating, unless you mean that the DVM itself is affecting the reading?
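As a rough sketch of why the bypass cap keeps the bias steady: the cathode network's RC time constant sets how fast the cathode voltage can move. The 250 ohm cathode resistor below is an assumed example value (not stated in the thread); the capacitor values are the ones mentioned above:

```python
import math

# Time constant and corner frequency of the cathode bias network.
r_k = 250.0  # ohms; assumed example value, not from the thread
for c_uf in (50.0, 100.0, 1000.0):
    c = c_uf * 1e-6                              # farads
    tau = r_k * c                                # seconds: how fast the bias can move
    f_corner = 1.0 / (2.0 * math.pi * r_k * c)   # RC corner frequency in Hz
    print(f"{c_uf:6.0f} uF: tau = {tau * 1000:6.1f} ms, corner ~ {f_corner:6.2f} Hz")
```

A cap whose corner sits well below the signal frequency holds the cathode voltage essentially constant over an audio cycle, which is consistent with the bias reading stable on a DVM.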



            • #7
              Let's put aside how many watts the amp is making for a second, considering that the amp is making those watts because the power tube grids are being excited by the preamp. If we look at just the bias voltage and plate voltage, it's clear that if the voltage on the cathode rises, the grid becomes more negative relative to the cathode. Add to that the sag in plate voltage and it's even more so. Though it's also true that as the cathode voltage rises there must be greater current through the cathode resistor. But keep in mind that you can make a cold-biased amp put out plenty of watts. In some cases more than a hot-biased amp. Just because the tubes are dissipating more watts doesn't mean the bias is hotter.

              At least that's how I've reasoned it in my low tech way.

              EDIT: Think of it like this: the bias voltage sets the operating point for the tube. If the grid is more negative WRT the cathode, the tube is more inclined to reach cutoff before saturation. If the grid is less negative WRT the cathode, it's more inclined to saturate before cutting off. Well, not exactly, but that's the idea. This is somewhat independent of how much current is going through the tube because of signal at the grid. But it will affect how much swing it takes at the grid to drive the tube to saturation or cutoff.
              Last edited by Chuck H; 08-04-2011, 10:39 PM.
              "Take two placebos, works twice as well." Enzo

              "Now get off my lawn with your silicooties and boom-chucka speakers and computers masquerading as amplifiers" Justin Thomas

              "If you're not interested in opinions and the experience of others, why even start a thread?
              You can't just expect consent." Helmholtz



              • #8
                Perhaps use a specific band or sweep rather than only the 1 kHz signal. Or use a practical method of having someone strum a guitar (with known input voltage) while you take measurements.



                • #9
                  Howdy Chuck - I actually understand that. So what is usually considered desirable: grid more negative (wrtc) or less negative (wrtc)? I might go out on a limb and guess - less negative? So the next question then is, using the same procedure I outlined above, I should be measuring both cathode and grid voltage at varying points along the volume? Can I get a decent measurement of grid w/ DVM? This would be AC, correct?



                  • #10
                    Hey tourister - I did think of that. I've tried measuring while whacking on the guitar, and honestly an A chord usually drives the amp even harder than the straight 1 kHz signal.



                    • #11
                      Perhaps take the max. measurement while bypassing the volume pot (and Eq/Tonestacks).

                      And take the min. measurement with a 10M resistor in place of the volume pot.



                      • #12
                        Originally posted by EFK View Post
                        Loudthud, I am using a DVM. I don't thus far get the implications of what you are indicating, unless you mean that the DVM itself is affecting the reading?
                        When a tube amp is making a sine wave near maximum power, the efficiency is something around 60%. That's (load power) / (power from B+). Now when the amp is making a clipped square wave, the efficiency goes up to something around 90%. The efficiency is somewhat less for cathode biased amps than it is for fixed bias amps. So a 60W amp consumes about 100W with a full power sine wave and almost 120W making a clipped square wave. When the amp is producing the clipped square wave, the plate dissipation of the tubes is quite low, most likely lower than it is at idle. Most of the power from B+ is going to the load. Maximum plate dissipation occurs somewhere around half power output for class AB amps, IIRC.

                        So at idle, (no signal) all the power from B+ goes to the plates (neglecting the screens for the moment) and cathode resistor if any. But when a signal is applied, some power is diverted to the load such that:

                        (B+ power) = (plate dissipation) + (load power) + (cathode resistor power)

                        It can be argued that idle dissipation will affect efficiency. This is true but the effect is small if the bias is within reason. The 70% rule is used in guitar amps because it sounds good. In class AB HiFi amps, the bias is usually less and is set for minimum distortion.
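That bookkeeping can be sketched numerically. The efficiency figures and the 60W/100W/120W numbers are the approximate round numbers from the post; the 5W cathode-resistor figure is an assumed illustrative value:

```python
def plate_dissipation(p_bplus, p_load, p_cathode_resistor):
    """Rearranged power balance from above:
    (plate dissipation) = (B+ power) - (load power) - (cathode resistor power)."""
    return p_bplus - p_load - p_cathode_resistor

# Full-power sine wave, ~60% efficient: 60 W out for ~100 W drawn from B+.
sine = plate_dissipation(100.0, 60.0, 5.0)     # 35.0 W left on the plates

# Clipped square wave, ~90% efficient: ~108 W out for ~120 W drawn.
square = plate_dissipation(120.0, 108.0, 5.0)  # 7.0 W: most power goes to the load

print(sine, square)
```

The point the numbers make: more total input power does not mean more plate dissipation, because the load's share grows faster than the tubes'.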



                        • #13
                          The simplest way I can think about it is this...

                          We have 2x6V6 in cathode bias at moderate voltage (say 350-380VDC from plate to ground, 20-24VDC on the cathodes & <14W dissipation per tube). We expect this amp to make 12-15W clean (due to limitations of design) & we set the idle current to facilitate that. At idle there is no AC present at the speaker, no sine wave.

                          At max W RMS, DC dissipation may have risen some & voltage sagged a shade, but as long as we are achieving that relatively intact sine wave at expected rated power, the amp is deemed to be biased within parameters. If we deliberately bias cold, say <10mA per tube, we may not achieve ANY significant clean W at all. Bias very hot and B+ will be pulled down, cathode & grid swing reduced, and ultimately max clean power may drop.

                          However, max clean W RMS power is not the max AC that can be achieved, just the max that can be achieved with a clean sine wave. As we exceed that point, the power tubes spend more & more time in cut off, so whilst the amp may have an idle bias set to facilitate a healthy clean W RMS, as we drive it harder, no matter how much dc the tubes are asked to dissipate, they conduct less & less of their share of the waveform & are therefore closer to cut off, or colder AT THAT OUTPUT, hence the onset of crossover distortion - we're now into the territory of diminishing returns, ("It's clean up to 3, but then it just distorts & compresses") . It's not really "cold" under overdrive, the tubes might be dissipating 20W+ each, just not "hot" enough to keep an intact sine wave under severe drive (outside the scope of expected clean output & reasonable expectation).

                          It's relative; bias isn't a single datum or criterion. Requirements change with context, working environment & reasonable expectation. And it's not all about the DC - that's required to facilitate reasonable AC output.



                          • #14
                            EFK, during signal conditions does the average cathode voltage decrease or increase? If it decreases then the average anode current has decreased, and likewise the power dissipation in the valve.



                            • #15
                              He said that it increases; I expect that it would.
