
Using cross-over distortion to set bias


  • Using cross-over distortion to set bias

    Can you accurately set the bias of a tube amp using cross-over distortion on an O'scope?

  • #2
    Wow.
    And the can of worms is open!
    The key word here is "accurately."
    Accurately what?
    What is the goal?
    If you simply want to set the tube bias to get rid of crossover distortion, yes (depending on output signal amplitude).



    • #3
      Thanks Jazz P Bass.
      You are very correct! I should have avoided the word "accurately".
      So you know, I am working on a Bugera 333XL and want to install new output tubes. Bugera does not provide information about biasing the output tubes other than to use THEIR tubes. I don't own an octal bias tester (I don't like them). With that said, I have never tried to adjust the bias using the cross-over distortion method. Perhaps this is the time for me to give it a try?



      • #4
        IMO, not really, because crossover with tubes is pretty gradual, unlike a solid-state amp where you get an obvious notch.
        "Enzo, I see that you replied parasitic oscillations. Is that a hypothesis? Or is that your amazing metal band I should check out?"



        • #5
          I like to use it along with a plate current measurement. If I can get away with biasing colder without crossover distortion, I will sometimes run a lower idle current than what the meter tells me is "on target". This is in the interest of tube reliability/lifespan.
          As far as biasing by scope only: not a good idea. It's hard to get consistent, reproducible measurements. I suppose if you were working on one particular model of amp all the time (or one particular amp) and you made some benchmark current readings with corresponding scope images, you would get a pretty good feel for biasing with the scope alone.
          All that being said, I learned by the scope method. If I bias by scope alone, then check with my meter, I always end up in a very reasonable ballpark. You need to know your scope fairly well to do this, and there are some types of amps it just won't work on (though this is rare).
          Yes it is quite a can of worms you opened. Sometimes I think a lot of the opposition to scope biasing is just making another excuse not to own a scope. But that's just my opinion.
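
          Since the method above pairs the scope with a plate current measurement, here is a minimal sketch of the arithmetic behind that meter check. It assumes a hypothetical 1-ohm cathode sense resistor per tube (so millivolts read as milliamps) and uses illustrative numbers for plate voltage and maximum plate dissipation; none of these values come from the thread, and cathode current includes screen current, so the true plate dissipation is a little lower than this estimate.

```python
# Rough bias-check arithmetic, assuming a 1-ohm cathode sense resistor per tube.
# Cathode current includes screen current, so plate dissipation is slightly overestimated.

def idle_dissipation(cathode_mv, plate_v, max_plate_w):
    """Return (idle current in mA, plate dissipation in W, percent of the tube's rating)."""
    idle_ma = cathode_mv / 1.0           # across 1 ohm, 1 mV reads as 1 mA
    watts = plate_v * idle_ma / 1000.0   # P = V * I
    return idle_ma, watts, 100.0 * watts / max_plate_w

# Illustrative reading: 35 mV, 450 V on the plates, a 30 W tube
ma, w, pct = idle_dissipation(35.0, 450.0, 30.0)
print(f"{ma:.0f} mA idle, {w:.1f} W, {pct:.0f}% of max plate dissipation")
```
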
          Originally posted by Enzo
          I have a sign in my shop that says, "Never think up reasons not to check something."




          • #6
            And seeing as the worms are out of the can, here are some controversial, opinionated, but very interesting articles:
            Tales From The Tone Lounge: Biased Opinions
            Tales From The Tone Lounge: The Idiot's Guide To (analog) Oscilloscopes!
            Originally posted by Enzo
            I have a sign in my shop that says, "Never think up reasons not to check something."




            • #7
              Originally posted by g-one View Post
              And seeing as the worms are out of the can, here are some controversial, opinionated, but very interesting articles:
              Tales From The Tone Lounge: Biased Opinions
              Tales From The Tone Lounge: The Idiot's Guide To (analog) Oscilloscopes!
              Ahh, The Tone Lounge.
              Bar none, the best tech site on the net.



              • #8
                My $.02

                Use your scope to bias. But not the way it's typically done.

                First you need to know how a player will be using the amp. If this is an amp that will be dimed a lot, then the user is not going to be happy with the excess crossover distortion that would result from using the standard scope method for biasing. So... If the amp will be dimed, I say bias so that there is a minimum of crossover distortion with the amp dimed AND within safe parameters. In other words, increase current to minimize crossover distortion but don't over-dissipate the tube. A scope can help with this because it will tell you visually what's happening to the crossover distortion at different current levels. It can actually be hard to detect crossover distortion audibly with a sine wave input because it tends not to "swirl". Not to mention the fact that it's unpleasant to listen to a sine wave beating the piss out of an amp. So in this respect it's also easier to look than listen.

                If the player usually uses the amp clean I say bias at the minimum current needed to lose crossover distortion before clipping, then increase current a tad to allow for possible changes due to tube wear.

                If the player is a tweener, plays clean most of the time but opens it up a little for some dirt occasionally, then have the player set up the gain they would consider max and bias as per the first example.

                My position is that any hard-and-fast appliance-repair rules that dictate bias procedure simply don't apply to guitar amps, because the tubes are being used differently and expected to do different things. Likewise, any lore or steadfast numbers like "75% of max dissipation" don't apply, simply because they don't take other operating conditions, or how the amp is used, into consideration.

                Bias to the minimum current that gets the job done WRT crossover distortion WRT how the amp will be used. Now...

                This can be a lot of current for a dimed amp. If, for example, you have an amp with a pair of 6L6s that has 490V on the plates and almost as much on the screens, and the player likes to crank it up, you wouldn't want to bias at 90% dissipation at idle just because the tubes aren't over their max and it seems to get rid of some offensive crossover distortion. It just puts too much continuous abuse on the tubes, and that's going to be a problem sooner or later. But with all the same criteria except a plate voltage of 390V, I wouldn't think twice about biasing to 90% if it sounded right. So you've got to use your noodle too.
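
                To put rough numbers on the 490 V vs. 390 V comparison above, here is a minimal sketch of the idle-dissipation arithmetic. It assumes a 6L6GC rated at 30 W of plate dissipation and ignores screen dissipation; the 30 W figure and the helper function are my illustration, not something specified in the post.

```python
# Idle current that puts a tube at a given fraction of its plate dissipation rating.
# Assumes a 30 W 6L6GC and ignores screen dissipation (simplification).

def idle_current_ma(plate_v, max_plate_w=30.0, fraction=0.9):
    """Idle current in mA per tube for `fraction` of the tube's rated dissipation."""
    return 1000.0 * fraction * max_plate_w / plate_v

for plate_v in (490.0, 390.0):
    print(f"{plate_v:.0f} V plates: 90% dissipation is about {idle_current_ma(plate_v):.0f} mA per tube")
```

                In both cases the tube idles at roughly 27 W; Chuck's point is that with 490 V on the plates (and nearly that on the screens) that level of sustained stress is too much once the amp gets cranked, while at 390 V it is tolerable.
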
                Last edited by Chuck H; 01-20-2012, 02:55 AM.
                "Take two placebos, works twice as well." Enzo

                "Now get off my lawn with your silicooties and boom-chucka speakers and computers masquerading as amplifiers" Justin Thomas

                "If you're not interested in opinions and the experience of others, why even start a thread?
                You can't just expect consent." Helmholtz



                • #9
                  Everyone has provided some great information; thanks!
                  Today, I tried using the O'scope to set the bias. It is easy to see why you wouldn't want to use only an O'scope, because the adjustment is quite dynamic. Still, I find it an incredibly useful tool. On this topic, why do so many techs refer to crossover distortion as a "notch" when it appears more like a "bump"? Referring to a "notch" confused the heck out of me.



                  • #10
                    Originally posted by AMPREPAIR View Post
                    Everyone has provided some great information; thanks!
                    Today, I tried using the O'scope to set the bias. It is easy to see why you wouldn't want to use only an O'scope, because the adjustment is quite dynamic. Still, I find it an incredibly useful tool. On this topic, why do so many techs refer to crossover distortion as a "notch" when it appears more like a "bump"? Referring to a "notch" confused the heck out of me.
                    I'm not sure, but on an SS amp crossover distortion may appear more as a 'notch'.
                    Last edited by JoeM; 01-20-2012, 03:34 AM.
                    "In theory, there is no difference between theory and practice. In practice there is."
                    - Yogi Berra



                    • #11
                      Originally posted by Chuck H View Post
                      My $.02

                      Use your scope to bias. But not the way it's typically done.

                      First you need to know how a player will be using the amp. If this is an amp that will be dimed a lot, then the user is not going to be happy with the excess crossover distortion that would result from using the standard scope method for biasing. So... If the amp will be dimed, I say bias so that there is a minimum of crossover distortion with the amp dimed AND within safe parameters. In other words, increase current to minimize crossover distortion but don't over-dissipate the tube. A scope can help with this because it will tell you visually what's happening to the crossover distortion at different current levels. It can actually be hard to detect crossover distortion audibly with a sine wave input because it tends not to "swirl". Not to mention the fact that it's unpleasant to listen to a sine wave beating the piss out of an amp. So in this respect it's also easier to look than listen.

                      If the player usually uses the amp clean I say bias at the minimum current needed to lose crossover distortion before clipping, then increase current a tad to allow for possible changes due to tube wear.

                      If the player is a tweener, plays clean most of the time but opens it up a little for some dirt occasionally, then have the player set up the gain they would consider max and bias as per the first example.

                      My position is that any hard-and-fast appliance-repair rules that dictate bias procedure simply don't apply to guitar amps, because the tubes are being used differently and expected to do different things. Likewise, any lore or steadfast numbers like "75% of max dissipation" don't apply, simply because they don't take other operating conditions, or how the amp is used, into consideration.

                      Bias to the minimum current that gets the job done WRT crossover distortion WRT how the amp will be used. Now...

                      This can be a lot of current for a dimed amp. If, for example, you have an amp with a pair of 6L6s that has 490V on the plates and almost as much on the screens, and the player likes to crank it up, you wouldn't want to bias at 90% dissipation at idle just because the tubes aren't over their max and it seems to get rid of some offensive crossover distortion. It just puts too much continuous abuse on the tubes, and that's going to be a problem sooner or later. But with all the same criteria except a plate voltage of 390V, I wouldn't think twice about biasing to 90% if it sounded right. So you've got to use your noodle too.
                      Chuck, I can't quite get my head around this, but I think you would have to have the power tubes near clipping to set the bias, or it would be set very cold. (Assuming someone sets the bias with the scope.)
                      "In theory, there is no difference between theory and practice. In practice there is."
                      - Yogi Berra



                      • #12
                        Originally posted by Jazz P Bass View Post
                        Ahh, The Tone Lounge.
                        Bar none, the best tech site on the net.
                        Let me suggest Randall Aiken. His site isn't working but you can get to the pages from the search results. Do click on "more results..."
                        My rants, products, services and incoherent babblings on my blog.



                        • #13
                          Originally posted by JoeM View Post
                          Chuck, I can't quite get my head around this, but I think you would have to have the power tubes near clipping to set the bias, or it would be set very cold. (Assuming someone sets the bias with the scope.)
                          If the amp is used clipping the power tubes hard and you want to minimize crossover distortion, then you would need to have the amp clipping hard when you adjust the bias for minimum crossover distortion. But of course you then need to check the idle current, since you don't want the tubes over-dissipating (or nearly) at idle and also working as hard as they can when conducting. In fact, on an AB1 amp with moderate to high plate voltage, if you attempt to bias out all crossover distortion under hard clipping you will likely end up over-dissipating the tubes at idle. You need to find a safe and acceptable compromise.

                          There are still other facets to it that are also relative to playing style. Like trying to get that classic metal/hard rock sound. Biasing for more current can make the amp feel softer and less dynamic, which is bad for this style of playing. So you bias for a little less current and end up with a little more crossover distortion but better dynamics.

                          My point was just that WRT guitar amps I think the tubes should be biased for the player's style and not to any arbitrary percentage or for maximum clean output (unless that IS the player's style). These aren't reference amplifiers or home theatre systems and they aren't used like those amplifiers are. Guitar amplifiers are used as instruments (or at least signal processors) and I think they require different criteria. Sometimes that means shorter tube life. Not always. Guys who work with tool steel go through more files than guys who work with mild steel. And guitar amps often go through more tubes than home stereo amps. Smoke 'em if you got 'em.
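
                          As a rough illustration of the compromise described above (a minimal sketch, not Chuck's procedure): given a candidate idle current found by chasing crossover distortion on the scope under clipping, you can check whether it already over-dissipates the tube at idle. The 30 W rating, the 480 V plate voltage, and the sample currents are illustrative assumptions, not values from the thread.

```python
# Sanity check: does a candidate idle current over-dissipate the tube at idle?
# The 30 W rating and the example numbers are illustrative assumptions.

def idle_check(plate_v, idle_ma, max_plate_w=30.0):
    """Return idle plate dissipation in W, percent of the rating, and a verdict string."""
    watts = plate_v * idle_ma / 1000.0
    pct = 100.0 * watts / max_plate_w
    return watts, pct, "over the rating" if watts > max_plate_w else "within the rating"

for idle_ma in (40, 55, 70, 85):
    w, pct, verdict = idle_check(480.0, idle_ma)
    print(f"{idle_ma} mA at 480 V -> {w:.1f} W ({pct:.0f}% of 30 W), {verdict}")
```
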
                          "Take two placebos, works twice as well." Enzo

                          "Now get off my lawn with your silicooties and boom-chucka speakers and computers masquerading as amplifiers" Justin Thomas

                          "If you're not interested in opinions and the experience of others, why even start a thread?
                          You can't just expect consent." Helmholtz



                          • #14
                            Originally posted by Chuck H View Post
                            WRT guitar amps I think the tubes should be biased for the player's style and not to any arbitrary percentage or for maximum clean output (unless that IS the player's style). These aren't reference amplifiers or home theatre systems and they aren't used like those amplifiers are. Guitar amplifiers are used as instruments (or at least signal processors) and I think they require different criteria. Sometimes that means shorter tube life. Not always.
                            This is an important point. Safe parameters leave quite a bit of leeway for individual preference. There are no magic numbers.
                            When Fender brought out "The Twin" (red knob), it had test points on the back with trim pots beside them, making it one of the first amps to have the bias easily adjustable by the user. The recommended setting was .04V per side, which worked out to 20mA per tube, quite cold! Not sure if they got a lot of negative feedback about the sound or what, but the newer model "Twin-amp" had the recommended setting marked as .06V. The owner's manual said this was the best compromise between tone and reliability, but you could set it at .04V for max reliability, or .08V for max tone. So we have a variance from 20mA to 40mA per tube, quite a difference but all within normal parameters.
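
                            For anyone wondering how .04 V per side becomes 20 mA per tube, here is the conversion as a small sketch. It assumes each test point reads the voltage across a 1-ohm resistor carrying the cathode current of the two output tubes on that side; the 1-ohm value is my assumption, inferred from the numbers above rather than stated in the post.

```python
# Converting The Twin's bias test-point voltage to per-tube idle current.
# Assumption: each test point reads across a 1-ohm resistor shared by the
# two output tubes on that side of the push-pull output stage.

SENSE_OHMS = 1.0
TUBES_PER_SIDE = 2

def per_tube_ma(test_point_v):
    """Per-tube idle current in mA for a given test-point reading in volts."""
    side_ma = 1000.0 * test_point_v / SENSE_OHMS
    return side_ma / TUBES_PER_SIDE

for setting in (0.04, 0.06, 0.08):
    print(f"{setting:.2f} V per side -> {per_tube_ma(setting):.0f} mA per tube")
```
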
                            Originally posted by Enzo
                            I have a sign in my shop that says, "Never think up reasons not to check something."




                            • #15
                              Originally posted by Jazz P Bass View Post
                              Ahh, The Tone Lounge.
                              Bar none, the best tech site on the net.
                              Tone Lizard For The Win!

