Can you accurately set the bias of a tube amp using cross-over distortion on an O'scope?
Using cross-over distortion to set bias
-
Thanks Jazz P Bass.
You are very correct! I should have avoided the word "accurately".
So you know, I am working on a Bugera 333XL and I want to install new output tubes. Bugera does not provide information about biasing output tubes other than using THEIR tubes. I don't own an octal bias tester (I don't like them). That said, I have never tried to adjust bias using the cross-over distortion method. Perhaps this is the time for me to give it a try?
-
I like to use it along with plate current measurement. If I can get away with biasing colder without crossover distortion, then I will sometimes go lower (idle current) than what the meter tells me is "on target". This is in the interest of tube reliability/lifespan.
As far as biasing by scope only, not a good idea. Hard to get consistent, reproducible measurements. I suppose if you were working on one particular model of amp all the time (or one particular amp) and you made some benchmark current readings with corresponding scope images, you would get a pretty good feel for biasing with the scope alone.
All that being said, I learned by the scope method. If I bias by scope alone, then check with my meter, I always end up in a very reasonable ballpark. You need to know your scope fairly well to do this, and there are some types of amps on which it just won't work (though this is rare).
Yes, it is quite a can of worms you opened. Sometimes I think a lot of the opposition to scope biasing is just making another excuse not to own a scope. But that's just my opinion.
Originally posted by Enzo: I have a sign in my shop that says, "Never think up reasons not to check something."
-
And seeing as the worms are out of the can, here are some controversial, opinionated, but very interesting articles:
Tales From The Tone Lounge: Biased Opinions
Tales From The Tone Lounge: The Idiot's Guide To (analog) Oscilloscopes!
Originally posted by Enzo: I have a sign in my shop that says, "Never think up reasons not to check something."
-
Originally posted by g-one View Post: And seeing as the worms are out of the can, here are some controversial, opinionated, but very interesting articles:
Tales From The Tone Lounge: Biased Opinions
Tales From The Tone Lounge: The Idiot's Guide To (analog) Oscilloscopes!
Bar none, the best tech site on the net.
-
My $.02
Use your scope to bias. But not the way it's typically done.
First you need to know how a player will be using the amp. If this is an amp that will be dimed a lot, then the user is not going to be happy with the excess crossover distortion that would result from the standard scope method of biasing. So, if the amp will be dimed, I say bias so that there is a minimum of crossover distortion with the amp dimed AND within safe parameters. In other words, increase current to minimize crossover distortion, but don't over-dissipate the tube. A scope can help with this because it shows you visually what's happening to the crossover distortion at different current levels. It can actually be hard to detect crossover distortion audibly with a sine wave input because it tends not to "swirl". Not to mention that it's unpleasant to listen to a sine wave beating the piss out of an amp. So in this respect it's also easier to look than listen.
If the player usually uses the amp clean, I say bias at the minimum current needed to lose crossover distortion before clipping, then increase current a tad to allow for possible changes due to tube wear.
If the player is a tweener (plays clean most of the time, but opens it up a little for some dirt occasionally), then have the player set up the gain they would consider max and bias as per the first example.
My position is that any hard-and-fast appliance-repair rules that dictate bias procedure simply don't apply to guitar amps, because the tubes are being used differently and expected to do different things. Likewise, any lore or steadfast numbers like "75% of max dissipation" also don't apply, simply because they don't take other operating conditions, or how the amp is used, into consideration.
Bias to the minimum current that gets the job done with respect to crossover distortion, given how the amp will be used. Now...
This can be a lot of current for a dimed amp. If, for example, you have an amp with a pair of 6L6s with 490V on the plates and almost as much on the screens, and the player likes to crank it up, you wouldn't want to bias at 90% dissipation at idle just because the tubes aren't over their max and it seems to get rid of some offensive crossover distortion. That just puts too much continuous abuse on the tubes, and it's going to be a problem sooner or later. But with all the same criteria except a plate voltage of 390V, I wouldn't think twice about biasing to 90% if it sounded right. So you've got to use your noodle too.
Last edited by Chuck H; 01-20-2012, 02:55 AM.
"Take two placebos, works twice as well." Enzo
"Now get off my lawn with your silicooties and boom-chucka speakers and computers masquerading as amplifiers" Justin Thomas
"If you're not interested in opinions and the experience of others, why even start a thread?
You can't just expect consent." Helmholtz
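Chuck's 90% figures are easy to sanity-check with a little arithmetic: idle current for a given fraction of max plate dissipation is just (fraction × max watts) / plate volts. A minimal sketch in Python, assuming a 6L6GC-class tube with the commonly published 30 W max plate dissipation (check the datasheet for your actual tube; the numbers below are illustrative only):

```python
# Rough idle-current target for a given fraction of max plate
# dissipation. max_watts=30.0 assumes a 6L6GC-class tube; verify
# against the datasheet for the tubes actually installed.

def idle_current_ma(plate_v, frac_dissipation, max_watts=30.0):
    """Idle plate current (mA) that idles the tube at
    frac_dissipation of max_watts with plate_v on the plate."""
    return frac_dissipation * max_watts / plate_v * 1000.0

# Chuck's two scenarios, both at 90% of max dissipation:
for v in (490.0, 390.0):
    print(f"{v:.0f} V plates, 90%: {idle_current_ma(v, 0.90):.1f} mA per tube")
```

Note the lower plate voltage actually needs *more* idle current to reach the same dissipation percentage, which is why the percentage alone, without plate voltage and playing style, tells you very little.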
-
Everyone has provided some great information; thanks!
Today, I tried using the O'scope to set bias. It is easy to see why you wouldn't want to use only an O'scope, because the adjustment is quite dynamic. Still, I find it an incredibly useful tool. On this topic, why do so many techs refer to crossover distortion as a "notch" when it appears more like a "bump"? Referring to the "notch" confused the heck out of me.
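The "notch" vs. "bump" confusion can be seen in a toy numeric model: in an under-biased push-pull stage, neither side conducts near the zero crossing, so a flat step gets carved out of the steepest part of the sine. Whether that flat spot reads as a notch cut into the slope or as a little S-shaped bump depends on the timebase and how hard the stage is driven. A minimal sketch (an idealized dead-zone model, not a circuit simulation; the dead-zone width is arbitrary):

```python
import math

# Idealized model: each half of a push-pull stage only conducts once
# the drive exceeds a small dead zone (tubes biased too cold).
DEAD = 0.1  # dead-zone width, arbitrary units

def push_pull(x, dead=DEAD):
    if x > dead:
        return x - dead   # upper side conducting
    if x < -dead:
        return x + dead   # lower side conducting
    return 0.0            # neither side conducting: the flat spot

samples = [math.sin(2 * math.pi * n / 64) for n in range(64)]
out = [push_pull(s) for s in samples]

# Near every zero crossing the output sits dead flat -- the classic
# crossover step carved into the steepest region of the sine.
flat = [y for x, y in zip(samples, out) if abs(x) < DEAD]
print(all(y == 0.0 for y in flat))
```

On a real scope the step is rounded rather than perfectly flat, which is part of why it can look like a bump on the rising slope instead of a clean notch.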
-
Originally posted by AMPREPAIR View Post: Everyone has provided some great information; thanks! ...
Last edited by JoeM; 01-20-2012, 03:34 AM.
"In theory, there is no difference between theory and practice. In practice there is."
- Yogi Berra
-
Originally posted by Chuck H View Post: My $.02. Use your scope to bias. But not the way it's typically done. ... So you've got to use your noodle too.
"In theory, there is no difference between theory and practice. In practice there is."
- Yogi Berra
-
Originally posted by Jazz P Bass View Post: Ahh, The Tone Lounge.
Bar none, the best tech site on the net.
-
Originally posted by JoeM View Post: Chuck, I can't quite get my head around this, but I think you would have to have the power tubes near clipping to set the bias, or it would be set very cold. (Assuming someone sets bias with the scope.)
"Take two placebos, works twice as well." Enzo
"Now get off my lawn with your silicooties and boom-chucka speakers and computers masquerading as amplifiers" Justin Thomas
"If you're not interested in opinions and the experience of others, why even start a thread?
You can't just expect consent." Helmholtz
-
Originally posted by Chuck H View Post: WRT guitar amps, I think the tubes should be biased for the player's style, not to any arbitrary percentage or for maximum clean output (unless that IS the player's style). These aren't reference amplifiers or home theatre systems, and they aren't used like those amplifiers are. Guitar amplifiers are used as instruments (or at least signal processors), and I think they require different criteria. Sometimes that means shorter tube life. Not always.
When Fender brought out "The Twin" (red knob), it had test points on the back with trim pots beside them, one of the first amps to have bias settings easily adjustable by the user. The recommended setting was .04V per side, which worked out to 20mA per tube: quite cold! Not sure if they got a lot of negative feedback about the sound or what, but the newer "Twin-amp" model had the recommended setting marked as .06V. The owner's manual said this was the best compromise between tone and reliability, but you could set it at .04V for maximum reliability, or .08V for maximum tone. So we have a variance from 20mA to 40mA per tube, quite a difference, but all within normal parameters.
Originally posted by Enzo: I have a sign in my shop that says, "Never think up reasons not to check something."