Bottom Line: Tube Dissipation and its effects on tone


  • #16
    Originally posted by pdf64 View Post
    I suspect that the key aspect being adjusted by the bias control is the conduction angle.
    The plate dissipation at idle required to achieve a reasonable conduction angle may be immaterial.
    ie with a class A amp, if under certain operating conditions, 360 degree conduction occurs with say 80% plate dissipation, there's no theoretical benefit running the plate/s hotter; but might a 'deeper' class A, idling at 100%, sound any different? If so, why?
    No, you're correct, but I'd say it a different way. The conduction angle only has meaning as an independent quantity when the signal is at some reference level. A good reference point for that would be peak output voltage just before clipping happens.

    What you're actually adjusting is how much voltage the signal from the PI has to push the grid to get that tube to turn on, not any particular conduction angle.

    Consider what "Class A" means: neither tube turns off at any signal level. It is POSSIBLE to run PP tubes so that they never turn off but don't put out much plate voltage swing compared to the plate supply, either. If you have a PP output stage biased with each tube conducting at idle, but need 10V of peak signal before one of them turns off, then below 10V peak grid signal (and whatever that translates to on the plates' voltage swing), they are running Class A. That's by definition - neither one ever cuts off entirely. But then you never get to use most of the power supply's ability to supply power or the tubes' ability to manage that power. It's what I'd call a degenerate example. You can do it, easily, but why would you? This only makes sense if the plate DC supply is just a bit bigger than the voltage needed to make the tubes not quite be able to turn off. Any bigger power supply and you're wasting iron, copper, and money. Designs can be done this way, but they don't get done this way.

    Class A only makes sense out at the edges, where you're swinging a big signal on the plates, as big as you can without melting the tubes. The output power is low enough anyway.

    If you have the same output stage, and the same bias point (that is, it takes 10V peak to turn a tube OFF; the ON tube will take care of itself), but raise the plate DC supply a little, then you can put in a signal bigger than 10V peak, and now one tube turns off on peaks, but the other tube keeps conducting. The conduction angle will be, say, 350 degrees for each tube. The DC plate supply can be bigger now, and you get a bit more power out.

    If you raise the plate supply some more, and put in an even bigger grid signal, perhaps 20V peaks, then both tubes are on for all the time the signal is between +/-10V, but one or the other turns completely off when the signal is between 10V and 20V. Exactly what the conduction angle is varies with the signal level.

    It's only when you max out the DC plate voltage and signal drive voltage that you get close to one tube conducting for 180 degrees and the other for 180 degrees. The limit of class B requires that the grid bias on both tubes be exactly enough to cut the tube off completely, minus a gnat's eyelash. When you get there, you're independent of the DC supply level.

    The class of operation is poorly defined by conduction angle. What all those old texts mean is things like "less than 180 degrees", "more than 180 degrees but less than 360 degrees" WHEN THE SIGNAL ON THE PLATES IS SWINGING AS BIG AS IT CAN GIVEN THE DC POWER SUPPLY IT HAS TO WORK WITH.

    It's a sloppy definition. It would be much easier to understand if instead it was defined as how you get it - the amount of overlapped conduction compared to the cutoff voltage for the tubes. That's what really matters. Unfortunately, that changes from tube to tube.
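    To put rough numbers on how the conduction angle slides around with drive level, here is a minimal sketch in Python. It assumes an idealized tube that conducts whenever the grid drive is above its cutoff, a sine drive, and the 10V cutoff figure from the example above; real tubes taper off gradually rather than switching cleanly, so treat the output as illustrative only.

```python
import math

def conduction_angle_deg(v_cutoff, v_peak):
    """Degrees per cycle that one push-pull tube conducts, for a sine drive.

    v_cutoff: peak grid signal (V) that just cuts the tube off
    v_peak:   actual peak grid drive (V)
    Idealization: the tube conducts whenever the drive is above -v_cutoff.
    """
    if v_peak <= v_cutoff:
        return 360.0  # never cuts off: Class A by definition
    k = v_cutoff / v_peak
    return 180.0 + 2.0 * math.degrees(math.asin(k))

for vp in (10, 10.5, 20, 100):
    print(f"{vp:>5} V peak drive -> {conduction_angle_deg(10, vp):5.1f} degrees")
# prints 360.0, then roughly 325, 240, 191 - sliding toward Class B's 180 as drive grows
```

    Same stage, same bias point; only the drive level changed, which is the point above.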

    and as most designs push their power tubes to/beyond their limits, setting operation to a suitable idle plate dissipation may be the best solution in the real world.
    It's OK as a starting point, but I keep seeing it quoted as the gospel. It's not.

    Well, it may be. There may be some deep set of math involving plate swings with a set of assumptions on grid waveform and the relation of DC supply to peak waveform, etc. that gives this.

    But in the real world, the plate supply can't be set that way. So some percentage of the maximum on the tubes is useful as a gross starting point, but not as something to be aspired to. It may even be a statistical truth, underlying a bigger set of variables than has been considered.

    In the real world, biasing to some percent dissipation probably only means that you're statistically safe there, and that you probably won't run into gross crossover nor runaway tubes.



    • #17
      Originally posted by Jazz P Bass View Post
      If the bias concept is to simply get out of Class B & into Class AB (thus minimizing crossover distortion) where did the 70% thingy come from?
      It's actually for two things: (1) to get as far away from crossover distortion as you're comfortable with, and (2) to get as much audio power out as your particular amp's set of DC power supply and output tubes can. Where it came from, I haven't a clue. As you can tell, it mystifies me, even when I go think about it. As I just posted, maybe there's a deeper level of math that results in it being a right(ish) answer, but I don't see it. Yet. As a rule of thumb, a starting point for those who don't want to know or don't have the equipment or disposition to dig into it, it's probably safe and OK.

      Ish.

      And if you want to get picky about it, wouldn't a scope be required to actually see the crossover distortion?
      Yes. Proper biasing is done by scope. You bias it cold and just make the crossover notches disappear. Well, you just make them hide inside the higher-level conduction of the "on" tube. Most people who use scopes bias until the notches disappear, then add a smidgen more "on" bias, as they know it sounds a little better. The trap there is that smidgen meters are harder to use than even oscilloscopes, and so the smidgens get bigger with time until your tubes get hot.

      I would imagine that you could accomplish the fact by listening as the bias is slowly raised.
      Yes, if ears were calibrated. In fact, most people creep more towards Class A over time, as it does make the amp sound nicer; less crossover is better. They quit when the tubes start overheating.

      A really fussy - or detail oriented - person would get the amp biased by scope to eliminate crossover, then hook in a distortion or spectrum analyzer and turn it until a (1) fixed and (2) written-down amount of distortion content was achieved. Crossover drops a lot as you get away from Class B into AB, then drops more slowly as you get more deeply into AB towards A. It doesn't really disappear until you actually hit Class A, and you only get to Class A when you are running maximum grid signal and maximum plate signal at that plate DC supply.

      If you do it by ear, you run a real risk of exceeding the poorly known point where your tubes melt under maximum signal - and you don't see that until you try it at max signal, like in performance. Oops. Class A is seductive. You need a place to stop.
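      To put a rough shape on that "drops a lot, then more slowly" behaviour, here is a minimal numeric sketch. It assumes an idealized 3/2-power tube law (a crude stand-in for real curves) and just compares the incremental gain of the push-pull pair at the crossover point against the gain at the top of the swing; "overlap" is how far past cutoff each tube idles, normalized to a peak drive of 1.0. This is purely illustrative, not a biasing procedure.

```python
import math

def incremental_gain(v, overlap):
    """Slope of the combined push-pull transfer at drive level v, assuming each
    idealized 'tube' follows a 3/2-power law above cutoff and the two halves
    are combined in antiphase by the output transformer."""
    up = 1.5 * math.sqrt(overlap + v) if overlap + v > 0 else 0.0
    down = 1.5 * math.sqrt(overlap - v) if overlap - v > 0 else 0.0
    return up + down

for overlap in (0.0, 0.05, 0.2, 0.5):
    g_cross = incremental_gain(0.0, overlap)  # gain right at the crossover point
    g_peak = incremental_gain(1.0, overlap)   # gain at the peak of the drive
    print(f"overlap {overlap:4.2f}: crossover gain {g_cross:.2f} vs peak gain "
          f"{g_peak:.2f} (ratio {g_cross / g_peak:.2f})")
# ratios 0.00, 0.44, 0.82, 1.15 - the notch fills in quickly at first, then slowly
```

      The notch (a ratio well below 1) shrinks quickly with the first bit of overlap and then improves only slowly, which is the same shape described above.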

      While there very well may be sonic differences when the bias is raised too high, I don't get why you would want to do that.
      (from a tech stand point)
      You wouldn't. But then many people who bias amps have no clue what they're doing and bias by ear. It's the source of a lot of techs getting amps that sounded really good just before they blew up.

      Then again, tinkering with tube amps is kind of like what the hotrod dudes did with their cars back in the '50s.
      It is exactly that. We lost a lot of '57 Chevys that way, though, not knowing that we needed shot-peened heads and titanium rods to go with the dual quad carbs.



      • #18
        Originally posted by Jazz P Bass View Post
        If the bias concept is to simply get out of Class B & into Class AB (thus minimizing crossover distortion) where did the 70% thingy come from?
        * Ignorance

        * "Forum truths"

        * Hearsay.

        * It's a "simple concept", involves simplest Math, zero testing/scoping/whatever.

        Notice that nobody ever shows scope captures of amps with tubes biased at 70% compared to other settings.

        * Like most good lies, it involves *some* truth, which makes it "believable". The truth is that tube curves are quite nonlinear (duh!); in particular, gain decreases with colder biasing and increases with hotter biasing. A regular musician can easily hear that and think "if there is an improvement in this area, everything else must also improve" .... which may or may not be true.

        * For one, maximum power output is *decreased* ... although a musician may be fooled because, when testing alone in a bedroom or at low power in a studio, a hotter-biased amp has more gain, so it "is louder at 1 or 2 than it was before" ... he will think that this applies at all volume levels.

        * On the other hand, *designer*-set bias is as low as possible without getting crossover distortion; that's universal except, maybe, among some boutique "designers" who in general are not too hot on the tech side.


        And if you want to get picky about it, wouldn't a scope be required to actually see the crossover distortion?
        Of course.

        I would imagine that you could accomplish the fact by listening as the bias is slowly raised.
        Hard to hear with a guitar, which is distortion tolerant; try playing classical music or jazz solo piano, lots of complex chord work ... crossover distortion is *unbearable* there.

        While there very well may be sonic differences when the bias is raised too high, I don't get why you would want to do that.
        More gain ... reduced power makes the amp easier to overdrive ... all "good" things.
        Loss of power is harder to detect and anyway, most musicians nowadays buy way more power than they actually need.
        Although, of course, "somebody might invite them to open for the Rolling Stones" or Woodstock II or something, so better be ready.
        And TV/Video/YT Guitar Gods use a wall of Marshalls anyway.

        End of rant, entering Tech mode:
        Generic 6L6 plate curves [attached chart].

        Let's look at extremes to make it clearer; the nonlinearity actually is all over the chart.
        Considering a 5V grid voltage variation, simply because it's neatly shown, notice the huge current variation going from -5V to 0V, and the tiny one going from -50V to -45V.

        Since the tube is transformer-coupled to the load, the speaker, which is *current*-driven after all, will move more (i.e. sound louder) if fed more current, so it will sound louder with a hot-biased amp than with a cold-biased one.

        Why? Voice coil push/pull force is directly proportional to current. The full expression is BLI, a parameter shown in good speaker datasheets: B is the magnetic flux density, L is the wire length immersed in that magnetic field, and I is the current passing through the wire. That's why.
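        To put the formula in concrete terms, a tiny sketch with made-up numbers (a Bl around 10 T·m is merely a plausible ballpark for a 12" guitar speaker, and the current peaks are illustrative, not measured):

```python
# F = B * l * I : push/pull force on the voice coil is proportional to the current.
Bl = 10.0  # force factor B*l in tesla-metres, an assumed datasheet-style figure
for label, i_peak_a in (("smaller current swing", 2.0),
                        ("larger current swing", 3.0)):
    force_n = Bl * i_peak_a  # newtons of force on the cone at the current peak
    print(f"{label}: {i_peak_a:.1f} A peak -> {force_n:.1f} N")
```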

        So why don't amp manufacturers hot bias all amps?
        Because, and this is important and nobody mentions it in forums, tubes are current limited: the higher the idle current, the lower the available power. You are wasting current/power capability, which in tubes is expensive to get anyway.
        Or: if the roof height is fixed, raising the floor will *reduce* room capacity.
        Or:
        Extra idle current is a waste, because the max is fixed and the tube can only go from idle to max; the transformer does NOT transmit idle (DC) current or idle tube dissipation, only AC, meaning only the variation from idle to max.

        If tube max current is, say, 250mA, then with 50mA idle your current peak can be 200mA; with 100mA idle it's only 150mA peak, and so on.

        That's why an Engineer won't be caught dead using a lot of "useless" idle current.

        That it also increases idle plate dissipation is just another side of the coin, but since it's easier to grasp, it's what's usually mentioned.

        If anybody wants to check the earlier posts showing graphic design of tube stages, he may compare 2 designs: first optimize for max power, meaning max possible current and voltage swing (available current, knee voltage, and dissipation must all be considered); then repeat the same, changing nothing but idle current ... and calculate the power loss.
        It's eye opening.
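        A quick back-of-the-envelope version of that comparison, following only the simplification above (usable peak swing = max current minus idle current, and power into a fixed transformer-referred load going with the square of the swing; the 250mA figure is the hypothetical one from a few lines up, not any particular tube's rating):

```python
def headroom(i_max_ma, i_idle_ma):
    """Peak AC current swing left over, and output power relative to idling near
    zero, per the simplification in the post: the tube can only swing from idle
    up to its maximum current, and power scales with the square of that swing."""
    i_peak = i_max_ma - i_idle_ma
    return i_peak, (i_peak / i_max_ma) ** 2

for idle_ma in (50, 100, 150):
    peak_ma, rel_power = headroom(250, idle_ma)
    print(f"idle {idle_ma:3d} mA -> {peak_ma:3d} mA peak swing, "
          f"~{100 * rel_power:.0f}% of the near-zero-idle power")
# 50 mA idle keeps ~64%, 100 mA ~36%, 150 mA ~16% - hotter idle, less maximum power
```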

        EDIT: what I called "gain increase" is actually a transconductance (mA variation per volt of grid variation) increase; of course, since it's driving a fixed load, it translates in practice into a gain increase.
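        For the same kind of idealized 3/2-power law, transconductance goes as the cube root of idle current, so the "hotter bias = more gain" effect is real but modest; the numbers below are relative only, with no particular tube assumed.

```python
# For plate current proportional to (grid drive)**1.5, gm = di/dv goes as drive**0.5,
# which works out to (idle current)**(1/3).
reference_ma = 50.0  # arbitrary reference idle current for the comparison
for i_idle_ma in (30.0, 50.0, 70.0):
    gm_relative = (i_idle_ma / reference_ma) ** (1 / 3)
    print(f"{i_idle_ma:.0f} mA idle -> gain into a fixed load ~{gm_relative:.2f}x "
          f"the {reference_ma:.0f} mA reference")
```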

        Edit 2: pure speculation, but I guess "70% dissipation" (or thereabouts) was probably the largest abuse they could get away with.

        Another explanation of why "old tubes lasted forever, modern tubes last less than a year".

        FWIW Stan/km6z mentioned that very busy old Russian pro musicians (think weddings, parties, whatever) had been using the same tubes for 20/30 years or more ... and in general they tested fine ... the explanation is that they were used "by the book": no Leo abuse, just factory-recommended operation, say 360V on the plates, 250V on the screens, conservative dissipation, and so on.

        Yes, a couple of 6L6s put out some 30W instead of 50W ... plenty enough for Uncle Vania and his accordion.

        As a side note, more than the schematics, I'd LOVE to see what speakers those old Soviet amps used.



        • #19
          I need to crack the old amp open. With all this new info at hand it will make for interesting testing. RG, thank you for all the time spent, and everyone else of course. My little experiment was done on JJs that were 3 years worn in. Currently I'm using old stock GEs (6L6GC), again not new, but hopefully with a lot of life in them. I've been too careful with these GEs; it's time to crank up the heat and hear what happens. A never-ending search for tone.

          A quote from "The Guitar Amp Hand Book" By Dave Hunter. "To put it simply, the harder you push the tubes with respect to plate voltage (assuming a bias point adjusted accordingly) the closer you get to achieving the tube type's maximum wattage capability, and a high level of clean headroom." P.61

          ^^This is what established my thoughts on bias points until this thread. I now believe I was understanding his statement incorrectly. He is referring to the B+ voltage to the plates, not the bias point, when focusing on levels of distortion?

          I know I'm potentially about to bite off more than I can chew with this statement, but the same exact amp at 460V B+ with proper bias will be firmer and cleaner than at 300V B+ with proper bias, which will distort more quickly.

          -Dalton



          • #20
            Update: So I opened up the ol' amp and got to work. After some sound tests, I went from around 14.5 and 12.7 watts dissipation to 18.8 and 17.1 watts respectively. Sometimes it pays to be wrong, as was the case with my original post. To each their own, of course; since I attenuate to allow myself to push the amp to its limits, it was best to push the power tubes harder. The distortion was more pleasing to the ear, more harmonics came through, and I get more of that "tube" sound; I'm sure there is a better term, but you get the idea. Once I become more precise in the theory of tube biasing, I may even push the tubes a bit harder just to hear the difference; until then I'm playing it a little safe, I suppose.
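            For anyone following the arithmetic behind those figures and the "70% thingy", here is a small sketch using the 6L6GC's 30W plate-dissipation rating and ignoring screen current and cathode-voltage details; the two B+ values are the ones mentioned in the previous post, everything else is illustrative.

```python
P_MAX_W = 30.0  # rated plate dissipation of a 6L6GC

# The before/after idle dissipation figures above, as a fraction of the rating.
for p_watts in (14.5, 12.7, 18.8, 17.1):
    print(f"{p_watts:4.1f} W idle -> {100 * p_watts / P_MAX_W:4.1f}% of max dissipation")

# Idle current the 70%-of-max rule of thumb would ask for, at two plate voltages.
for b_plus_v in (300, 460):
    i_idle_ma = 1000 * 0.7 * P_MAX_W / b_plus_v
    print(f"70% rule at {b_plus_v} V on the plate -> about {i_idle_ma:.0f} mA per tube")
```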


            -Dalton
            Thanks guys! Yet another lesson (or 17) has been learned.



            • #21
              Originally posted by Justin Thomas View Post
              The few times I've tried twiddling my bias controls so radically, I find that running my tubes at high dissipation results in more hum, a fatter sound, and less clean at a particular volume setting. Great for blues. Running them at lower dissipation gave less hum, cleaner/thinner - sparkly? But it also allowed me to crank the amp on 10 and not kill the tubes. Also seems more pedal-friendly...

              When you crank either on 10, they both distort nicely, depending on what you're after - blues, or metal? Personally, I wonder if the eq of the tone stack doesn't have more to do with it. I play Fenders mostly, but also some Marshalls and custom circuits, so they do exactly what I tell them to do, no matter how hot I bias them.

              I don't follow an internet "rule." I bias where they sound good for what I do, without destroying a good tube in the process. I hate to tell folks, but tubes won't be getting any cheaper in the future. Usually it ends up somewhere in the 50-65% range, depending on the power tubes and amp circuit. Also, I don't want to run so hot that if something in the tube or circuit "self-tweaks" a little, then one tube is pushed over the edge and self-immolates... a slightly cool amp is a whole lot better than a dead amp.

              Stan
              Agree completely with this post. My differences are noted in RED



              • #22
                Originally posted by Jazz P Bass View Post
                If the bias concept is to simply get out of Class B & into Class AB (thus minimizing crossover distortion) where did the 70% thingy come from?

                And if you want to get picky about it, wouldn't a scope be required to actually see the crossover distortion?

                I would imagine that you could accomplish the fact by listening as the bias is slowly raised.
                Originally posted by J M Fahey View Post
                Hard to hear with a guitar, which is distortion tolerant; try playing classical music or jazz solo piano, lots of complex chord work ... crossover distortion is *unbearable* there.
                But if you couldn't hear it, what reason do you have to bias the tube hotter to eliminate it? A philosophical discomfort with the presence of inaudible distortion? I think his question makes a lot of sense. If you believe you're biasing the tube hotter to remove an objectionable sound, then it's logical to think the ear should be just as good as (or better than) a scope at telling you when that objectionable sound has been sufficiently removed, right?



                • #23
                  That's one thing I wonder about. We have precision measuring tools but most things are either too much, too little, or just right. I have read that it's easy to bias too hot if you are just using your ear, YMMV.

