Increased AC mains voltages in the US--why?


  • #16
    There is much doubt, because you have never tried it yourself. It gets hot as hell.



    • #17
      Like I said, if it bothers anyone, the Vintage Voltage adapter is an easy fix. I guess there is much doubt because soundguruman never tried building one himself.

      I once took some old '50s-vintage transformers intended for 115V, 60Hz and ran a pair in series off 240V, 50Hz. Yep, they got pretty hot. RG's adapter circuit helped a little, but the wrong frequency was more than it could cope with. Running 16% under frequency raises the core flux as much as a 20% overvoltage would (flux scales as V/f), as far as core losses are concerned.
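
      If anyone wants to check the numbers, here is a minimal Python sketch (assuming the series pair split the 240V evenly, so each 115V winding saw 120V):

      # Core flux scales as V/f (from V = 4.44 * f * N * A * B), so the ratio
      # against the nameplate rating shows how far over design flux the core runs.
      def flux_ratio(v_applied, f_applied, v_rated, f_rated):
          return (v_applied / v_rated) * (f_rated / f_applied)

      # Each 115V/60Hz winding seeing 120V at 50Hz:
      print(f"{flux_ratio(120, 50, 115, 60):.2f}")  # 1.25 -> ~25% over design flux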
      "Enzo, I see that you replied parasitic oscillations. Is that a hypothesis? Or is that your amazing metal band I should check out?"



      • #18
        We have seen many, many of these power transformers burn up, partly because the amp was set up and biased for 110 volts, then overvolted.
        Also the difference between 50 cycle and 60 cycle current is a major contributor. The original transformer was made for 50 cycle AC.
        If the voltage makes no difference, how come they changed the design to 120 volt, 60 cycle transformers in the later models?
        Answer: because they burned up.
        An experienced tech would quickly recognize that the amp was running too hot. An inexperienced person would not grasp the problem.

        And as long as I am on the subject, the originals did not have screen grid resistors. When the design of the EL34 was changed, the amps started blowing fuses. Install the 1K screen resistors to help reduce the screen current.
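
        For what it's worth, here is a rough Python sketch of what the 1K resistor buys you (the 30mA peak screen current is a hypothetical illustration, not a measured EL34 figure):

        R_SCREEN = 1000    # ohms -- the 1K screen resistors mentioned above
        i_peak = 0.030     # amps -- hypothetical peak screen current
        # Plain Ohm's law: hard screen-current peaks drop voltage across the
        # resistor, pulling the screen voltage down and limiting the current.
        v_drop = i_peak * R_SCREEN
        p_diss = i_peak ** 2 * R_SCREEN
        print(f"drop: {v_drop:.0f} V, dissipation: {p_diss:.2f} W")
        # drop: 30 V, dissipation: 0.90 W -> use a 5W part for headroom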



        • #19
          Sure. But the point is, the amp doesn't fail because a 5% or 10% increase in line voltage burns up the transformer by itself.

          It burns up because the high line voltage makes all the other voltages in the amp too high, and that makes the tubes and filter caps wear out prematurely.

          I do agree that a Plexi without screen resistors is a bad idea. Even with 1k the screens of EL34s are getting abused.
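
          As a back-of-envelope Python sketch of that wear mechanism (the 450V B+ and 500V cap rating are hypothetical ballpark values, not measured Plexi figures):

          line_scale = 120 / 110    # every secondary voltage rides up with the line
          b_plus = 450 * line_scale # volts -- hypothetical nominal B+ at 110V line
          heater = 6.3 * line_scale # volts on the nominal 6.3V heater winding
          print(f"B+: {b_plus:.0f} V against a 500 V filter cap rating")
          print(f"heaters: {heater:.2f} V")  # ~6.87 V, hard on tube life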
          "Enzo, I see that you replied parasitic oscillations. Is that a hypothesis? Or is that your amazing metal band I should check out?"



          • #20
            Speaking of abusing the EL34: turn the bias down until the plates glow a tiny bit red...
            until the output tubes are screaming for mercy.
            That's when it sounds good.



            • #21
              1) The "doubt" bit was written to be kind towards you.
              Since thatīs wasted on you, Iīll rephrase my words:
              "No transformer in the world burns for receiving 10% more primary voltage" Period.
              Or if you want it closer to your original statement:
              "No transformer, which, so far had been running normally, becomes
              hot as Hades, meltdown eminent.
              after receiving 10% more primary voltage"
              If anything, because any "normal" transformer *is* designed considering that. Itīs standard procedure. Period.
              Iīm talking about *that* parameter, donīt add extra ones after the fact, such as biasing problems or wrong screen resistors.

              2) You also got it wrong.
              Also the difference between 50 cycle and 60 cycle current is a major contributor. The original transformer was made for 50 cycle AC.
              True, but you didnīt understand it.
              Going from *original* (British) 50Hz to your experience USA 60Hz (you do live in the USA, donīt you?) makes things easier for the transformer, the opposite to what you state.
              3)
              If the voltage makes no difference, how come they changed the design, to 120 volt 60 cycle transformers, in the later models?
              Answer: because they burned up.
              The voltage does make a difference, just not to the catastrophic ends you mention, by a long way.
              If primary voltage changes, of course it will be considered by the designer.
              Making a 60Hz only transformer is a pity, because it loses versatility. It does save some pennies though, which in a cut throat competition world means a lot.
              It does not make it "stronger" as you assume, but "weaker".
              To be more precise, itīs not "overbuilt" as a 50Hz one would.
              Juan Manuel Fahey



              • #22
                Originally posted by J M Fahey
                Making a 60Hz-only transformer is a pity, because it loses versatility. It does save some pennies though, which in a cut-throat competitive world means a lot.
                It does not make the transformer "stronger" as you assume, but "weaker".
                To be more precise, it's not "overbuilt" the way a 50Hz one would be.
                As the frequency goes down, the transformer size has to increase if it's built to operate at the same temperature.
                A DC transformer would have to be infinitely large...
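
                The standard transformer EMF equation makes that concrete; a minimal Python sketch (the turns and flux-density figures are arbitrary illustration values):

                # EMF equation: V = 4.44 * f * N * A * B_max. For fixed voltage,
                # turns and allowed flux density, core area A scales as 1/f.
                def core_area_m2(v_rms, freq, turns, b_max):
                    return v_rms / (4.44 * freq * turns * b_max)

                V, N, B_MAX = 115.0, 400, 1.2   # volts, turns, tesla (illustrative)
                for f in (60, 50, 25, 1):
                    print(f"{f:>3} Hz -> {core_area_m2(V, f, N, B_MAX) * 1e4:6.1f} cm^2")
                # Area grows as f falls; at f = 0 (DC) it diverges to infinity.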



                • #23
                  "As the frequency goes down, the transformer size has to increase if it's built to operate at the same temperature."
                  True, that's what I'm saying.
                  To keep things in context, we are talking about Marshall power transformers, which being "UK born" were designed for 50Hz.
                  Of course, they were multi-tap, were exported to the USA and, as I said, had a somewhat easier life (speaking of frequency).
                  Although designing a slightly cheaper 60Hz-only one was technically possible, I guess the extra inventory and ordering hassle was not worth it.
                  As an example: I build amps commercially, in series (around 20 a month), and I use the 100W power transformer *also* in the 60W model, for the exact same reason mentioned above.
                  Plus, Marshalls were "expensive" amps; they didn't *need* to slash a few cents.
                  Now it has changed: they sell cheaper amps to "Joe-over-the-corner", made in China, Korea, India, Vietnam or some other rice-fed country, and in much higher quantities; so making differentiated transformers for different markets (countries) makes more sense.
                  They even make a special "Argentina" version, go figure!!
                  They had to meet our new "CA" electrical safety rules, which are quite tough.
                  Have fun.
                  Juan Manuel Fahey



                  • #24
                    Originally posted by J M Fahey
                    Sorry but I very much doubt that a 10% voltage increase applied to a power transformer primary (120V vs 110V) sends it close to a meltdown just by itself.
                    I would agree with you--if a PT were designed with some degree of safety margin, but that's not always the case. There's a consensus, for example, that David Hafler under-designed the power transformers of many vintage Dynaco tube amps. They tend to run *very* hot, and it's not uncommon for one to burn itself up without abuse or a power supply short. With its original PT, you could watch the B+ supply of my ST-70 drop as the PT heated up over the course of an hour or so. I've now got an aftermarket PT in it, which runs much cooler, even after it's been on a while. The B+ is also much more stable.

                    So, if a PT is already running at the bleeding edge of its capabilities, a 10% voltage increase may be the last straw.

                    This might give rise to a new thread: What amps or makes of amps are known for inadequate power transformers?



                    • #25
                      "There's a consensus, for example, that David Hafler under-designed the power transformers of many vintage Dynaco tube amps. They tend to run *very* hot, and it's not uncommon for one to burn itself up without abuse or a power supply short."
                      Fine.
                      Don't worry, we are not disagreeing, because we are talking about different things.
                      You went looking for an example of a transformer so poorly designed that, even used "normally", it overheats and can even burn up all by itself.
                      Really, a bad apple.

                      We were talking Plexis here, meaning well-made British Dagnalls and Drakes, which, what's more, benefited from being used on 60Hz mains instead of the original 50Hz.
                      Different stuff.

                      Raising the line frequency has two favorable (and related) effects:

                      1) Since the winding inductance (and hence its reactance at the higher frequency) is more than strictly needed, the magnetizing current is lower.

                      For those in doubt: magnetizing current is the current absorbed by the transformer primary with the secondary unloaded, so it's an intrinsic loss of any real-world transformer.

                      In any well-designed transformer, its value is only a few percent of the maximum current absorbed when fully loaded.
                      Meaning: if it's, say, 5% of maximum current and our transformer is calculated for, say, 2 Amp primary current (220VA @ 110V), the magnetizing current will be around 2000mA x 0.05 = 100mA.
                      If we now apply 120V to that primary, the magnetizing current will rise to about 100 x 1.1 = 110mA. Real increase: about 10mA.
                      It's easy to see that a 10mA increase in a transformer rated for 2A (2000mA) primary current is nil.
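
                      The same arithmetic as a quick Python sketch (same assumed figures as above: 5% magnetizing current, 2A primary):

                      i_full = 2.000               # amps, full-load primary current (220VA @ 110V)
                      i_mag_110 = 0.05 * i_full    # magnetizing current: 100 mA at 110V
                      # Linear approximation used above: magnetizing current scales
                      # with primary voltage, so +10% volts -> ~+10% magnetizing current.
                      i_mag_120 = i_mag_110 * (120 / 110)
                      print(f"increase: {(i_mag_120 - i_mag_110) * 1000:.0f} mA")
                      # ~9 mA (rounded to 10 mA above) -- nil against a 2000 mA rating.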

                      Transformers are "mechanically" simple (just some wire and iron) but not so "magnetically", because hysteresis curves are *anything* but linear.
                      Just look at *any* transformer iron's curves.

                      That's why what *looks* intuitive is not so, and proper results demand doing the math rather than guessing.

                      2) Raising the frequency from 50Hz to 60Hz (a 20% increase) lowers the peak core flux in the same proportion, since flux scales as V/f; that clearly more than compensates for the 10% primary voltage increase we are talking about, leaving the net flux at (120/110) x (50/60), about 91% of the design value.
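
                      A one-line Python check of that claim (pure V/f scaling, ignoring saturation effects):

                      # Net flux going from the 110V/50Hz design point to 120V/60Hz mains:
                      # flux scales as V/f, so the 20% frequency rise beats the 10% voltage rise.
                      print(f"{(120 / 110) * (50 / 60):.2f}x design flux")  # 0.91x -> the core runs cooler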

                      As I said before, we are talking about the transformer itself here.

                      Poor biasing, cheap design, etc. are *something else*.

                      EDIT: if somebody wants to get somewhat deeper into this:
                      http://mysite.du.edu/~jcalvert/tech/transfor.htm
                      Last edited by J M Fahey; 12-11-2011, 03:14 AM. Reason: Found interesting page
                      Juan Manuel Fahey

