When did the US change from 110v ungrounded line power to 120v grounded? And was it done gradually over a period of time, or pretty quickly? I'm trying to get a sense of internal voltages in older amps with ungrounded line cords. When I get an amp needing a power cord upgrade, it usually makes sense if the B+ is 10 - 12% higher than the schematic, but I'm not sure where that line should be drawn.
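A quick sanity check of that 10 - 12% figure (the 410V schematic number below is invented, and this assumes B+ tracks the line voltage roughly linearly through the power transformer):

```python
# Rough check: B+ scales about linearly with primary voltage through
# the power transformer (ignoring regulation and load effects).
schematic_line = 110.0    # line voltage the old schematic assumed
schematic_b_plus = 410.0  # B+ on the schematic (made-up example figure)
modern_line = 120.0

ratio = modern_line / schematic_line
print(f"scaling factor: {ratio:.3f}")                          # ~1.091
print(f"expected B+ today: {schematic_b_plus * ratio:.0f} V")  # ~447 V
```

So a straight 110-to-120 change explains roughly 9% on its own.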
vintage line voltage timeline
Doesn't the ratings plate say the rated AC line voltage? I think that's been required for a long, long time.
Originally posted by R.G.:
Doesn't the ratings plate say the rated AC line voltage? I think that's been required for a long, long time.
Nah, they are just saying the amp should work within that range, which should cover most places. What they didn't want was to label something 120v and have someone say "Oh darn, this is for 120v and we have 110 at home."
And if you think that doesn't happen, here is a quote from a question in another forum, Home recording:
I'm currently bidding on a piece of rack gear, just checked in on it and noticed something in the description I'd missed, i says '117 volts.' Is that normal for US? If not, is there a converter?
A converter? Sure, there's one on the pole right by your house.
As to voltage: 110v is a really old spec. Like anything, people hang onto the term; there are folks who to this day refer to the wall outlet as "110". I couldn't tell you exactly when the power industry stopped trying to put 110v in your walls, but when I was a kid in the 1950s it was already "117v", and today it's 120v.
If you are looking for some magic line in the calendar when things went overnight from 110 to 120, I think you will be disappointed. Voltages were rarely stable over time or area. If you lived near a substation, your voltage would be higher than at the end of the branch. Ask anyone who lives at the end of a country road. Voltages have crept upwards.
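The end-of-the-country-road point is just resistive drop across the feeder. A toy sketch, where every number is invented purely to show the shape of it:

```python
# Toy illustration of end-of-line voltage sag: volts lost across the
# feeder's own wire resistance. All figures are invented for illustration.
substation_v = 122.0     # volts at the substation end (hypothetical)
feeder_resistance = 0.5  # ohms of wire, round trip (hypothetical)
load_current = 10.0      # amps drawn at the far end (hypothetical)

drop = load_current * feeder_resistance
print(f"at the end of the road: {substation_v - drop:.0f} V")  # 117 V
```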
Grounded outlets have been around a LONG time, just not in all homes for a lot of it. Once the electrical code was updated to require them, all NEW construction had to have the ground, but plenty of older homes still have two-hole outlets. My 100-year-old farmhouse, for example: I have a modern service panel in the basement and have run grounded branches to the kitchen and my shop/garage, but not the rest.
I used to see 115v here and there, but that figure was never very popular.
Here is a 1953 transformer catalog, the power transformers are listed as 117vAC.
http://bunkerofdoom.com/xfm/chicago1953/005.jpg
And even 1948 has 117v:
http://bunkerofdoom.com/xfm/STANCOR_...48_WEB/008.jpg
Here is a 1938 catalog, and it shows primaries at 115v.
http://bunkerofdoom.com/xfm/THORDARS...darson400C.pdf
A 1935 catalog also 115v.
So it seems even as early as 1935, "110v" was a passé standard.
Did some looking. Best I can tell from some quick searches:
Thomas Edison invented the light bulb, and set about building generating plants to sell electricity and light bulbs. His carbon filament bulbs worked best, in his view, at 100V. DC has transmission losses that can't be made up with transformer taps the way AC can, so the generators ran at 110-120V to make up for some of the losses. Electrical distribution was for *lighting*, the killer app that economically powered the electrical revolution.
Tesla invented polyphase AC in the sense we know it now, and George Westinghouse commercialized AC power distribution. Westinghouse and Edison fought it out in the market, sometimes bitterly. Edison, for instance, decided that execution by electrocution, then thought of as a humane way to execute criminals, should be referred to as being "Westinghoused". Westinghouse generated 110Vac nominal to power Edison's bulbs. Still, there are always transmission losses. Over time, more and more places signed on for electric lights to replace the obvious fire hazards of candles and oil lamps, and the advantages of AC distribution won. The generator guys figured out that generating 3-phase was more efficient, and generating at 200+ volts was a good thing for most local power use, so they developed the 200/208/220 system for local transmission, and split the 200+ line into two halves to properly power the 100V nominal light bulbs that were what electrification was originally about.
Since the voltage had already been crept up to make up for varying losses, people nearer the generation got higher voltage than those further away, so equipment had to be voltage tolerant. Tolerating 100VAC to 110 was needed from the start. Later, 120VAC tolerance was needed, and metal-filament bulbs did well there and had longer life than carbon filament bulbs.
Then people discovered that you could run motors - and refrigerators! - from AC power, and toasters, and air conditioners and computers... and these vampire apps had to tolerate the vagaries of the electrical power built for light bulbs.
And built crudely by today's standards: 10% was remarkably precise in those days. Over time the need for absolutely reliable power grew, and having *enough* power trumped the inability to make it precise. The powered device was expected to make the power as precise as it needed.
The system of three-phase alternating current electrical generation, transmission, and distribution was developed in the 19th century by Nikola Tesla, George Westinghouse and others. Thomas Edison developed direct-current (DC) systems at 110 V, and this was claimed to be safer in the battles between proponents of AC and DC supply systems (the War of Currents). Edison chose 110 volts to make high-resistance carbon filament lamps both practical and economically competitive with gas lighting. While higher voltages would reduce the current required for a given quantity of lamps, the filaments would become increasingly fragile and short-lived. Edison selected 100 volts for the lamp as a compromise between distribution costs and lamp costs. Generation was maintained at 110 volts to allow for a voltage drop between generator and lamp.

In the United States and Canada, national standards specify that the nominal voltage at the source should be 120 V and allow a range of 114 to 126 V (RMS) (-5% to +5%). Historically 110, 115 and 117 volts have been used at different times and places in North America. Mains power is sometimes spoken of as 110; however, 120 is the nominal voltage.

From: "Why was 120V chosen as the standard voltage of homes in the US and not some other voltage?"
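The quoted 114 - 126V band is just the nominal figure with a 5% allowance either way. Worth noticing that the old "110v" nominal actually falls below today's floor, while 115 and 117 sit inside it:

```python
# The quoted US standard: 120 V nominal, +/-5% allowed at the source.
nominal = 120.0
tolerance = 0.05
low = nominal * (1 - tolerance)
high = nominal * (1 + tolerance)
print(f"allowed range: {low:.0f} V to {high:.0f} V")  # 114 V to 126 V

# How the older nominal figures compare to today's band:
for old in (110, 115, 117):
    print(old, "inside today's band:", low <= old <= high)
# 110 False, 115 True, 117 True
```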
Over time it crept up. It's not uncommon to see AC power as high as 135V in suburbs on a spring evening, when all the air conditioners that have been humming in the hot mid-day start cycling off in the cool evening and the power adjusters haven't caught it yet.