I always thought that the proper way to avoid voltage drops was to use low-resistance wire of adequate gauge and ampacity, rather than to overvolt at the supply end. I guess I'm fortunate that I live in an area where we've never been subjected to brownouts or rolling blackouts, and I've never seen my wall voltage sag. Maybe the fact that my distribution transformer is on a pole across the street and the run to my house is only about 75 feet is working in my favor. And then there's the fact that the guy who built my house used 10 ga wire everywhere when he really didn't need to go that big.
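Just to put some numbers on that (my own back-of-the-envelope figures, not anything official): 10 AWG copper is roughly 1 ohm per 1000 feet, so even a longish branch circuit only drops a couple of volts at full load. The 75 ft length and 15 A load below are just assumptions for the example:

```python
# Ballpark voltage-drop check for a copper branch circuit (assumed figures).
R_PER_1000FT = {14: 2.525, 12: 1.588, 10: 0.999}  # ohms per 1000 ft, solid copper near room temperature

def voltage_drop(awg, one_way_ft, amps):
    """Round-trip I*R drop for a two-wire run of the given one-way length."""
    r = R_PER_1000FT[awg] * (2 * one_way_ft) / 1000.0
    return amps * r

for awg in (14, 12, 10):
    drop = voltage_drop(awg, one_way_ft=75, amps=15)
    print(f"{awg} AWG, 75 ft run at 15 A: {drop:.2f} V drop ({100 * drop / 120:.1f}% of 120 V)")
```

The 10 AWG case works out to a bit over 2 V, under 2% of 120 V, which is why I've never noticed a sag.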
My local electric utility has started a voluntary power program to avoid the problem of synchronous demand from summer air conditioners. They've deployed centrally controlled "smart meters" that can impose a rolling blackout on the air conditioners in people's homes. They only take $10 per month off your bill if you sign up for the service, so I haven't bothered.
Going back to transmission losses -- the voltages used in power transmission are huge, as high as 1.38 million volts. They use extreme voltage so they can deliver the same power with far less current, and keeping the current low is what avoids resistive power losses in the lines. On the transmission side, I don't think voltage drop is such a big deal. I think it's primarily a problem at the end of the run, after the transmission voltages get stepped down to distribution voltages for the "last mile."
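Here's the scaling with some made-up round numbers (none of these are real line figures): for a fixed amount of power delivered over a line of fixed resistance, the current scales as 1/V, so the I²R loss scales as 1/V².

```python
# Why high transmission voltage cuts resistive line loss (made-up round numbers).
# Simplified single-conductor, unity-power-factor view; ignores reactance and corona.

def line_loss(power_w, voltage_v, line_resistance_ohm):
    """I^2 * R loss for delivering power_w at voltage_v through line_resistance_ohm."""
    current = power_w / voltage_v          # I = P / V
    return current ** 2 * line_resistance_ohm

P = 100e6   # 100 MW delivered (assumed)
R = 10.0    # 10 ohms of total line resistance (assumed)

for v in (69e3, 345e3, 765e3):
    loss = line_loss(P, v, R)
    print(f"{v / 1e3:5.0f} kV: loss = {loss / 1e6:7.3f} MW ({100 * loss / P:.3f}% of delivered power)")
```

Run it and the loss drops by roughly two orders of magnitude going from 69 kV to 765 kV, which is the whole point of transmitting at extreme voltages.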
In rural applications, the transmission system uses autotransformers with automatic tap-changing equipment built in, which effectively makes the transformer act as a voltage regulator. This type of variable-ratio autotransformer compensates for voltage drop on the line.
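Here's a toy sketch of what that tap-changing logic amounts to. I'm assuming the common step-regulator arrangement of a ±10% range in 32 steps of 5/8%, on a 7,200 V feeder, so treat all the numbers as assumptions rather than a real controller:

```python
# Toy model of an automatic tap changer holding a feeder near nominal voltage.
# Assumes the common step-regulator spec: +/-10% range in 32 steps of 5/8%.
NOMINAL_V = 7200.0   # single-phase distribution voltage (assumed)
STEP = 0.00625       # 5/8% per tap step
MAX_STEPS = 16       # +/-16 steps = +/-10%

def pick_tap(measured_v, nominal_v=NOMINAL_V):
    """Return the tap step (positive = boost) that brings the output closest to nominal."""
    ratio_needed = nominal_v / measured_v      # boost ratio required to restore nominal
    steps = round((ratio_needed - 1.0) / STEP)
    return max(-MAX_STEPS, min(MAX_STEPS, steps))

sagging_input = 6900.0                         # line has dropped about 4% under load (assumed)
tap = pick_tap(sagging_input)
corrected = sagging_input * (1 + tap * STEP)
print(f"tap {tap:+d} -> output {corrected:.0f} V")
```

A real regulator also uses a deadband and a time delay so it doesn't hunt back and forth on every little flicker, but the ratio arithmetic is the same idea.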
Like Enzo said, most of the equipment that we're going to plug into our wall sockets will tolerate a reasonable amount of over-voltage. It doesn't care if the mains is high, and in most cases it doesn't care if the mains is a bit on the low side either. Which brings us back to the question -- what was the point of changing the voltage spec at the point of service from 110 to 125?
So far, everything I've read about power distribution makes it sound like there's no benefit at the transmission side. Because the use of electrical appliances in the home has proliferated so much in the past 50 years, I'm wondering if the reason the voltage was raised by 125/110 = 14% is to let customers get more power out of their appliances without drawing more current. I suppose the pessimist might say that the real reason they do it is that the kWh meter on your service drop charges you for energy, not voltage, and for a simple resistive load, pushing the voltage up by 14% pushes consumption (and your bill) up even more, closer to 30%, since power in a resistor goes as the square of the voltage.
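The arithmetic is easy enough to check (illustrative only, obviously not billing advice):

```python
# Quick arithmetic behind the 110 V -> 125 V comparison.
v_old, v_new = 110.0, 125.0
ratio = v_new / v_old
print(f"voltage ratio: {ratio:.3f} (about {100 * (ratio - 1):.0f}% higher)")

# A fixed resistance (heater, incandescent lamp) draws P = V^2 / R,
# so its power scales with the square of the voltage ratio.
print(f"resistive-load power ratio: {ratio ** 2:.3f} (about {100 * (ratio ** 2 - 1):.0f}% more)")

# A regulated, constant-power load (e.g. a switch-mode supply) just draws
# proportionally less current at the higher voltage, so its energy use doesn't change.
```

So a fixed-resistance load like a heater or incandescent lamp would actually use roughly 30% more energy, while a regulated constant-power load just draws less current and costs you the same.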
Keeping the hijack going, I stumbled upon an interesting DOE report about Large Power Transformers used for powering the grid. Some of them weigh 650,000 lb! Wow!
http://www.energy.gov/sites/prod/fil...e%202012_0.pdf