I grew up learning that wattage was current times voltage.
None is more accurate than the others; P = V·I, P = I²R, and P = V²/R are the same relationship, just rewritten using Ohm's law.
So is current or wattage the critical rating here?
This also means our European friends have a slight advantage using 240 V vs. 120 V: at the same power, double the voltage means half the current, and wiring loss goes as I²R.
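To put rough numbers on that (the appliance power and wiring resistance below are assumed examples, not anything from this thread): at the same power, doubling the voltage halves the current, and since the loss in the wiring goes as I²R it drops to a quarter.

```python
# Back-of-the-envelope: same appliance power at 120 V vs 240 V.
# Illustrative numbers only; wire_resistance is an assumed fixed
# resistance for the house wiring run.
power = 1500.0          # appliance power in watts (assumed example)
wire_resistance = 0.1   # ohms of wiring, same in both cases (assumed)

for mains_voltage in (120.0, 240.0):
    current = power / mains_voltage              # I = P / V
    wiring_loss = current**2 * wire_resistance   # P_loss = I^2 * R
    print(f"{mains_voltage:.0f} V: I = {current:.2f} A, "
          f"wiring loss = {wiring_loss:.1f} W")
```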
The wire being short doesn't really help. A shorter wire has less resistance, but the heat produced is distributed over a shorter length.
Yes, "heat produced" will be distributed along a shorter length ... but it will be lower, so temperature will not change.
As in: half the wire length: half the power loss: half the heat produced:same temperature rise.
Or to see it from other angle: doubled wire: doubled loss ... but also double heatsinking ... one compensates the other so actual wire temperature does not increase.
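Here's a minimal numeric sketch of that argument (copper's resistivity is the real constant; the wire size and current are assumed examples): total dissipation grows with length, but the watts per metre, which is what actually sets the temperature rise, stay constant.

```python
# Sketch: why total heat scales with length but temperature rise does not.
# Copper resistivity is ~1.68e-8 ohm*m; other numbers are assumed examples.
RHO_CU = 1.68e-8      # ohm*m, resistivity of copper
area = 1.5e-6         # conductor cross-section, 1.5 mm^2 (assumed)
current = 10.0        # amps (assumed)

for length in (1.0, 2.0, 4.0):  # metres
    resistance = RHO_CU * length / area      # R = rho * L / A
    total_loss = current**2 * resistance     # grows with length...
    loss_per_metre = total_loss / length     # ...but W/m stays constant
    print(f"{length:>3.0f} m: R = {resistance*1000:.1f} mohm, "
          f"loss = {total_loss:.2f} W, {loss_per_metre:.2f} W/m")
```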
That's why chassis wiring (or home/industrial/distribution wiring) is spec'd simply as acceptable current per given diameter: the heat is dissipated into the surroundings.
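And a companion sketch for the diameter side of that rule: at a fixed current, watts per metre scale inversely with cross-section, which is why ampacity tables key on conductor size. Real tables also factor in insulation rating, bundling, and ambient temperature; the diameters and current here are assumed examples.

```python
# Sketch: dissipation per metre at a fixed current, for several diameters.
import math

RHO_CU = 1.68e-8   # ohm*m, resistivity of copper
current = 10.0     # amps (assumed)

for diameter_mm in (0.5, 1.0, 2.0):
    area = math.pi * (diameter_mm / 2000.0) ** 2   # cross-section in m^2
    loss_per_metre = current**2 * RHO_CU / area    # W/m, independent of length
    print(f"{diameter_mm} mm dia: {loss_per_metre:.2f} W/m")
```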
Now, in a transformer winding, you have a compact block of copper with minimal ventilation, so the same watts per metre raise the temperature much more.
As a side note, in my large amps, fans are set up to cool the transformer ... now you can imagine why.
To get safe, huge power from small (and cheap) transformers:
*Always* maximize copper cross-section, to the point of having to press or hammer the windings to fit the laminations.
I've found it the best investment; transformers end up costing (and weighing) less.
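A sketch of why the fat copper pays off. The winding numbers here (turns, mean turn length, current) are purely illustrative, not a real design: doubling the conductor cross-section halves the winding resistance, and so halves the I²R heat trapped in that compact block.

```python
# Sketch: copper loss of one winding for two conductor sizes.
# Turns, mean turn length, current, and wire sizes are assumed
# illustrative numbers, not from any real transformer design.
RHO_CU = 1.68e-8   # ohm*m, resistivity of copper
turns = 100        # winding turns (assumed)
mean_turn = 0.25   # mean length of one turn, metres (assumed)
current = 8.0      # winding current, amps (assumed)

for area_mm2 in (1.0, 2.0):   # conductor cross-section in mm^2
    resistance = RHO_CU * turns * mean_turn / (area_mm2 * 1e-6)
    loss = current**2 * resistance   # I^2 * R copper loss
    print(f"{area_mm2} mm^2: R = {resistance:.3f} ohm, "
          f"copper loss = {loss:.1f} W")
```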