posted on Aug, 17 2008 @ 01:21 AM
Here's my best attempt to explain why higher voltage is better without dumping a load of equations:
For any given power level, a higher voltage reduces the required current. Losses increase with the square of the current but only linearly with
resistance.
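
To put rough numbers on it, here's a quick Python sketch. The 1,500 W load and the half-ohm of round-trip wire resistance are just figures I picked for illustration:

# Same delivered power at two voltages, same wire resistance:
POWER_W = 1500.0      # assumed load
WIRE_OHMS = 0.5       # assumed round-trip wire resistance

for volts in (120.0, 24.0):
    amps = POWER_W / volts          # current needed for the same power
    loss_w = amps**2 * WIRE_OHMS    # I^2 * R loss heating the wire
    print(f"{volts:5.0f} V -> {amps:5.1f} A, wire loss {loss_w:7.1f} W")

#   120 V ->  12.5 A, wire loss    78.1 W
#    24 V ->  62.5 A, wire loss  1953.1 W   (5x the current, 25x the loss)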
With low-voltage wiring, the current would have to be so high that it could overheat a wire along its entire length, instead of burning out locally due to
arcing. High current through a poor connection would cause routine overheating of that connection, since the heat dissipated in its resistance is a function
of current squared. So there would be no benefit in terms of reduced risk of fire.
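
A quick check of the poor-connection point, again with assumed numbers (the 0.05 ohm contact resistance and 1,500 W load are mine, picked only to show the scaling):

# Heat dissipated in one bad joint carrying the whole load current:
CONTACT_OHMS = 0.05   # assumed resistance of a loose or corroded connection
POWER_W = 1500.0      # assumed load

for volts in (120.0, 24.0):
    amps = POWER_W / volts
    heat_w = amps**2 * CONTACT_OHMS   # I^2 * R heating right at the joint
    print(f"{volts:5.0f} V: {amps:5.1f} A through the joint, {heat_w:6.1f} W of heat")

#   120 V:  12.5 A through the joint,    7.8 W of heat
#    24 V:  62.5 A through the joint,  195.3 W of heat (plenty to start a fire)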
Copper wiring would need 5 times the cross-section to serve the same appliances if 24V were used. A 15A, 120V outlet would become a 75A, 24V outlet.
Because of the I^2*R law, connection quality would be paramount to reduce losses. The voltage drop across the wiring's inductive reactance would also
increase five-fold, since magnetic field strength is a function of current, not voltage. Appliance motors would be bulkier and run hotter as well, for the
same reasons.
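
Here's a sketch of the outlet comparison. It treats allowable current as roughly proportional to copper cross-section, which is a simplification (real ampacity tables would call for even more copper, since current capacity doesn't scale quite linearly with area), and it uses 14 AWG wire, about 2.08 mm^2, as the reference 15A conductor:

# Scale a 15 A / 120 V branch circuit down to 24 V at the same power:
REF_VOLTS, REF_AMPS = 120.0, 15.0
REF_AREA_MM2 = 2.08                              # 14 AWG, typical 15 A wiring

new_volts = 24.0
new_amps = REF_VOLTS * REF_AMPS / new_volts      # same power -> 5x the current
new_area = REF_AREA_MM2 * (new_amps / REF_AMPS)  # rough proportional scaling

print(f"{new_volts:.0f} V outlet needs {new_amps:.0f} A "
      f"and roughly {new_area:.1f} mm^2 of copper (vs {REF_AREA_MM2} mm^2)")
# -> 24 V outlet needs 75 A and roughly 10.4 mm^2 of copper (vs 2.08 mm^2)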
I recall reading that the standard for automobiles is due to go to 36V because of the large amount of onboard electronics that require power today, and
cutting the amperage to a third will have a decent positive impact on efficiency.
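
Ballpark figures for the automotive case (the 1 kW electrical load and 10 milliohm harness resistance are my own assumptions, just to show the trend):

# Same load served from a 12 V vs. a 36 V system:
LOAD_W = 1000.0        # assumed total electrical load
HARNESS_OHMS = 0.010   # assumed wiring-harness resistance

for volts in (12.0, 36.0):
    amps = LOAD_W / volts
    loss_w = amps**2 * HARNESS_OHMS
    print(f"{volts:4.0f} V -> {amps:5.1f} A, harness loss {loss_w:5.1f} W")

#   12 V ->  83.3 A, harness loss  69.4 W
#   36 V ->  27.8 A, harness loss   7.7 W   (1/3 the current, 1/9 the loss)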
Most home appliance motors are fractional horsepower, but large industrial motors require 480V up to 13kV or more, otherwise the amperage requirements
would be absolutely outrageous (motors can be hundreds or thousands of horsepower). They are also typically three-phase and extremely efficient-- better
than 90%.
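
To see why the voltage has to climb with motor size, here's the standard three-phase line-current relation, I = P / (sqrt(3) * V * pf * eff), with example figures (the 1,000 hp rating, 0.9 power factor, 0.95 efficiency, and the 4,160 V / 13.8 kV levels are just example values I picked):

import math

HP_TO_W = 746.0
shaft_w = 1000 * HP_TO_W   # assumed 1,000 hp motor
pf, eff = 0.90, 0.95       # assumed power factor and efficiency

for volts in (480.0, 4160.0, 13800.0):
    amps = shaft_w / (math.sqrt(3) * volts * pf * eff)
    print(f"{volts:7.0f} V -> about {amps:4.0f} A per line")

#     480 V -> about 1050 A per line
#    4160 V -> about  121 A per line
#   13800 V -> about   37 A per line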