Even though a high R/X ratio causes more voltage drop in a distribution system, distribution lines end up with a high R/X ratio. For a transmission line, on the other hand, a low R/X ratio reduces ohmic loss, so there it is beneficial. Then why do distribution lines have a high R/X ratio? I am explaining the reason here.
A transmission line is designed on the basis of its power-carrying capacity. For the same power, the higher the voltage, the smaller the required cross-section of the conductor. But by doubling the voltage we can send more than four times the power sent at the lower voltage, so in practice the cross-section is made larger, and hence the resistance is lower. The main aim is to evacuate as much power as possible at higher and higher voltages. Note that the aim is not to reduce losses; any slight reduction is a bonus. Of course, X depends on Deq, the equivalent distance between phases (which increases at higher voltages), and on Ds, the GMR of the conductor. So for overhead transmission lines the X/R ratio can be as high as 10 for a 400 kV line.
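To put a number on this, here is a small sketch using the standard per-phase inductance approximation for a transposed three-phase line, L = 2×10⁻⁷ ln(Deq/Ds) H/m. The geometries (phase spacing, GMR) and the 50 Hz frequency are illustrative assumptions, not values from the post:

```python
import math

F_HZ = 50.0  # system frequency (assumed 50 Hz)

def reactance_ohm_per_km(deq_m, ds_m, f_hz=F_HZ):
    """Per-phase series reactance of a transposed 3-phase overhead line.

    Uses the standard approximation L = 2e-7 * ln(Deq/Ds) H/m,
    where Deq is the equivalent phase spacing and Ds the conductor GMR.
    """
    l_h_per_m = 2e-7 * math.log(deq_m / ds_m)
    return 2 * math.pi * f_hz * l_h_per_m * 1000.0  # ohm per km

# Illustrative geometries (assumed):
x_ehv = reactance_ohm_per_km(deq_m=11.0, ds_m=0.012)  # wide-spaced 400 kV line
x_lv  = reactance_ohm_per_km(deq_m=0.6,  ds_m=0.005)  # closely spaced LV feeder

print(round(x_ehv, 3), round(x_lv, 3))  # both a few tenths of an ohm per km
```

Because X grows only logarithmically with Deq/Ds, the per-km reactance of the wide-spaced EHV line and the closely spaced LV feeder are of the same order; the large difference in X/R between the two systems comes mainly from R, as discussed next.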
In overhead distribution lines the current flowing is much smaller than in a transmission line, and the current density is kept small, because here the aim is not maximum current or power sent but minimum voltage drop. So the area of cross-section is relatively large and R is correspondingly low. To this is added the resistance of the distribution transformer, which is high compared with that of a power transformer used on transmission lines. Deq is quite small because the voltage is small (say 440 V against 400 kV), so X is smaller. Overall, the R/X value is larger than that of transmission lines.
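The contrast can be sketched with order-of-magnitude numbers. The conductor sizes and per-km reactances below are illustrative assumptions (not from the post), and the distribution transformer's series resistance, which the post notes would push R/X up further, is left out:

```python
RHO_AL = 2.83e-8  # ohm*m, approximate resistivity of aluminium

def resistance_ohm_per_km(area_mm2):
    # DC resistance of a conductor of the given cross-section
    return RHO_AL / (area_mm2 * 1e-6) * 1000.0

# Assumed, order-of-magnitude figures:
r_ehv = resistance_ohm_per_km(2 * 500)  # 400 kV line, twin bundle of 500 mm2
x_ehv = 0.33                            # ohm/km, typical-order EHV reactance
r_lv  = resistance_ohm_per_km(50)       # 50 mm2 LV overhead conductor
x_lv  = 0.30                            # ohm/km, typical-order LV reactance

print(x_ehv / r_ehv)  # X/R for the transmission line: around 10 or more
print(r_lv / x_lv)    # R/X for the distribution line: greater than 1
```

With these assumed figures the transmission line's X/R comes out above 10, matching the value quoted above for a 400 kV line, while the distribution feeder's R/X is already above 1 even before adding the distribution transformer's resistance.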
Thank you!
I don’t think Deq and D depend on voltage. They actually depend on the configuration of the conductors.
Deq and D do not depend on voltage. It is written that way in the post because the distance between phases increases with increase in voltage; that does not mean Deq and D have a direct relationship with voltage. I should have been more explicit. Thank you.