Why use 50Ω for coaxial cable?
Why use 50Ω as the reference impedance in both radar and radio equipment? Many misconceptions have grown up around this question: "because the dimensions of the first coaxial structures happened to work out to 50Ω," or "because 50Ω matched the antennas in use before the 1960s," and so on. In fact, the choice of 50Ω as a reference impedance is a compromise between the minimum loss and the maximum power capacity of an air-dielectric coaxial cable.

For an air-dielectric coaxial cable, loss is minimized when the ratio of the inner diameter of the outer conductor to the outer diameter of the inner conductor is about 3.6, which corresponds to a characteristic impedance Zo of about 77Ω. Although this geometry is optimal from a loss standpoint, it does not provide the maximum peak power that the line can carry before dielectric breakdown occurs. Peak power capacity is greatest when that same diameter ratio is about 1.65, corresponding to a Zo of about 30Ω.

The geometric mean of these two values, √(77Ω × 30Ω) ≈ 48Ω, is approximately 50Ω. In this way, the 50Ω standard is a compromise between the minimum loss and the maximum power capacity of the coaxial cable.
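As a quick check of the arithmetic, the sketch below (a minimal Python example, assuming the standard air-line formula Zo = (η0 / 2π)·ln(D/d) ≈ 60·ln(D/d), where D is the inner diameter of the outer conductor and d the outer diameter of the inner conductor) computes the two optimum impedances from the diameter ratios quoted above and takes their geometric mean.

```python
import math

ETA_0 = 376.73  # impedance of free space, in ohms


def coax_impedance(ratio):
    """Characteristic impedance of an air-dielectric coaxial line.

    `ratio` is D/d: the inner diameter of the outer conductor divided
    by the outer diameter of the inner conductor.
    Zo = (eta_0 / (2*pi)) * ln(D/d), roughly 60 * ln(D/d).
    """
    return ETA_0 / (2 * math.pi) * math.log(ratio)


z_min_loss = coax_impedance(3.6)    # minimum attenuation  -> about 77 ohms
z_max_power = coax_impedance(1.65)  # maximum peak power   -> about 30 ohms
compromise = math.sqrt(z_min_loss * z_max_power)  # geometric mean -> about 48 ohms

print(f"Minimum-loss impedance:    {z_min_loss:.1f} ohms")
print(f"Maximum-power impedance:   {z_max_power:.1f} ohms")
print(f"Geometric-mean compromise: {compromise:.1f} ohms")
```

Running this prints roughly 76.8Ω, 30.0Ω, and 48.1Ω, which is why the compromise value is commonly rounded to the familiar 50Ω.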