Capacitance is the property of a system of conductors and dielectrics that allows it to store electrical energy when a potential difference exists between the conductors.
Capacitance measures how much electric charge a system can store per unit of voltage; it depends on the surface area of the conductors, the distance between them, and the dielectric material separating them. While most commonly associated with capacitors, capacitance also plays an important role in wire, cable, and circuit performance. In cables, it affects how efficiently electrical signals or data can travel through the conductors.
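The dependence on area, spacing, and dielectric described above can be sketched with the textbook parallel-plate formula C = ε₀εᵣA/d, together with Q = CV for the stored charge. The numeric values below (a polyethylene-like dielectric, plate area, and gap) are illustrative assumptions, not figures from this article.

```python
# Parallel-plate capacitance sketch: C = eps0 * eps_r * A / d
# All specific values below are illustrative assumptions.
EPS0 = 8.854e-12          # vacuum permittivity, F/m
eps_r = 2.3               # relative permittivity (typical for polyethylene)
area = 0.01               # plate area, m^2
gap = 1e-3                # plate separation, m

C = EPS0 * eps_r * area / gap    # capacitance in farads
Q = C * 5.0                      # charge stored at 5 V, from Q = C * V

print(f"C = {C * 1e12:.1f} pF")          # ~204 pF
print(f"Q = {Q * 1e9:.3f} nC at 5 V")
```

Doubling the plate area doubles C, while doubling the gap halves it, which mirrors why conductor size and spacing matter in cable design.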
In cable design, factors such as conductor size, insulation type, and conductor spacing determine capacitance. Higher capacitance increases signal attenuation and distortion, which can degrade transmission quality, especially over long distances or in high-frequency applications. For that reason, low-capacitance cables are preferred in data, audiovisual, and control systems to preserve signal clarity. Capacitance is measured in farads (F), though microfarads (µF), nanofarads (nF), and picofarads (pF) are more common in practice; cable capacitance is typically specified per unit length, such as picofarads per meter (pF/m) or per foot (pF/ft).
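For a concrete sense of per-length cable capacitance, the standard coaxial-cable formula C′ = 2πε₀εᵣ / ln(b/a) ties it directly to conductor radius, spacing, and dielectric. The dimensions below are illustrative assumptions, not taken from any particular cable specification.

```python
import math

# Capacitance per unit length of a coaxial cable:
#   C' = 2 * pi * eps0 * eps_r / ln(b / a)
# Dimensions here are illustrative assumptions only.
EPS0 = 8.854e-12     # vacuum permittivity, F/m
eps_r = 2.3          # relative permittivity (typical solid polyethylene)
a = 0.5e-3           # inner conductor radius, m
b = 1.75e-3          # inner radius of the shield, m

C_per_m = 2 * math.pi * EPS0 * eps_r / math.log(b / a)   # F/m
print(f"C' = {C_per_m * 1e12:.0f} pF/m")                 # ~102 pF/m
```

The result lands in the ~100 pF/m range common for coaxial cables, and shows why a lower-permittivity dielectric or wider spacing (larger b/a) yields the lower capacitance preferred for signal cables.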
Capacitance testing and performance standards are defined by organizations such as IEEE (Institute of Electrical and Electronics Engineers), IEC (International Electrotechnical Commission), and UL (Underwriters Laboratories). These standards establish accepted testing methods and reference values for cable and conductor performance.
The study of capacitance began in the 18th century with the invention of the Leyden jar, one of the earliest devices to store electrical charge. This discovery led to the development of capacitors and other components used to regulate voltage and signal flow. Advances in dielectric materials and cable manufacturing have since refined control over capacitance, enabling modern systems to achieve greater efficiency, signal stability, and precision in power and communication networks.