The ampere is the unit used to measure electric current, the rate of flow of electric charge. By Ohm's law, one ampere is the current that flows through a conductor with a resistance of one ohm when a potential difference of one volt is applied across it.
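As a minimal sketch of that relationship (the function name and values below are illustrative, not from any standard library), the following Python snippet applies Ohm's law to show that one volt across one ohm drives exactly one ampere:

```python
def current_amperes(voltage_volts: float, resistance_ohms: float) -> float:
    """Ohm's law: I = V / R."""
    if resistance_ohms <= 0:
        raise ValueError("Resistance must be positive.")
    return voltage_volts / resistance_ohms

# One volt across one ohm yields exactly one ampere.
print(current_amperes(1.0, 1.0))     # 1.0
# A 120 V branch feeding a 15-ohm heating element draws 8 A.
print(current_amperes(120.0, 15.0))  # 8.0
```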
The ampere, commonly referred to as an “amp”, is a foundational concept in electrical engineering and commercial low-voltage systems. It quantifies the amount of electric charge passing a point in a circuit per second. In practical terms, amperage tells you how much current an electrical system is drawing or supplying, which directly drives conductor sizing, power distribution, and system design.
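Because equipment is usually rated in watts rather than amperes, a common first step in sizing is converting a power rating into a current draw. The sketch below is illustrative only and assumes a simple resistive load at unity power factor, so that I = P / V:

```python
def current_draw_amps(power_watts: float, voltage_volts: float) -> float:
    """Current draw for a resistive load at unity power factor: I = P / V."""
    return power_watts / voltage_volts

# A 1,500 W device on a 120 V circuit draws 12.5 A,
# while the same load at 240 V draws only 6.25 A.
print(current_draw_amps(1500, 120))  # 12.5
print(current_draw_amps(1500, 240))  # 6.25
```

Note how doubling the supply voltage halves the current for the same power, which is why higher-voltage distribution permits smaller conductors for a given load.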
In commercial and industrial environments where electrical loads vary widely, such as in audiovisual (AV) systems, automation controls, or data infrastructure, the ampere is essential for evaluating system capacity, equipment compatibility, and operational efficiency. Whether designing integrated building technology or selecting cables for power-intensive devices, understanding amperes allows engineers and project managers to maintain performance and reliability while adhering to safety protocols.
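As one hedged illustration of such a capacity check (the device names and values are invented, and the 80% continuous-load derating reflects a common North American rule of thumb that should be confirmed against the applicable electrical code), the sketch below totals the current draw of several rack devices and compares it to a circuit's rating:

```python
# Nameplate current draws (amperes) for devices sharing one circuit;
# the values here are purely illustrative.
device_amps = {"amplifier": 6.0, "video_switcher": 2.5, "control_processor": 1.0}

breaker_rating_amps = 15.0
# Rule of thumb: keep continuous loads at or below 80% of the breaker
# rating (verify against the applicable electrical code).
usable_capacity = 0.8 * breaker_rating_amps

total_draw = sum(device_amps.values())
print(f"Total draw: {total_draw} A of {usable_capacity} A usable")
if total_draw > usable_capacity:
    print("Over capacity: split the loads across additional circuits.")
else:
    print("Within capacity.")
```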
While voltage represents electrical potential and resistance describes the opposition to current flow, amperage (current) is the actual flow of charge through the circuit. Ampere ratings appear on everything from circuit breakers and connectors to power supplies and wire gauges, making the ampere one of the most referenced units in the industry.
The unit “ampere” is named after André-Marie Ampère, a 19th-century French physicist and mathematician who was one of the founders of classical electromagnetism. His work laid the groundwork for measuring electric current, and the ampere was later adopted as one of the base units of the International System of Units (SI).