Transformers (not the Autobots or Decepticons kind) are a staple of our national power grid. They allow power to be transmitted efficiently and relatively safely over long distances. To understand how they work, you first need to know what power is. Power is the product of voltage and current. For a simple example, if you have 100 volts and 2 amps, you have 200 watts of power. If you increase the voltage to 400 volts and decrease the current to 0.5 amps, you still have 200 watts. Transformers are made of two or more inductors: coil windings that create an electromagnetic field when current flows through them. The field created by the first winding (known as the primary) induces a voltage on the other winding (known as the secondary). When the secondary is connected to a load such as your house, it forms an electromagnetic coupling from the power source, through the primary and secondary windings, to the electronics in your home.
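The arithmetic above is easy to check in a few lines of Python (a minimal sketch; the `power_watts` helper is just an illustrative name, not part of any real library):

```python
# Power is voltage times current: P = V * I.
def power_watts(volts, amps):
    return volts * amps

print(power_watts(100, 2))    # 200 W
print(power_watts(400, 0.5))  # 200 W: same power at higher voltage, lower current
```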
Voltage and current have an inverse relationship in a transformer: when one increases, the other must decrease. Our national power grid uses transformers to transmit high voltages across long distances. But why don’t we transmit large currents instead? This comes down to efficiency. When current flows through a wire, some of the energy is lost as heat, and that loss grows with the square of the current (P_loss = I²R); this power is simply wasted. Large currents could be transmitted, but it would require larger, more heavily insulated wires than we currently have in place. This is where transformers come in. Because voltage and current are inversely related in a transformer, we can step the voltage up for transmission, keeping the current, and therefore the heat losses, low, and then step the voltage back down at the power pole near your house.
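A quick sketch shows why the square in P_loss = I²R matters so much. The line resistance here is an assumed, illustrative value, and `line_loss` is a made-up helper, not a real API:

```python
# Resistive line loss for a given delivered power and transmission voltage.
# P_loss = I^2 * R, where I = P / V.
def line_loss(power_w, voltage_v, line_resistance_ohms):
    current = power_w / voltage_v
    return current ** 2 * line_resistance_ohms

R = 5.0     # ohms of line resistance (assumed for illustration)
P = 2000.0  # 2 kW delivered

print(line_loss(P, 200.0, R))   # low voltage:  10 A  -> 500.0 W lost as heat
print(line_loss(P, 4000.0, R))  # high voltage: 0.5 A -> 1.25 W lost as heat
```

Raising the voltage by a factor of 20 cuts the current by 20, so the heat loss drops by a factor of 400.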
Let’s say you’re transmitting 2 kW as 200 volts at 10 amps. Keeping the numbers simple and assuming ideal transmission, meaning no line losses, you would start with a step-up transformer (a 1:20 turns ratio) to bring the voltage up to 4000 volts, which has the effect of reducing the current to 0.5 amps. Near your destination, you would use a step-down transformer (a 20:1 turns ratio) to bring the voltage back down to 200 volts and the current back up to 10 amps, which still yields the desired 2 kW of power needed to run your TV and refrigerator, so you can have a delightful bowl of cereal while watching your Saturday morning cartoons.
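The whole round trip can be traced in a few lines of Python. The `step_up` and `step_down` helpers are illustrative names modeling ideal, lossless transformers:

```python
# Ideal transformer: voltage scales by the factor n, current scales inversely,
# so power (V * I) is conserved at every step.
def step_up(volts, amps, n):
    return volts * n, amps / n

def step_down(volts, amps, n):
    return volts / n, amps * n

v, i = 200.0, 10.0             # 2 kW leaving the source
v, i = step_up(v, i, 20)       # on the transmission line: 4000 V, 0.5 A
print(v, i, v * i)             # 4000.0 0.5 2000.0
v, i = step_down(v, i, 20)     # at the pole near your house: 200 V, 10 A
print(v, i, v * i)             # 200.0 10.0 2000.0
```

At every stage the product of voltage and current stays at 2000 watts; only the mix of voltage and current changes.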