Transformers are rated in **kVA** (kilovolt-amperes) because the rating reflects the **apparent power** (the combination of real power and reactive power) that the transformer can handle, rather than just the real power (in kW).
Here’s why:
1. **Power Factor**: The amount of real power a transformer can deliver depends on the **power factor** of the load (which is typically less than 1). Power factor accounts for the phase difference between voltage and current in AC circuits. Since power factor can vary, the transformer rating in kVA ensures that the transformer is sized to handle the total load (both real and reactive components), regardless of the power factor.
2. **Voltage and Current**: The kVA rating tells you how much voltage and current the transformer can safely handle without considering the type of load it is powering. For instance, a purely resistive load (like a heater) has a power factor of 1, meaning all the power is real. However, an inductive load (like a motor) or a capacitive load has a lower power factor, meaning part of the power goes into building magnetic or electric fields (reactive power) rather than doing useful work (real power). Crucially, that reactive power still draws current through the windings, even though it delivers no net work.
3. **Safety and Design**: A transformer's losses do not depend on the power factor: copper losses (I²R heating in the windings) depend on the current magnitude, and core losses depend on the voltage. By rating transformers in kVA (voltage × current), engineers ensure that the transformer is designed for the maximum current it might face, regardless of the nature of the load, without being over-stressed.
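The points above can be illustrated with a quick sketch. The values below (a 50 kVA single-phase transformer with a 240 V secondary) are hypothetical, chosen only to show that the full-load current, and hence the winding heating, is fixed by the kVA rating, while the real power delivered varies with the power factor:

```python
# Sketch: why kVA (not kW) limits a transformer, using hypothetical values.
# The current drawn (and hence I^2*R heating) is set by apparent power,
# not by the real power actually delivered to the load.

def full_load_current_a(kva: float, voltage_v: float) -> float:
    """Single-phase full-load current in amperes for a given kVA rating."""
    return kva * 1000.0 / voltage_v

def real_power_kw(kva: float, power_factor: float) -> float:
    """Real power delivered: P (kW) = S (kVA) x power factor."""
    return kva * power_factor

RATING_KVA = 50.0    # hypothetical 50 kVA single-phase transformer
SECONDARY_V = 240.0  # hypothetical secondary voltage

i_full = full_load_current_a(RATING_KVA, SECONDARY_V)
print(f"Full-load current: {i_full:.1f} A")

for pf in (1.0, 0.8, 0.6):
    p = real_power_kw(RATING_KVA, pf)
    print(f"pf={pf}: delivers {p:.0f} kW, current still {i_full:.1f} A")
```

At full load the current (about 208 A here) is the same whether the load's power factor is 1.0 or 0.6; only the useful kW output changes. This is exactly why the nameplate rating must be in kVA.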
In short, kVA is used because it gives a **universal measure** of the transformer's capability that works for all types of loads, whether their power factor is high or low.