Transformers are typically rated in **kVA (kilovolt-amperes)**, not **kW (kilowatts)**.
Here's why:
- **kVA** measures the **apparent power** (which combines both real power and reactive power) that a transformer can handle.
- **kW** measures **real power**, which is the actual power used by the load to perform useful work.
A transformer's heating and losses depend only on the voltage it applies (core losses) and the current it carries (copper losses), not on the load's power factor. Because voltage × current is apparent power, its capacity is rated in kVA rather than kW, even though it transfers both real power (kW) and reactive power (kVAR). The power factor (the ratio of real power to apparent power) then determines how much of that apparent power is delivered as real power (kW).
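For reference, the three quantities are tied together by the standard AC power-triangle relations (these are general identities, not specific to any particular transformer):
\[
S = \sqrt{P^2 + Q^2}, \qquad \text{PF} = \frac{P}{S} = \cos\varphi
\]
where \(S\) is apparent power in kVA, \(P\) is real power in kW, and \(Q\) is reactive power in kVAR.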
For example, if a transformer is rated at 100 kVA and the load has a power factor of 0.8, the real power it can deliver is:
\[
\text{kW} = \text{kVA} \times \text{Power Factor} = 100 \times 0.8 = 80 \text{ kW}
\]
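As a quick sanity check, here is the same arithmetic as a short Python sketch; the function names and example values are just for illustration, not part of any standard library:

```python
import math

def transformer_real_power(kva_rating: float, power_factor: float) -> float:
    """Real power (kW) a transformer can deliver at a given load power factor."""
    return kva_rating * power_factor

def transformer_reactive_power(kva_rating: float, power_factor: float) -> float:
    """Reactive power (kVAR) at the same operating point, via the power triangle."""
    return kva_rating * math.sin(math.acos(power_factor))

kva = 100.0  # nameplate rating in kVA
pf = 0.8     # load power factor

print(f"Real power:     {transformer_real_power(kva, pf):.1f} kW")     # 80.0 kW
print(f"Reactive power: {transformer_reactive_power(kva, pf):.1f} kVAR")  # 60.0 kVAR
```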
So transformers are rated in kVA, and the real power they actually deliver (in kW) depends on the power factor of the connected load.