In the International System of Units (SI), the unit of electrical resistance is the ohm, symbolized by the Greek letter Ω. It measures how strongly a material opposes the flow of electric current.
1. **Definition**: One ohm is defined as the resistance between two points of a conductor when a constant potential difference of one volt (V) applied between these points produces a current of one ampere (A). In mathematical terms:
\[
1 \, \Omega = \frac{1 \, \text{V}}{1 \, \text{A}}
\]
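As a quick sketch of this defining relation in code (the helper `resistance_ohms` below is hypothetical, written purely for illustration):

```python
def resistance_ohms(volts: float, amperes: float) -> float:
    """Resistance R = V / I, per the defining relation 1 Ω = 1 V / 1 A."""
    if amperes == 0:
        raise ValueError("current must be nonzero to define a resistance")
    return volts / amperes

# The defining case: 1 V across the conductor driving 1 A through it -> 1 Ω.
print(resistance_ohms(1.0, 1.0))  # 1.0
```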
2. **Relation to Other SI Units**: The ohm can be expressed in terms of the base SI units:
\[
1 \, \Omega = 1 \, \text{kg} \cdot \text{m}^2 \cdot \text{s}^{-3} \cdot \text{A}^{-2}
\]
where:
- \(\text{kg}\) is the kilogram (unit of mass),
- \(\text{m}\) is the meter (unit of length),
- \(\text{s}\) is the second (unit of time),
- \(\text{A}\) is the ampere (unit of electric current).
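For illustration, the exponent bookkeeping behind this reduction can be sketched in a few lines of Python. The exponent maps below are assumptions made for this example, not any library's API; dividing units corresponds to subtracting exponents:

```python
from collections import Counter

# Exponent maps over the SI base units involved (kg, m, s, A).
VOLT = Counter({"kg": 1, "m": 2, "s": -3, "A": -1})  # V = kg·m²·s⁻³·A⁻¹
AMPERE = Counter({"A": 1})

# Ω = V / A: dividing units subtracts their base-unit exponents.
ohm = VOLT.copy()
ohm.subtract(AMPERE)

print(dict(ohm))  # {'kg': 1, 'm': 2, 's': -3, 'A': -2}
```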
3. **Practical Example**: If you have a resistor with a resistance of 100 ohms, applying a potential difference of 1 volt across it drives a current of \(I = V/R = 1\,\text{V} / 100\,\Omega = 0.01\) amperes (10 milliamperes) through it.
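A minimal sketch of this calculation, with the variable names chosen just for this example:

```python
# Ohm's law: I = V / R.
voltage_v = 1.0          # applied potential difference, in volts
resistance_ohm = 100.0   # resistor value, in ohms

current_a = voltage_v / resistance_ohm
print(f"{current_a} A = {current_a * 1000} mA")  # 0.01 A = 10.0 mA
```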
The ohm is a derived SI unit that is central to electrical engineering and physics, essential for analyzing and designing electrical circuits.