The Hall voltage is the voltage difference generated across a conductor when it carries an electric current while placed in a magnetic field. This voltage arises because the magnetic (Lorentz) force deflects the moving charge carriers (such as electrons) toward one side of the conductor, where they accumulate and create a potential difference.
The Hall voltage (\(V_H\)) can be expressed as:
\[
V_H = \frac{B I}{n e t}
\]
Where:
- \(B\) = Magnetic field strength (in teslas, T)
- \(I\) = Current flowing through the conductor (in amperes, A)
- \(n\) = Number density of charge carriers (in carriers per cubic meter, m⁻³)
- \(e\) = Elementary charge (charge of an electron, \(1.602 \times 10^{-19}\) C)
- \(t\) = Thickness of the conductor along the direction of the magnetic field (in meters, m)

Note that the width of the conductor does not appear: it cancels out in the derivation, because a wider sample carries the same current at a lower drift velocity, so only the thickness along the field remains.
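A quick numerical sanity check of the formula can be sketched in Python. The material and geometry values below (a copper strip with a carrier density of roughly \(8.5 \times 10^{28}\) m⁻³, 0.1 mm thick) are illustrative assumptions, not measured data:

```python
E_CHARGE = 1.602e-19  # elementary charge in coulombs

def hall_voltage(B, I, n, t):
    """Hall voltage V_H = B*I / (n*e*t) for a conductor of thickness t along B."""
    return (B * I) / (n * E_CHARGE * t)

# Assumed example: copper strip, 0.1 mm thick, carrying 10 A in a 1 T field
V_H = hall_voltage(B=1.0, I=10.0, n=8.5e28, t=1e-4)
print(f"Hall voltage: {V_H * 1e6:.2f} µV")  # only a few microvolts
```

The tiny result (microvolts even for a strong field and large current) illustrates why Hall measurements in good metals, which have enormous carrier densities, require sensitive voltmeters.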
In simpler terms:
- The Hall voltage depends on the magnetic field, the amount of current, the dimensions of the conductor, and the properties of the charge carriers.
- The larger the magnetic field or the current, the larger the Hall voltage.
- The sign of the Hall voltage reveals the sign of the dominant charge carriers (negative for electrons, positive for holes), and its magnitude gives their number density.
This expression is commonly used in Hall effect experiments to measure the type and density of charge carriers in a material.
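In practice the formula is usually inverted: \(V_H\) is measured and the carrier density is solved for as \(n = BI / (V_H e t)\). A minimal sketch, using assumed measurement values consistent with the copper example above:

```python
E_CHARGE = 1.602e-19  # elementary charge in coulombs

def carrier_density(B, I, V_H, t):
    """Carrier density n = B*I / (V_H*e*t) from a measured Hall voltage."""
    return (B * I) / (V_H * E_CHARGE * t)

# Assumed measurement: 1 T field, 10 A current, 7.34 µV Hall voltage,
# 0.1 mm thick sample -- roughly copper's n of about 8.5e28 m^-3
n = carrier_density(B=1.0, I=10.0, V_H=7.34e-6, t=1e-4)
print(f"Carrier density: {n:.2e} m^-3")
```

The sign of the measured \(V_H\) (relative to the field and current directions) then indicates whether the dominant carriers are electrons or holes.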