
Resistors and temperature

I know resistors are supposed to have the same resistance regardless of the current, so long as temperature doesn't change. However, I would imagine as current increases, the temperature of the resistor would increase significantly, so how can this be?
Original post by MEPS1996
I know resistors are supposed to have the same resistance regardless of the current, so long as temperature doesn't change. However, I would imagine as current increases, the temperature of the resistor would increase significantly, so how can this be?


Current is the flow of charge (carried by electrons in a metallic conductor), and when there is more current, more electrons flow past each point per second and they 'bump into' the atoms of the wire more often, causing it to heat up.

Posted from TSR Mobile
Reply 2
Original post by majmuh24
Current is the flow of charge (carried by electrons in a metallic conductor), and when there is more current, more electrons flow past each point per second and they 'bump into' the atoms of the wire more often, causing it to heat up.

Posted from TSR Mobile

My question is to do with the fact that the condition of constant temperature for a resistor to have constant resistance implies that increases in temperature cause an increase in resistance. This, coupled with the fact that current increases the temperature, seems to contradict the idea that resistance is the same for all currents, i.e. that the I-V graph is a straight line.
Original post by MEPS1996
My question is to do with the fact that the condition of constant temperature for a resistor to have constant resistance implies that increases in temperature cause an increase in resistance. This, coupled with the fact that current increases the temperature, seems to contradict the idea that resistance is the same for all currents, i.e. that the I-V graph is a straight line.


The relationship between current and voltage only holds for a resistor at a constant temperature.

According to Joule's first law, the passing of current through a conductor releases heat.

http://en.m.wikipedia.org/wiki/Joule%27s_first_law
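As a rough sketch of what that heating looks like (the current, resistance and thermal resistance below are assumed values for illustration, not figures for any particular resistor):

# Sketch of Joule heating in a resistor, using assumed example values.
I = 0.02          # current in amps (assumed)
R = 330.0         # resistance in ohms (assumed)
theta = 100.0     # assumed thermal resistance of the package, in kelvin per watt

P = I**2 * R                 # Joule's first law: heat dissipated, P = I^2 * R
delta_T = P * theta          # approximate steady-state rise above ambient
print(f"P = {P:.3f} W, temperature rise ~ {delta_T:.1f} K")

So even a modest current produces a measurable temperature rise, which is exactly the heating described above.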

A better way to think about it is to use resistivity, which is temperature dependent, therefore resistance is temperature dependent too.
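For a metallic conductor the usual approximation is that resistivity (and hence resistance) rises roughly linearly with temperature. A minimal sketch, assuming a 100 ohm resistor and a copper-like temperature coefficient of about 0.0039 per degree C:

# Linear temperature model for resistance (assumed values for illustration).
R0 = 100.0        # resistance in ohms at the reference temperature (assumed)
T0 = 20.0         # reference temperature in deg C
alpha = 0.0039    # approximate temperature coefficient of copper, per deg C

def resistance(T):
    # R(T) = R0 * (1 + alpha * (T - T0))
    return R0 * (1 + alpha * (T - T0))

print(resistance(20.0))   # 100.0 ohms at the reference temperature
print(resistance(70.0))   # about 119.5 ohms after a 50 degree rise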

Posted from TSR Mobile
(edited 10 years ago)
Original post by MEPS1996
I know resistors are supposed to have the same resistance regardless of the current, so long as temperature doesn't change.
This statement is only true for a theoretically ideal resistor. In reality, a real resistor never produces a perfectly straight-line I-V graph.

Real resistors have thresholds (upper limits) of power handling, maximum voltage and maximum current within which the resistor behaves in a reasonably close approximation to the ideal I-V graph. The actual I-V graph for any given resistor is temperature dependent, and that resistance is a function of both the ambient conditions and the heat generated by the power dissipated in the resistor when a current flows through it. So it is definitely not a straight line, but within the manufacturer's stated limits the actual resistance will be a reasonably good approximation to the ideal I-V characteristic.

For instance, the manufacturer will design a resistor for a given power handling quoted in watts, e.g. 0.25 W, 0.5 W, 10 W, etc. There will also be a tolerance rating for the resistance value, e.g. 330 ohms +/- 10%, together with a maximum voltage and current limit.

This means that the manufacturer guarantees the actual resistance will be within the limits of 297 ohms and 363 ohms, as long as the power dissipated and the resistor temperature do not exceed the stated limits.

It also means that as the power or ambient temperature changes, the actual resistance will also change, but it still remains within the manufacturer's stated limits.
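As a quick check of those figures (the 0.25 W rating below is just an assumed example, not tied to any particular part):

import math

# Tolerance band and power-rating limit for a nominal 330 ohm resistor (assumed 0.25 W rating).
R_nominal = 330.0
tolerance = 0.10                          # +/- 10%
R_min = R_nominal * (1 - tolerance)       # 297 ohms
R_max = R_nominal * (1 + tolerance)       # 363 ohms

P_rating = 0.25                           # assumed power rating in watts
I_max = math.sqrt(P_rating / R_nominal)   # largest continuous current, from P = I^2 * R

print(f"guaranteed range: {R_min:.0f} to {R_max:.0f} ohms")
print(f"max continuous current at {P_rating} W: {I_max*1000:.1f} mA")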

Original post by MEPS1996
However, I would imagine as current increases, the temperature of the resistor would increase significantly, so how can this be?
You are quite correct. As temperature increases, the kinetic motion of the resistor's atoms increases and the probability of scattering transiting electrons (and therefore of those electrons giving up energy) increases. Consequently the resistance rises; that is, the actual resistance increases as a function of temperature.
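A minimal sketch of that feedback, with all numbers assumed for illustration (a fixed applied voltage, the linear R(T) model from above, and an assumed thermal resistance): the resistor settles at a temperature where the heat it dissipates has already raised its resistance, which is why a measured I-V curve bends away from the ideal straight line at higher currents.

# Self-heating of a resistor driven from a fixed voltage (all values assumed).
V = 10.0               # applied voltage in volts
R0, T0 = 330.0, 20.0   # nominal resistance (ohms) at the reference temperature (deg C)
alpha = 0.0039         # assumed temperature coefficient, per deg C
theta = 100.0          # assumed thermal resistance, deg C per watt
T_ambient = 20.0

T = T_ambient
for _ in range(50):                     # iterate towards the steady state
    R = R0 * (1 + alpha * (T - T0))     # resistance at the current temperature
    P = V**2 / R                        # power dissipated at this operating point
    T = T_ambient + P * theta           # temperature produced by that dissipation

print(f"settled at R = {R:.1f} ohms, T = {T:.1f} C, I = {V/R*1000:.1f} mA")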
(edited 10 years ago)
Reply 5
Original post by uberteknik
This statement is only true for a theoretically ideal resistor. In reality, a real resistor never produces a perfectly straight-line I-V graph.

Real resistors have thresholds (upper limits) of power handling, maximum voltage and maximum current within which the resistor behaves in a reasonably close approximation to the ideal I-V graph. The actual I-V graph for any given resistor is temperature dependent, and that resistance is a function of both the ambient conditions and the heat generated by the power dissipated in the resistor when a current flows through it. So it is definitely not a straight line, but within the manufacturer's stated limits the actual resistance will be a reasonably good approximation to the ideal I-V characteristic.

For instance, the manufacturer will design a resistor for a given power handling quoted in watts, e.g. 0.25 W, 0.5 W, 10 W, etc. There will also be a tolerance rating for the resistance value, e.g. 330 ohms +/- 10%, together with a maximum voltage and current limit.

This means that the manufacturer guarantees the actual resistance will be within the limits of 297 ohms and 363 ohms, as long as the power dissipated and the resistor temperature do not exceed the stated limits.

It also means that as the power or ambient temperature changes, the actual resistance will also change, but it still remains within the manufacturer's stated limits.

You are quite correct. As temperature increases, the kinetic motion of the resistor's atoms increases and the probability of scattering transiting electrons (and therefore of those electrons giving up energy) increases. Consequently the resistance rises; that is, the actual resistance increases as a function of temperature.

Very good answer, thank you very much.
