# Resistors and temperature


I know resistors are supposed to have the same resistance regardless of the current, so long as temperature doesn't change. However, I would imagine as current increases, the temperature of the resistor would increase significantly, so how can this be?

#2

(Original post by **MEPS1996**) I know resistors are supposed to have the same resistance regardless of the current, so long as temperature doesn't change. However, I would imagine as current increases, the temperature of the resistor would increase significantly, so how can this be?

Current is the flow of charge (carried by electrons in a metallic conductor), and when there is more current, more electrons are flowing and they 'bump into' the atoms of the wire more, causing it to heat up.

Posted from TSR Mobile
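The link between current and heating is Joule's law: the power dissipated in a resistance R carrying current I is P = I²R, so doubling the current quadruples the heat generated. A minimal sketch in plain Python (the 330-ohm value is just an illustrative assumption, and the resistance is treated as fixed):

```python
# Joule heating: power dissipated in a resistor is P = I^2 * R.
# Illustrative values only; assumes the resistance stays fixed at R.

R = 330.0  # example resistance in ohms

def power_dissipated(current_amps, resistance_ohms=R):
    """Return the power (in watts) converted to heat: P = I^2 * R."""
    return current_amps ** 2 * resistance_ohms

for i_ma in (10, 20, 40):           # test currents in milliamps
    i = i_ma / 1000.0
    print(f"{i_ma} mA -> {power_dissipated(i):.3f} W")
# Each doubling of the current quadruples the dissipated power.
```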

#3

(Original post by **majmuh24**) Current is the flow of charge (carried by electrons in a metallic conductor), and when there is more current, more electrons are flowing and they 'bump into' the atoms of the wire more, causing it to heat up.

My question is to do with the fact that the condition of constant temperature of a resistor for it to have constant resistance implies that increases in temperature cause an increase in resistance. This, coupled with the fact that current increases the temperature, seems to contradict the claim that resistance is the same for all currents, since the I-V graph is a straight line.

#4

(Original post by **MEPS1996**) My question is to do with the fact that the condition of constant temperature of a resistor for it to have constant resistance implies that increases in temperature cause an increase in resistance. This, coupled with the fact that current increases the temperature, seems to contradict the claim that resistance is the same for all currents, since the I-V graph is a straight line.

According to Joule's first law, the passing of current through a conductor releases heat.

http://en.m.wikipedia.org/wiki/Joule%27s_first_law

A better way to think about it is through resistivity, which is temperature dependent; resistance is therefore temperature dependent too.

Posted from TSR Mobile
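The temperature dependence can be made concrete with the common linear approximation R(T) ≈ R₀(1 + α(T − T₀)), where α is the material's temperature coefficient of resistivity. A small sketch, assuming a value of roughly 3.9 × 10⁻³ per °C for copper and example operating points:

```python
# Linear approximation: R(T) = R0 * (1 + alpha * (T - T0)).
# alpha is the temperature coefficient of resistivity; ~3.9e-3 per degC
# is roughly right for copper (an assumed, illustrative value).

ALPHA_COPPER = 3.9e-3

def resistance_at(temp_c, r0_ohms, t0_c=20.0, alpha=ALPHA_COPPER):
    """Resistance at temp_c, given r0_ohms measured at reference t0_c."""
    return r0_ohms * (1 + alpha * (temp_c - t0_c))

# A 100-ohm copper element warming from 20 C to 70 C:
print(resistance_at(70.0, 100.0))   # about 119.5 ohms
```

Other materials have very different coefficients (some alloys are deliberately engineered so that α is close to zero, which is why precision resistors hold their value so well).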

#5

(Original post by **MEPS1996**) I know resistors are supposed to have the same resistance regardless of the current, so long as temperature doesn't change.

This statement is only true for a theoretically ideal resistor. The straight-line I-V graph is, in reality, not achieved by a real resistor.

Real resistors have thresholds (upper limits) on power handling, maximum voltage and maximum current within which the resistor behaves in a reasonably close approximation to the ideal I-V graph. The actual I-V graph for any given resistor is temperature dependent, and the resistance is a function of both the ambient conditions and the heat generated by the power dissipated in the resistor when a current flows through it. So it is definitely not a straight line, but within the manufacturer's stated limits the actual resistance will be a reasonably good approximation to the ideal I-V characteristic.

For instance, the manufacturer will design a resistor for a given power handling quoted in watts, e.g. 0.25 W, 0.5 W, 10 W. There will also be a tolerance rating for the resistance value, e.g. 330 ohms +/- 10%, together with maximum voltage and current limits.

This means that the manufacturer guarantees the actual resistance will be within the limits of 297 ohms and 363 ohms, as long as the power dissipated and the resistor temperature do not exceed the stated limits.

It also means that as the power dissipated or the ambient temperature changes, the actual resistance will also change, but it still remains within the manufacturer's stated limits.

(Original post by **MEPS1996**) However, I would imagine as current increases, the temperature of the resistor would increase significantly, so how can this be?

You are quite correct. As temperature increases, the kinetic motion of the resistor atoms increases, and the probability of their capturing transiting electrons (those electrons thereby giving up energy) increases. Consequently the resistance rises: the actual resistance increases as a function of temperature.
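The 297-363 ohm band quoted above is just the nominal value plus or minus its percentage tolerance; a quick sketch of that arithmetic (helper name is hypothetical):

```python
def tolerance_band(nominal_ohms, tolerance_pct):
    """Return (low, high) guaranteed resistance limits for a % tolerance."""
    delta = nominal_ohms * tolerance_pct / 100.0
    return nominal_ohms - delta, nominal_ohms + delta

low, high = tolerance_band(330.0, 10.0)
print(low, high)   # 297.0 363.0
```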

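One practical consequence of the power rating discussed above: rearranging P = I²R gives I_max = sqrt(P_rated / R), the largest continuous current the resistor can safely carry. A sketch under that assumption (example values, not from any datasheet):

```python
import math

def max_safe_current(p_rated_watts, resistance_ohms):
    """Largest continuous current (amps) keeping I^2 * R below the rating."""
    return math.sqrt(p_rated_watts / resistance_ohms)

# A 0.25 W, 330-ohm resistor:
print(f"{max_safe_current(0.25, 330.0) * 1000:.1f} mA")   # about 27.5 mA
```

In practice designers also derate, i.e. stay well below this figure, precisely because the resistance drifts as the part heats up.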
