MEPS1996
Badges: 4
#1
Report Thread starter 5 years ago
I know resistors are supposed to have the same resistance regardless of the current, so long as temperature doesn't change. However, I would imagine as current increases, the temperature of the resistor would increase significantly, so how can this be?
interstitial
Badges: 18
#2
Report 5 years ago
(Original post by MEPS1996)
I know resistors are supposed to have the same resistance regardless of the current, so long as temperature doesn't change. However, I would imagine as current increases, the temperature of the resistor would increase significantly, so how can this be?
Current is the flow of charge (carried by electrons in a metallic conductor). When there is more current, more electrons are flowing and they 'bump into' the atoms of the wire more often, causing it to heat up.

Posted from TSR Mobile
MEPS1996
Badges: 4
#3
Report Thread starter 5 years ago
(Original post by majmuh24)
Current is the flow of charge (carried by electrons in a metallic conductor). When there is more current, more electrons are flowing and they 'bump into' the atoms of the wire more often, causing it to heat up.

Posted from TSR Mobile
My question comes from the fact that the condition of constant temperature for a resistor to have constant resistance implies that an increase in temperature causes an increase in resistance. Couple this with the fact that current raises the temperature, and yet the resistance is supposed to be the same for all currents, since the I-V graph is a straight line.
interstitial
Badges: 18
#4
Report 5 years ago
(Original post by MEPS1996)
My question comes from the fact that the condition of constant temperature for a resistor to have constant resistance implies that an increase in temperature causes an increase in resistance. Couple this with the fact that current raises the temperature, and yet the resistance is supposed to be the same for all currents, since the I-V graph is a straight line.
The linear relationship between current and voltage only holds for a resistor at a constant temperature.

According to Joule's first law, passing a current through a conductor releases heat: the power dissipated is P = I²R.

http://en.m.wikipedia.org/wiki/Joule%27s_first_law
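As a quick illustration of the law (the 50 mA and 330-ohm figures are my own example numbers, not from the thread):

```python
# Joule's first law: power dissipated as heat in a conductor is P = I^2 * R.
def joule_power(current_a, resistance_ohm):
    """Heat dissipated (watts) for a given current (amps) and resistance (ohms)."""
    return current_a ** 2 * resistance_ohm

# 50 mA through a 330-ohm resistor:
print(round(joule_power(0.05, 330.0), 3))  # 0.825 -- too much for a 0.25 W part
```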

A better way to think about it is in terms of resistivity, which is temperature dependent; resistance is therefore temperature dependent too.
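As a rough sketch of that temperature dependence, assuming the standard linear model R(T) = R0(1 + α(T − T0)) and copper's coefficient α ≈ 0.00393 per °C (both textbook values, not from this thread):

```python
# Linear temperature model of resistance: R(T) = R0 * (1 + alpha * (T - T0)),
# where alpha is the temperature coefficient of resistivity (per degree C).

def resistance_at(R0, alpha, T, T0=20.0):
    """Resistance at temperature T, given R0 measured at reference temperature T0."""
    return R0 * (1 + alpha * (T - T0))

# A 100-ohm copper winding warmed from 20 C to 80 C (a 60 C rise):
R_hot = resistance_at(100.0, 0.00393, 80.0)
print(round(R_hot, 2))  # 123.58 -- about 24% higher than at room temperature
```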

Posted from TSR Mobile
uberteknik
  • Study Helper
Badges: 21
#5
Report 5 years ago
(Original post by MEPS1996)
I know resistors are supposed to have the same resistance regardless of the current, so long as temperature doesn't change.
This statement is only true for a theoretically ideal resistor. In reality, no real resistor achieves a perfectly straight-line I-V graph.

Real resistors have thresholds (upper limits) on power handling, maximum voltage and maximum current within which the resistor behaves as a reasonably close approximation to the ideal I-V graph. The actual I-V characteristic of any given resistor is temperature dependent, and that temperature is a function of both the ambient conditions and the heat generated by the power dissipated in the resistor when current flows through it. So it is definitely not a straight line, but within the manufacturer's stated limits the actual resistance will be a reasonably good approximation to the ideal I-V characteristic.

For instance, the manufacturer will design a resistor for a given power handling quoted in watts, e.g. 0.25 W, 0.5 W, 10 W. There will also be a tolerance rating on the resistance value, e.g. 330 ohms +/- 10%, together with maximum voltage and current limits.

This means that the manufacturer guarantees the actual resistance will stay between 297 ohms and 363 ohms, as long as the power dissipated and the resistor temperature do not exceed the stated limits.
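The arithmetic behind that band, sketched out (the helper name `tolerance_band` is my own):

```python
# Tolerance band for a resistor: nominal value +/- a tolerance fraction.
def tolerance_band(nominal_ohm, tol):
    """Return the (low, high) guaranteed resistance range."""
    return nominal_ohm * (1 - tol), nominal_ohm * (1 + tol)

low, high = tolerance_band(330.0, 0.10)
print(round(low), round(high))  # 297 363 -- matching the figures above
```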

It also means that as the dissipated power or the ambient temperature changes, the actual resistance will also change, but will still remain within the manufacturer's stated limits.

(Original post by MEPS1996)
However, I would imagine as current increases, the temperature of the resistor would increase significantly, so how can this be?
You are quite correct. As temperature increases, the kinetic motion of the resistor's atoms increases, and so does the probability of scattering transiting electrons (those electrons thereby giving up energy). Consequently the resistance rises; that is, the actual resistance increases as a function of temperature.
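That feedback loop (current heats the resistor, which raises its resistance, which in turn changes the dissipated power) settles to a steady state. A minimal sketch, with every component value (330 ohms, α = 0.001/°C, thermal resistance 100 °C/W, 10 V supply) chosen purely for illustration:

```python
# Self-heating feedback: dissipated power warms the resistor, which raises its
# resistance, which changes the power. Iterate to a steady state using
#   T = T_ambient + theta * P      (theta: thermal resistance, degC per watt)
#   R = R0 * (1 + alpha * (T - T0))

def settle(V, R0, alpha, theta, T_amb=20.0, T0=20.0, iters=50):
    """Fixed-point iteration for the equilibrium resistance and temperature."""
    R, T = R0, T_amb
    for _ in range(iters):
        P = V ** 2 / R                   # power dissipated at current resistance
        T = T_amb + theta * P            # resulting steady-state temperature
        R = R0 * (1 + alpha * (T - T0))  # resistance at that temperature
    return R, T

R_final, T_final = settle(V=10.0, R0=330.0, alpha=0.001, theta=100.0)
print(round(R_final, 2), round(T_final, 1))
```

The loop converges within a few iterations here: the resistor ends up a few ohms above its nominal value, warm but stable.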
MEPS1996
Badges: 4
#6
Report Thread starter 5 years ago
(Original post by uberteknik)
This statement is only true for a theoretically ideal resistor. In reality, no real resistor achieves a perfectly straight-line I-V graph.

Real resistors have thresholds (upper limits) on power handling, maximum voltage and maximum current within which the resistor behaves as a reasonably close approximation to the ideal I-V graph. The actual I-V characteristic of any given resistor is temperature dependent, and that temperature is a function of both the ambient conditions and the heat generated by the power dissipated in the resistor when current flows through it. So it is definitely not a straight line, but within the manufacturer's stated limits the actual resistance will be a reasonably good approximation to the ideal I-V characteristic.

For instance, the manufacturer will design a resistor for a given power handling quoted in watts, e.g. 0.25 W, 0.5 W, 10 W. There will also be a tolerance rating on the resistance value, e.g. 330 ohms +/- 10%, together with maximum voltage and current limits.

This means that the manufacturer guarantees the actual resistance will stay between 297 ohms and 363 ohms, as long as the power dissipated and the resistor temperature do not exceed the stated limits.

It also means that as the dissipated power or the ambient temperature changes, the actual resistance will also change, but will still remain within the manufacturer's stated limits.

You are quite correct. As temperature increases, the kinetic motion of the resistor's atoms increases, and so does the probability of scattering transiting electrons (those electrons thereby giving up energy). Consequently the resistance rises; that is, the actual resistance increases as a function of temperature.
Very good answer, thank you very much.