
Precision vs Sensitivity

What's the difference between the precision and the sensitivity of a measuring instrument?

I thought that precision has something to do with the least count of the measuring device: a more precise one has a smaller least count. For example, a micrometer is more precise than a metre rule?

But the definitions for sensitivity exactly match what I thought of as the precision.

And what's more confusing is that precision is linked to variance on some websites/textbooks, but I thought that had something to do with reliability?

Someone answer as soon as possible :smile:
Check this question in one of the papers: [attached images: pp.JPG, p.JPG]
I think at this level, we're not supposed to know any further than this definition. I dunno really.
Reply 2
Original post by Daniel Atieh
Check this question in one of the papers: [attached images: pp.JPG, p.JPG]
I think at this level, we're not supposed to know any further than this definition. I dunno really.


Yes but that could be said of the sensitivity too, couldn't it?
Original post by RoyalBlue7
Yes but that could be said of the sensitivity too, couldn't it?

That's what I also think. I think that's right.
Original post by RoyalBlue7
What's the difference between the precision and the sensitivity of a measuring instrument?

I thought that precision has something to do with the least count of the measuring device: a more precise one has a smaller least count. For example, a micrometer is more precise than a metre rule?

But the definitions for sensitivity exactly match what I thought of as the precision.

And what's more confusing is that precision is linked to variance on some websites/textbooks, but I thought that had something to do with reliability?

Someone answer as soon as possible :smile:


Accuracy refers to the value of the actual measurement and how closely it can be read against the measuring device with which the measurement is made.
Sensitivity refers to the ability of the measuring device itself.
Precision refers to how close to the reference worldwide standard the measurement is made.

For instance: A measurement can be made to an accuracy of +/- 1 cm.
But the scale may have a sensitivity of 1 mm.
It's up to the scientist making the measurement to define the accuracy. The method they choose determines how sensitive the measurement can be.
Precision defines how close that measurement concurs with the reference SI standard.

Think about what measurements and specifically dimensions mean:

The SI base units are all reference standards. Every measurement in physics is with respect to those standards:

Length: Metre
Mass: Kilogram
Time: Second
etc.

But how are those standards defined? What makes them the reference for all measurements wherever and however they are made?

Take length as an example. Up until 50 years ago, all length measurements were made w.r.t. a reference standard length of a bar of platinum-iridium kept under tightly controlled temperature and pressure conditions.

The bar length had an uncertainty because of its thermal expansion properties.

All length measurements for the metre throughout the world using the SI standard were made w.r.t. that bar of precious alloy.

Accuracy is then defined as how close any given measurement can be made when compared with the reference standard. With accuracy comes uncertainty: random errors, systematic errors etc.

If the reference standard is a metre rule, then the accuracy will be how close the measurement can be made to that standard, e.g. anything measured against the mm scale may be accurate to +/- 0.5 mm if the measurement is rounded up or down to the nearest mm mark. That call is made by how confident the scientist judges his measurement ability to be.
The naked eye will be more prone to error than, say, a reading taken through a magnifying glass - accuracy is the call of the scientist.

This is not the same as sensitivity, which refers to the capability of the measuring device itself.

A 2 d.p. voltmeter may have a sensitivity of +/- 0.01 V (one least significant digit), but the measurement may still only have a precision of +/- 0.1 V, because that is all the manufacturer will guarantee as to how close to the SI reference standard any measurement carried out on that instrument can theoretically be made.
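
To put rough numbers on that voltmeter example, here is a minimal Python sketch (the 0.01 and 0.1 figures are the ones quoted above; the 5.037 V "true" value is invented purely for illustration):

# Sketch of the 2 d.p. voltmeter described above.
true_voltage = 5.037               # volts; invented "true" value we are trying to measure
display_resolution = 0.01          # volts; smallest step the display can show (the "sensitivity" above)
guaranteed_error = 0.10            # volts; worst-case deviation the manufacturer will guarantee

# The meter displays to the nearest 0.01 V...
reading = round(true_voltage / display_resolution) * display_resolution   # e.g. 5.04 V on the screen
# ...but the true value is only guaranteed to lie within +/- 0.1 V of that reading.
lower, upper = reading - guaranteed_error, reading + guaranteed_error
print(f"display shows {reading:.2f} V; true value only guaranteed between {lower:.2f} V and {upper:.2f} V")
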
Original post by RoyalBlue7
What's the difference between the precision and the sensitivity of a measuring instrument?

I thought that precision has something to do with the least count of the measuring device: a more precise one has a smaller least count. For example, a micrometer is more precise than a metre rule?

But the definitions for sensitivity exactly match what I thought of as the precision.

And what's more confusing is that precision is linked to variance on some websites/textbooks, but I thought that had something to do with reliability?

Someone answer as soon as possible :smile:


IIRC, sensitivity is the smallest division, but precision is how close it is to the 'true' value (so it could display to 0.01 mm but still be up to +/- 0.1 mm out)
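
The same point in a couple of lines of Python (the 4.567 mm reading is invented; the 0.01 mm and 0.1 mm figures are the ones in the post):

# Resolution (smallest displayed step) is not the same as how far out the reading may be.
displayed = round(4.567, 2)        # mm; shown to 0.01 mm resolution -> 4.57
max_error = 0.1                    # mm; the reading could still be this far from the true value
print(f"display: {displayed:.2f} mm; true value somewhere between {displayed - max_error:.2f} and {displayed + max_error:.2f} mm")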

Reply 6
Thank you

I shall read them closely :smile:

Reply 7
Original post by uberteknik
Accuracy refers to the value of the actual measurement and how closely it can be read against the measuring device with which the measurement is made.
Sensitivity refers to the ability of the measuring device itself.
Precision refers to how close to the reference worldwide standard the measurement is made.

For instance: A measurement can be made to an accuracy of +/- 1 cm.
But the scale may have a sensitivity of 1 mm.
It's up to the scientist making the measurement to define the accuracy. The method they choose determines how sensitive the measurement can be.
Precision defines how close that measurement concurs with the reference SI standard.

Think about what measurements and specifically dimensions mean:

The SI base units are all reference standards. Every measurement in physics is with respect to those standards:

Length: Metre
Mass: Kilogram
Time: Second
etc.

But how are those standards defined? What makes them the reference for all measurements wherever and however they are made?

Take length as an example. Up until 50 years ago, all length measurements were made w.r.t. a reference standard length of a bar of platinum-iridium kept under tightly controlled temperature and pressure conditions.

The bar length had an uncertainty because of its thermal expansion properties.

All length measurements for the metre throughout the world using the SI standard were made w.r.t. that bar of precious alloy.

Accuracy is then defined as how close any given measurement can be made when compared with the reference standard. With accuracy comes uncertainty: random errors, systematic errors etc.

If the reference standard is a metre rule, then the accuracy will be how close the measurement can be made to that standard, e.g. anything measured against the mm scale may be accurate to +/- 0.5 mm if the measurement is rounded up or down to the nearest mm mark. That call is made by how confident the scientist judges his measurement ability to be.
The naked eye will be more prone to error than, say, a reading taken through a magnifying glass - accuracy is the call of the scientist.

This is not the same as sensitivity, which refers to the capability of the measuring device itself.

A 2 d.p. voltmeter may have a sensitivity of +/- 0.01 V (one least significant digit), but the measurement may still only have a precision of +/- 0.1 V, because that is all the manufacturer will guarantee as to how close to the SI reference standard any measurement carried out on that instrument can theoretically be made.


Thank you for replying! Your post clears things up a lot... but I still want to be sure.

Accuracy refers to the value of the actual measurement and how closely it can be read against the measuring device with which the measurement is made.


How do we measure accuracy? Do we give it as an absolute or relative uncertainty?

Sensitivity refers to the ability of the measuring device itself.
Precision refers to how close to the reference worldwide standard the measurement is made.


So this is what majmuh also said, wasn't it? That sensitivity is independent of the measurement taken and is an innate property of the measuring instrument. Do we measure sensitivity by the least count?

For instance: A measurement can be made to an accuracy of +/- 1 cm.
But the scale may have a sensitivity of 1 mm.


But the Edexcel past papers always use the word precision to describe this. If you look at the question that Daniel Atieh posted above...

Accuracy is then defined as how close any given measurement can be made when compared with the reference standard. With accuracy comes uncertainty: random errors, systematic errors etc.


Do you mean precision?

If the reference standard is a metre rule, then the accuracy will be how close the measurement can be made to that standard, e.g. anything measured against the mm scale may be accurate to +/- 0.5 mm if the measurement is rounded up or down to the nearest mm mark.


But the uncertainty in a measurement made with a metre rule has always been +/- 1 mm in the Edexcel past papers and mark schemes. My physics teacher explained that it should always be the least count, but I always thought that the range in which the value could lie could be expressed as +/- 0.5 mm. Perhaps it is because two readings are taken?
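
For what it's worth, here is that "two readings" suggestion put into a short Python sketch (one common justification, assuming each end of the object is read to the nearest mm mark; it is not necessarily the examiner's official reasoning):

# A length measured with a rule is the difference of two scale readings,
# each taken to the nearest mm, so each carries +/- 0.5 mm and the uncertainties add.
reading_uncertainty = 0.5                                        # mm, half a division at each end
length_uncertainty = reading_uncertainty + reading_uncertainty   # worst-case uncertainty of a difference
print(f"uncertainty in (end reading - start reading) = +/- {length_uncertainty} mm")   # +/- 1.0 mm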

Original post by RoyalBlue7
What's the difference between the precision and the sensitivity of a measuring instrument?

I thought that precision has something to do with the least count of the measuring device: a more precise one has a smaller least count. For example, a micrometer is more precise than a metre rule?

But the definitions for sensitivity exactly match what I thought of as the precision.

And what's more confusing is that precision is linked to variance on some websites/textbooks, but I thought that had something to do with reliability?

Someone answer as soon as possible :smile:



Accuracy: how close your measurement is to the "true" value.
Precision/resolution: for an instrument, this will usually mean the smallest division on the scale or the smallest change in a value that it can measure.
Precision: for a set of data refers to how closely grouped that data is about the mean value. More tightly grouped means higher precision.
Sensitivity: usually for electronic devices, this relates to the response of the device to a unit change in the input. For a measuring instrument, it relates to the change in the indicated value that is produced by a unit change in the measured value. For example, if you have a thermocouple whose output changes by 1 mV when the temperature changes by 1 deg C, you could say its sensitivity is 1 mV per deg C.
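
A small Python sketch of how the first three definitions could be told apart numerically (the repeat readings and the "true" value below are invented purely for illustration):

# Invented repeat readings of g, used only to illustrate the definitions above.
readings = [9.78, 9.81, 9.79, 9.80, 9.82]    # m/s^2
true_value = 9.81                            # m/s^2, taken as the "true" value here

mean = sum(readings) / len(readings)
spread = max(readings) - min(readings)       # a simple measure of how tightly grouped the data are

print(f"accuracy: the mean is {abs(mean - true_value):.3f} m/s^2 away from the true value")
print(f"precision: the readings are grouped within a range of {spread:.2f} m/s^2")

# Sensitivity, using the thermocouple definition above: output change per unit input change.
sensitivity_mV_per_degC = 1.0
print(f"thermocouple sensitivity = {sensitivity_mV_per_degC} mV per deg C")
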
Reply 9
Original post by Stonebridge
Accuracy: how close your measurement is to the "true" value.
Precision/resolution: for an instrument, this will usually mean the smallest division on the scale or the smallest change in a value that it can measure.
Precision: for a set of data refers to how closely grouped that data is about the mean value. More tightly grouped means higher precision.
Sensitivity: usually for electronic devices, this relates to the response of the device to a unit change in the input. For a measuring instrument, it relates to the change in the indicated value that is produced by a unit change in the measured value. For example, if you have a thermocouple whose output changes by 1 mV when the temperature changes by 1 deg C, you could say its sensitivity is 1 mV per deg C.


Accuracy: how close your measurement is to the "true" value.

No problem.


Precision: for a set of data refers to how closely grouped that data is about the mean value. More tightly grouped means higher precision.

Then what does reliability mean? In Biology, reliability was defined this way.

Precision/resolution: for an instrument, this will usually mean the smallest division on the scale or the smallest change in a value that it can measure.

But, as majmuh implied, isn't that a definition of sensitivity? Is the precision independent of the measurement?

Sensitivity: usually for electronic devices, this relates to the response of the device to a unit change in the input. For a measuring instrument, it relates to the change in the indicated value that is produced by a unit change in the measured value. For example, if you have a thermocouple whose output changes by 1 mV when the temperature changes by 1 deg C, you could say its sensitivity is 1 mV per deg C.

THANK YOU
Original post by majmuh24
IIRC, sensitivity is the smallest division, but precision is how close it is to the 'true' value (so it could display to 0.01 mm but still be up to +/- 0.1 mm out)



That might be sort of wrong.
Accuracy is how close a value is to the 'true' value.
Not precision.
Precision, you can say, is how many significant figures you give it to.

For example, the value of g.

If a bunch of students conduct experiments to find the value of g, one of them obtains a calculated value of 9.5 m/s². Another student gets 9.7 m/s².
These values are calculated equally precisely using the same instruments and experimental set up.
But 9.7 is more accurate than 9.5, because it is closer to the real value (9.8 m/s²).
(Likewise, if someone obtained 9.8, that'd be great, because that's even more accurate. Duh.)


We use 9.8 m/s² as the value of acceleration due to gravity (g) in our calculations.
But we can also use 9.81 m/s² instead, which would be a more precise value for g.
Likewise, 9.806 m/s² would be even more precise.
Precision means more detail. Finer measurements and stuff. Yeah.

But now if you said 9.80 m/s² when rounding off, that wouldn't be accurate, because that's not true, considering the real value. :biggrin:
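
Putting that g example into a few lines of Python (9.81 m/s² is taken as the accepted value; 9.5 and 9.7 are the two student results quoted above):

# Two results quoted to the same precision (2 s.f.) but with different accuracy.
accepted_g = 9.81                  # m/s^2, accepted value used for comparison

for student_value in (9.5, 9.7):
    print(f"{student_value} m/s^2 differs from {accepted_g} m/s^2 by {abs(student_value - accepted_g):.2f} m/s^2")

# 9.7 is the more accurate of the two because its error is smaller; quoting 9.81 rather than
# 9.8 would be the more precise (more detailed) way of stating the accepted value.
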
Original post by raudhaathif
x


That's when you're talking about data; I think it works slightly differently when you're talking about instruments instead.

Original post by RoyalBlue7
Accuracy: how close your measurement is to the "true" value.

No problem.


Precision: for a set of data refers to how closely grouped that data is about the mean value. More tightly grouped means higher precision.

Then what does reliability mean? In Biology, reliability was defined this way.

Precision/resolution: for an instrument, this will usually mean the smallest division on the scale or the smallest change in a value that it can measure.

But, as majmuh implied, isn't that a definition of sensitivity? Is the precision independent of the measurement?

Sensitivity: usually for electronic devices, this relates to the response of the device to a unit change in the input. For a measuring instrument, it relates to the change in the indicated value that is produced by a unit change in the measured value. For example, if you have a thermocouple whose output changes by 1 mV when the temperature changes by 1 deg C, you could say its sensitivity is 1 mV per deg C.

THANK YOU


Unfortunately these words are used with differing meanings in different contexts. They are also interrelated.
We are talking A Level physics, right? Not biology.
Reliability of data refers to the fact that you get the same or similar results on repeated measurement under similar conditions.
Make sure you understand precision/resolution/accuracy for your practical physics.
Precision is not the same as sensitivity if you define sensitivity as I did above.
The thermocouple had a sensitivity of, say, 1.0 mV per deg C.
This does not tell me directly the precision or resolution of a measurement using it.
If, however, I used a voltmeter that could read the emf of the thermocouple to ±0.5 mV, then the precision of my temperature reading with that thermocouple and meter would be ±0.5 deg C.
So clearly precision depends on sensitivity. The more sensitive the thermocouple, the more precise the reading. As I say, these things all tend to be interrelated.
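
Spelling out the thermocouple arithmetic from that last example in Python (values exactly as given in the post):

# How the precision of the temperature reading follows from the sensitivity.
sensitivity = 1.0            # mV per deg C: thermocouple output change per unit temperature change
emf_precision = 0.5          # mV: how closely the voltmeter can read the thermocouple emf

temperature_precision = emf_precision / sensitivity
print(f"temperature reading precision = +/- {temperature_precision} deg C")   # +/- 0.5 deg C
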
THANK YOU prsom.
