What's the difference between the precision and the sensitivity of a measuring instrument?
I thought that precision had something to do with the least count of the measuring device: a more precise instrument has a smaller least count. For example, a micrometer (least count 0.01 mm) is more precise than a meter rule (least count 1 mm), right?
But the definitions I find for sensitivity exactly match what I thought of as precision.
What's even more confusing is that some websites and textbooks link precision to the variance of repeated measurements, but I thought variance had to do with reliability.
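For concreteness, the kind of statements I keep running into look roughly like this (my own paraphrase and notation, not quoted from any single source): precision is quantified by the spread (standard deviation) of repeated readings of the same quantity,

$$ s = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}\left(x_i - \bar{x}\right)^2}, \qquad \bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i, $$

with a smaller $s$ meaning a more precise instrument, while sensitivity is described as the change in the instrument's output reading per unit change in the measured quantity,

$$ S = \frac{\Delta(\text{output reading})}{\Delta(\text{input quantity})}. $$

If that's the distinction, where does the least count actually fit in?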