# Statistics


I don't know much about stats, so this may be a stupid question.

Say we had some data: the number of lives lost playing level 99 in a game.

You calculate the average lives lost to be 6.

But if the standard deviation is high, is this average unreliable?

Thanks!



#2

(Original post by **Zenarthra**)
I don't know much about stats, so this may be a stupid question.

Say we had some data: the number of lives lost playing level 99 in a game.

You calculate the average lives lost to be 6.

But if the standard deviation is high, is this average unreliable?

Thanks!

What do you mean by "high"?

The average isn't "unreliable" - the average just tells you one piece of information which may or may not be useful (in technical jargon it's an example of the "central tendency" of the data, something that tells you what a "typical" value should look like).

The standard deviation is one measure of "spread" or "dispersion" of your data, so is telling you something different about the data.

The two measures don't contradict each other; they give you different pieces of information!
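To see those two different pieces of information side by side, here's a quick sketch in Python. The numbers are invented for illustration (they aren't from anyone's actual game data): two "lives lost" data sets with the same mean of 6 but very different spreads.

```python
import statistics

# Two made-up "lives lost" data sets, both with mean 6.
tight = [5, 6, 6, 6, 7, 6]    # values cluster close to the mean
wide = [0, 12, 1, 11, 2, 10]  # same mean, far more spread out

for name, data in (("tight", tight), ("wide", wide)):
    mean = statistics.mean(data)
    sd = statistics.stdev(data)  # sample standard deviation
    print(f"{name}: mean={mean:.1f}, sd={sd:.2f}")
```

Both data sets report a mean of 6.0, but their standard deviations differ by nearly a factor of ten - the mean alone couldn't have told you that.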


#3

(Original post by **Zenarthra**)
I don't know much about stats, so this may be a stupid question.

Say we had some data: the number of lives lost playing level 99 in a game.

You calculate the average lives lost to be 6.

But if the standard deviation is high, is this average unreliable?

Thanks!

From a sciencey perspective, how reliable the average is would probably only be related to the number of repeats you did.

The standard deviation being high would reduce your probability of being spot on the mean, but if you had enough repeats, your mean will be the correct 'best guess' for making predictions.

Do you have a particular context in mind? Standard deviation is simply a measure of how far the average result is from the mean (I'm not sure it literally works out to be that, but that's what it's describing).
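The "enough repeats" point can be sketched numerically. Assuming a hypothetical population where lives lost is equally likely to be any whole number from 0 to 12 (so the true mean is 6 and the spread is large), the sample mean settles closer and closer to 6 as the number of repeats grows:

```python
import random
import statistics

random.seed(42)  # reproducible illustration

# Hypothetical population: lives lost is uniform on 0..12, so the true mean is 6.
def sample_mean(n):
    """Mean of n simulated playthroughs."""
    return statistics.mean(random.randint(0, 12) for _ in range(n))

for n in (10, 100, 10_000):
    print(f"{n:>6} repeats: sample mean = {sample_mean(n):.2f}")
```

The typical error of the sample mean shrinks roughly like sd/√n (the "standard error"), which is why more repeats make the average a more reliable estimate even when the spread of individual results stays large.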

#4

(Original post by **davros**)
What do you mean by "high"?

The average isn't "unreliable" - the average just tells you one piece of information which may or may not be useful (in technical jargon it's an example of the "central tendency" of the data, something that tells you what a "typical" value should look like).

The standard deviation is one measure of "spread" or "dispersion" of your data, so is telling you something different about the data.

The two measures don't contradict each other; they give you different pieces of information!

(Original post by **lerjj**)
From a sciencey perspective, how reliable the average is would probably only be related to the number of repeats you did.

The standard deviation being high would reduce your probability of being spot on the mean, but if you had enough repeats, your mean will be the correct 'best guess' for making predictions.

Do you have a particular context in mind? Standard deviation is simply a measure of how far the average result is from the mean (I'm not sure it literally works out to be that, but that's what it's describing).

Well I'm confused then, because I thought the average is a measure of an **exact value**.

For example, take the speeds of 20,000 cars on a 90mph motorway over 7 days; the average is calculated to be 93mph from all those data.

Some cars will be faster than this speed and some slower.

I thought the standard deviation was how much spread there is in a set of data about the average.

If the spread of the data is low, then wouldn't 93mph be an accurate estimate of the actual value?

Thanks!


#5

(Original post by **Zenarthra**)
Well I'm confused then, because I thought the average is a measure of an **exact value**.

For example, take the speeds of 20,000 cars on a 90mph motorway over 7 days; the average is calculated to be 93mph from all those data.

Some cars will be faster than this speed and some slower.

I thought the standard deviation was how much spread there is in a set of data about the average.

If the spread of the data is low, then wouldn't 93mph be an accurate estimate of the actual value?

Thanks!

What do you mean by "an exact value"?

An "average" (there are several types!) is just a single number (or possibly a category) that is supposed to represent your data in some sense. Unless all your data points are identical, there is no way that one single number can represent all of them! Also, for categorical data, e.g. eye colour, you can't even calculate a standard deviation.

There is another concept in statistics which is about assuming your data come from an underlying population that has a particular set of parameters (e.g. mean and s.d.) and then testing how good an estimate of those parameters you can get from a sample of data - is that what you were thinking of?
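A hedged sketch of that second idea, with all numbers invented: pretend the "population" is 20,000 motorway speeds generated with a known mean (93) and s.d. (5), draw a modest sample, and see how well the sample estimates those population parameters.

```python
import random
import statistics

random.seed(1)  # reproducible illustration

# Invented population: 20,000 speeds, roughly normal around 93 mph, sd 5.
population = [random.gauss(93, 5) for _ in range(20_000)]

# Estimate the population parameters from a sample of just 200 cars.
sample = random.sample(population, 200)
print(f"sample mean = {statistics.mean(sample):.1f}")   # lands close to 93
print(f"sample sd   = {statistics.stdev(sample):.1f}")  # lands close to 5
```

A large part of statistics is about quantifying exactly *how* close such estimates are likely to be (standard errors, confidence intervals), rather than treating the sample mean as an exact value.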

#6

(Original post by **davros**)
What do you mean by "an exact value"?

An "average" (there are several types!) is just a single number (or possibly a category) that is supposed to represent your data in some sense. Unless all your data points are identical, there is no way that one single number can represent all of them! Also, for categorical data, e.g. eye colour, you can't even calculate a standard deviation.

There is another concept in statistics which is about assuming your data come from an underlying population that has a particular set of parameters (e.g. mean and s.d.) and then testing how good an estimate of those parameters you can get from a sample of data - is that what you were thinking of?

I thought about averages when doing an experiment, but never mind.

Yeah, could you elaborate on the second paragraph please, like with an example?


#7

(Original post by **Zenarthra**)
I thought about averages when doing an experiment, but never mind.

Yeah, could you elaborate on the second paragraph please, like with an example?


#8

(Original post by **Zenarthra**)
Well I'm confused then, because I thought the average is a measure of an **exact value**.

For example, take the speeds of 20,000 cars on a 90mph motorway over 7 days; the average is calculated to be 93mph from all those data.

Some cars will be faster than this speed and some slower.

I thought the standard deviation was how much spread there is in a set of data about the average.

If the spread of the data is low, then wouldn't 93mph be an accurate estimate of the actual value?

Thanks!

But suppose there was a lot of spread.

What value are you going to bet on? It's still going to be 93mph, because that minimises how far off you are likely to be - 93 is the average value. The spread tells you how badly you are going to do even if you take the optimal strategy; the average gives you that optimal strategy.

So you could say that a higher standard deviation means less reliable data (more random), but your mean is just as accurate.

Hope that helps.
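That "best bet" claim can be checked directly: for average *squared* error, no single guess beats the mean. A small sketch with invented speeds whose mean is exactly 93:

```python
import statistics

# Invented speeds with plenty of spread; their mean is exactly 93.
speeds = [70, 85, 88, 93, 93, 96, 101, 118]

def mse(guess):
    # Average squared error if you always bet on `guess`.
    return statistics.mean((x - guess) ** 2 for x in speeds)

mean = statistics.mean(speeds)
print(mse(mean))         # smallest achievable average squared error
print(mse(90), mse(96))  # any other guess does worse
```

The large `mse(mean)` here reflects the spread (you *will* often be far off), but every other guess does at least as badly - which is exactly the distinction between the spread and the optimal strategy made above.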

