# Will the s.d. Increase or Decrease? (S1 Edexcel)

If two new values X and Y are added, where the difference between X and the mean is 2 times the s.d. and the difference between Y and the mean is less than 1 s.d., will the s.d. decrease or increase overall?
Original post by Aleksander Krol

What is the definition of the standard deviation (or variance) in terms of the expected difference from the mean?
Original post by mqb2766

s.d. tells us the average of how far each data point is from the mean, right? And variance tells us how much variation there is in the data?
Original post by Aleksander Krol

Variance is just the standard deviation squared, or the standard deviation is just the square root of the variance, so they're essentially the same thing. It's easier to interpret the standard deviation in terms of scaling the data, but it's easier to analyse the variance. The thing I was expecting you to say was something like
variance = average((x - mean)^2)
so it's the expected, or average, squared distance from the mean. Obviously if the standard deviation increases or decreases, so does the variance.

So if you add in a new data point whose squared distance from the mean equals the current variance, how does the variance change? You could think of it as
n * variance = Sum (x - mean)^2
where you have n data points. Now if you add in a new data point whose squared distance from the mean is 4 times the average squared distance (i.e. 4 times the variance), how does it change?

You could put in some simple numbers if you can't make the argument clear.
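For instance, here is a quick Python sketch with some made-up numbers (the data set is my own choice, not from the thread). It checks the average-squared-distance definition and then adds a point that sits 2 s.d. above the mean, so its squared distance is 4 times the variance:

```python
# Population variance as the average squared distance from the mean.
data = [2.0, 4.0, 6.0, 8.0]

def pop_variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

mean0 = sum(data) / len(data)      # 5.0 for this data
var0 = pop_variance(data)          # 5.0 for this data
sd0 = var0 ** 0.5

# Add a point exactly 2 s.d. above the mean, so (x - mean)^2 = 4 * variance.
new_point = mean0 + 2 * sd0
var1 = pop_variance(data + [new_point])

print(var0, var1)   # the variance increases
```

(Note the actual increase is slightly complicated by the mean itself shifting when the point is added, but the direction of the change is the same.)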
Original post by mqb2766

thanks for trying to help me out!
Original post by Aleksander Krol

Maybe increase, because the overall effect is for the standard deviation to increase: X deviates above the mean more than Y gets closer to the standard deviation, so the standard deviation gets larger, as the overall effect is for the data to move further from the mean.
Original post by Marissa48

I mean X deviates above the mean more than Y gets closer to the mean (as in, Y gets closer to the mean than one standard deviation). I hope that makes sense.
Original post by Marissa48

thanks a bunch for your reply!! this was my honest thought too, but i still wanted to get it confirmed.
Original post by Aleksander Krol

You're not really saying what you understand, but if you're happy with the formula
n * sigma^2 = Sum (x - mean)^2
where sigma^2 is the current variance, then adding in a new data point such that (x - mean)^2 = sigma^2 will leave the variance unchanged, since n -> n+1 and the sum simply increases by sigma^2. If the new data point is such that (x - mean)^2 > sigma^2, then the new variance will increase. Similarly, if (x - mean)^2 < sigma^2, then the new variance will decrease. In your question, you were told that (x - mean)^2 = 4 sigma^2 for X, while (y - mean)^2 < sigma^2 for Y, so together the two points add more than 2 sigma^2 to the sum while n only increases by 2. The average squared distance therefore goes up, and so does the standard deviation.