
Transforming normal distribution

Just had a question about transformations of normal distributions. I understand that when transforming X ~ N(mean, variance^2) using y = ax + b, you find the new mean by just subbing in for x using whatever a and b are. But I don't understand how to transform the variance. Any help would be much appreciated.
Reply 1
Original post by Hejavi431
Just had a question about transformations of normal distributions. I understand that when transforming X ~ N(mean, variance^2) using y = ax + b, you find the new mean by just subbing in for x using whatever a and b are. But I don't understand how to transform the variance. Any help would be much appreciated.

The variance is not affected by the value of b, as adding a constant just shifts the distribution left or right without changing its spread.

The variance is scaled by a factor of a^2.
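
In symbols (this is the standard linear-transformation result, nothing specific to your question):

```latex
% If X ~ N(mu, sigma^2) and Y = aX + b, then:
X \sim N(\mu,\ \sigma^2), \quad Y = aX + b
\;\Longrightarrow\;
Y \sim N\!\left(a\mu + b,\ a^2\sigma^2\right)
% the mean transforms like x itself; the variance picks up a factor of a^2
```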
Reply 2
Original post by aranon
The variance is not affected by the value of b, as adding a constant just shifts the distribution left or right without changing its spread.
The variance is scaled by a factor of a^2.

Why a^2 and not just a?
Reply 3
Original post by Hejavi431
Why a^2 and not just a?

If you think about the formula for variance, it's essentially an average of squared deviations from the mean. So if you scale x by a, every deviation from the mean is scaled by a, and every squared deviation by a^2, which means the variance is scaled by a^2. The standard deviation, the square root of the variance, is scaled by |a|, which is why that statistic is often more useful for understanding the data: it stays in the same units as the data itself.
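
To spell that out with the definition Var(X) = E[(X − μ)²] (a standard derivation, written here in LaTeX):

```latex
% Let E[X] = \mu, so E[Y] = E[aX + b] = a\mu + b. Then:
\begin{align*}
\operatorname{Var}(Y) &= E\big[(Y - E[Y])^2\big] \\
                      &= E\big[(aX + b - a\mu - b)^2\big] \\
                      &= a^2\, E\big[(X - \mu)^2\big] \\
                      &= a^2 \operatorname{Var}(X)
\end{align*}
```

The b cancels in the second line, which is exactly why shifting the distribution leaves the variance alone.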

Btw, in the OP the notation should be
N(mean, variance)
or
N(mean, std dev^2)
not N(mean, variance^2).
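
If you want to sanity-check the a^2 rule numerically, here's a quick simulation sketch (the particular numbers, and the use of numpy, are just my own illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

mu, sigma = 5.0, 2.0   # X ~ N(5, 4): mean 5, std dev 2, variance 4
a, b = 3.0, 2.0        # linear transform Y = aX + b

x = rng.normal(mu, sigma, size=1_000_000)  # draw samples of X
y = a * x + b                              # apply the transform

# Theory: E[Y] = a*mu + b = 17 and Var(Y) = a^2 * sigma^2 = 36
print(y.mean())  # ~17
print(y.var())   # ~36
```

The sample mean and variance should come out close to 17 and 36, matching Y ~ N(a·mu + b, a^2·sigma^2).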
