# OCR S2 Standardising normal distributions

#1
I understand that Z is a random variable with a normal distribution with mean 0 and variance 1. Apparently you can convert any normally distributed variable into Z by subtracting the mean and then dividing by the standard deviation.

I understand that subtracting the mean shifts the bell curve so its mean is 0, but I don't quite understand why you then divide by the SD. I know how to do the questions; I just don't understand why the method works.

So yeah if anyone could explain it that would be great =)
#2
buuuump
7 years ago
#3
(Original post by jamie092)
buuuump
You're basically doing a graph transformation, starting from a normal distribution: subtracting the mean shifts the curve left or right so it's centred at 0, and dividing by the s.d. rescales the x-axis so the spread becomes 1. After both steps you end up with the standard normal distribution.
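A quick check using the standard properties of expectation and variance shows why dividing by the s.d. is the right scaling; if X has mean μ and standard deviation σ, then:

```latex
\[
\mathbb{E}\!\left[\frac{X-\mu}{\sigma}\right] = \frac{\mathbb{E}[X]-\mu}{\sigma} = 0,
\qquad
\operatorname{Var}\!\left(\frac{X-\mu}{\sigma}\right) = \frac{\operatorname{Var}(X)}{\sigma^{2}} = \frac{\sigma^{2}}{\sigma^{2}} = 1.
\]
```

So the transformed variable has exactly the mean 0 and variance 1 that the tables assume.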

If you happened to have cumulative distribution tables for your particular normal distribution, you wouldn't need the transformation. BUT since the normal density can't be integrated in closed form, you'd have to use numerical methods to work out the values each time; a bit of a bore. Hence you standardise and use the one set of tables provided.
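Not from the original post, but here's a minimal sketch of the same idea in Python. The standard normal CDF Φ is written in terms of the error function `math.erf`, and the distribution N(5, 2²) with query point 7 is a made-up example:

```python
import math

def phi(z):
    # Standard normal CDF, Phi(z), via the error function:
    # Phi(z) = (1/2) * (1 + erf(z / sqrt(2)))
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Example (hypothetical): X ~ N(mu = 5, sigma = 2). What is P(X <= 7)?
mu, sigma = 5.0, 2.0
x = 7.0

# Standardise: subtract the mean, divide by the standard deviation.
z = (x - mu) / sigma
print(z)                  # 1.0
print(round(phi(z), 4))   # 0.8413, the same value the tables give for Phi(1)
```

This is exactly the table-lookup procedure: one function (or one printed table) for Φ covers every normal distribution once you've standardised.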
