
Distributions of Normal Random Variables

Hi guys,

I'm unsure about how to go about these questions, I've got that the first is chi-squared with 5 degrees of freedom but unsure how to do the rest; would anyone be able to give me tips? Thanks!

EDIT: I've got now that the U is distributed by (sigma^2)*chi-squared with 4 degrees of freedom, unsure if it's correct however.
Original post by r3l3ntl3ss

You're fine on the first part of the first question. For the second part, somewhere in your notes you should have a proof that, for normal variates, the sample mean and sample variance are independent. I hope your notes go on to give the distributions of the sample mean and the sample variance; the second part of the question follows from those. The third part will then follow on nicely if you remember how Y_6^2 is distributed.
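For reference, the result I have in mind, stated in general and assuming Y_1, ..., Y_n are independent N(μ, σ²) (I can't see the attached question, so check that this matches its setup), is

\[
\bar{Y} \sim N\!\left(\mu, \frac{\sigma^{2}}{n}\right), \qquad \frac{1}{\sigma^{2}} \sum_{i=1}^{n} \left( Y_i - \bar{Y} \right)^{2} \sim \chi^{2}(n-1),
\]

with \( \bar{Y} \) independent of \( \sum_{i=1}^{n} (Y_i - \bar{Y})^{2} \).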

The second question should, again I hope, follow from stuff you have in your notes. What are the definitions of the t distribution and of the F distribution, for example?
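For reference, the usual definitions (assuming the variables being combined are independent; your notes may use slightly different notation) are

\[
T = \frac{Z}{\sqrt{W/k}} \sim t(k), \qquad Z \sim N(0,1), \; W \sim \chi^{2}(k),
\]

\[
F = \frac{W_1/m}{W_2/n} \sim F(m, n), \qquad W_1 \sim \chi^{2}(m), \; W_2 \sim \chi^{2}(n).
\]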
Reply 2
Original post by Gregorius

Strangely we haven't done the F-distribution yet :s-smilie:

Part (b) of the first question I just rearranged to get (sum of Y_i^2) - 5Ȳ^2, which by definition is chi-squared with 5 degrees of freedom minus chi-squared with 1 degree of freedom, i.e. chi-squared with 4 degrees of freedom by the additive property.
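Here, by the additive property I mean the standard result for independent chi-squared variables:

\[
W_1 \sim \chi^{2}(m), \quad W_2 \sim \chi^{2}(n), \quad W_1, W_2 \text{ independent} \;\Longrightarrow\; W_1 + W_2 \sim \chi^{2}(m+n).
\]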

Part (c) I just got chi-squared with 4 degrees of freedom plus chi-squared with 1 degree of freedom, which gives chi-squared with 5 degrees of freedom. Is that correct?
Original post by r3l3ntl3ss

Part (c) looks OK, but you're going to have to be more careful for part (b). For example, if Z_1 and Z_2 are both χ²(1), do you think that Z_1 - Z_2 is χ²(2)?

You need to write U as a sum of squares of four independent variables - hence my reference to the proof of the theorem that says that the sample mean and sample variance are independent; the manipulation you need usually appears in that proof. See section 4.10 of Grimmett and Stirzaker, for example.
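To spell out one standard version of that manipulation (a sketch only, assuming the Y_i are i.i.d. N(μ, σ²); your notes or Grimmett and Stirzaker may set it up slightly differently): apply an orthogonal transformation whose last row is (1/√n, ..., 1/√n) to get new variables X_1, ..., X_n that are again independent normals with variance σ², with X_n = √n Ȳ and with X_1, ..., X_{n-1} having mean zero. Since orthogonal maps preserve sums of squares,

\[
U = \sum_{i=1}^{n} \left( Y_i - \bar{Y} \right)^{2} = \sum_{i=1}^{n} Y_i^{2} - n \bar{Y}^{2} = \sum_{j=1}^{n} X_j^{2} - X_n^{2} = \sum_{j=1}^{n-1} X_j^{2},
\]

which, with n = 5, is U written as a sum of squares of four independent N(0, σ²) variables.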
Reply 4
Original post by Gregorius

Ah I see what you mean, I think I've finally got it now! Thanks for all your help :smile:
Reply 5
Original post by Gregorius

I've got another question if you don't mind me asking.

The question is the one in the first attachment. I used Chebyshev's inequality and got g1 = -sqrt(10)/3 sigma, g2 = sqrt(10)/3 sigma. I'm unsure if that's correct - if it isn't, can you point me in the right direction? Thanks :smile:
Original post by r3l3ntl3ss

Can you show your working? I'm not sure where those 3's have come from!

But another problem is that Chebyshev's inequality applies when you know the population standard deviation. There are versions that apply when you have to estimate it from the sample - but have you come across those in your course?
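For reference, the version of Chebyshev I have in mind for a sample mean is the following (a sketch, assuming X_1, ..., X_n are i.i.d. with mean μ and known variance σ²; I can't see the attachment, so the target probability and the sample size are whatever the question specifies):

\[
P\left( \left| \bar{X} - \mu \right| \ge g \right) \le \frac{\operatorname{Var}(\bar{X})}{g^{2}} = \frac{\sigma^{2}}{n g^{2}},
\]

so making the right-hand side at most α requires g ≥ σ / √(nα). Note that σ itself appears in the bound, which is why knowing the population standard deviation matters.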
Reply 7
Original post by Gregorius

No, I haven't come across that yet; I've attached my working for the question.
Original post by r3l3ntl3ss

Yes, that looks OK to me (provided you're allowed to use that form of Chebyshev).
