The Student Room Group

Don't understand this likelihood ratio question (Statistics)

Hey guys. I have a question about a Beta(a,a) distribution. It's to do with the abilities of students and estimating the true value. We have observed the abilities of m students; theta(w) = (theta1(w), theta2(w), ..., thetam(w)) are our observations.

The question asks me to find the likelihood function and then asks a question about whether a given statistic is sufficient or not. However I am stuck on the third part which says:

c) Construct the likelihood ratio W for Θ0 = {1}, Θ1 = { 0.5 , 2}.

I have no idea what to do here, any ideas?
Original post by pineapplechemist


So presumably you are able to write down the likelihood function as a function of θ? Now simply write down the definition of the likelihood ratio (that is, the formula that involves taking suprema on top and bottom over the relevant parameter values). Take the required suprema and you are home.
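To make the target concrete: writing α for the Beta(α,α) parameter (so it doesn't clash with the observations θ_i), the ratio the question asks for is, in one common convention,

```latex
W = \frac{\sup_{\alpha \in \Theta_1} L(\alpha)}{\sup_{\alpha \in \Theta_0} L(\alpha)}
  = \frac{\max\{L(1/2),\, L(2)\}}{L(1)}
```

(Some texts define W the other way up, or with a supremum over Θ0 ∪ Θ1 in the numerator, so check which convention your course uses.)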
Original post by Gregorius


My likelihood is (a-1)(sum from i=1 to M)log(theta_i) + (a-1)(sum from i=1 to M)log(1 - theta_i) - M log B(a,a), where B is the beta function: is this correct?

I know the formula, but I'm confused specifically by the parameter spaces given. What does it mean to have a parameter space of 0.5 and 2? Am I taking the supremum of all possible values given by having a Beta(0.5,0.5) distribution or a Beta(2,2) distribution?
Original post by pineapplechemist


You've got the log likelihood there - if you continue with that, you'll need to subtract rather than divide.

The supremum is a supremum over θ in the parameter space. One of the parameter spaces has a single element, so that is easy: just set θ = 1. For the other one, you have two possibilities for θ. Which one maximizes the likelihood?
Original post by Gregorius


For the null parameter space do I literally sub 1 into my p.d.f.? This gives me a value of 1. What about for the other parameter space with 1/2 and 2? Do I need to find the MLE, or do I simply sub both into my p.d.f. and choose the bigger one? Sorry, I really don't understand this very well.
Original post by pineapplechemist


You are going along the right lines - it is a matter of substituting the parameter values into the likelihood function. The likelihood is

\displaystyle L(\alpha) = \left[\frac{\Gamma(2\alpha)}{\Gamma(\alpha)^2}\right]^m \prod_{i=1}^{m} \theta_{i}^{\alpha-1} (1-\theta_{i})^{\alpha-1}



If you plug in α = 1 you will get a value of one for the likelihood. What do you get if you set α = 2 then α = 0.5 in turn?
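If it helps to see these numbers, here is a quick numerical sketch (Python, using only the standard library's math.gamma; the sample values below are made up for illustration):

```python
import math

def beta_likelihood(alpha, thetas):
    # L(alpha) = [Gamma(2*alpha) / Gamma(alpha)^2]^m * prod_i theta_i^(alpha-1) * (1 - theta_i)^(alpha-1)
    m = len(thetas)
    const = (math.gamma(2 * alpha) / math.gamma(alpha) ** 2) ** m
    kernel = math.prod(t ** (alpha - 1) * (1 - t) ** (alpha - 1) for t in thetas)
    return const * kernel

# Made-up sample of m = 4 observed abilities in (0, 1)
sample = [0.4, 0.55, 0.6, 0.45]

print(beta_likelihood(1.0, sample))  # exactly 1: Beta(1,1) is the uniform density
print(beta_likelihood(2.0, sample))
print(beta_likelihood(0.5, sample))
```

For a sample clustered near 1/2 like this one, the α = 2 likelihood comes out larger than the α = 0.5 one.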
Original post by Gregorius


But the value of the likelihood function for alpha = 2 and alpha = 1/2 depends on the values of theta_i?
For example, when the theta_i are all close to 0 (or 1), the supremum of the likelihood is attained at alpha = 1/2; when the theta_i are near 1/2, we should take alpha = 2.
So how do we take the supremum?
Original post by Namch
I get this


That looks the right sort of thing. If you set

\displaystyle \kappa = \prod_{i=1}^{m} \theta_i (1 - \theta_i)

then

\displaystyle L(2) = 6^m \kappa

and

\displaystyle L(1/2) = \frac{1}{\pi^m} \frac{1}{\sqrt{\kappa}}

The decision as to which is bigger then simply depends upon whether \kappa^{3/2} is bigger or smaller than \frac{1}{\pi^m 6^m}
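That threshold is easy to sanity-check numerically. A small sketch (Python; the sample below is made up):

```python
import math

def kappa(thetas):
    # kappa = prod_i theta_i * (1 - theta_i)
    return math.prod(t * (1 - t) for t in thetas)

def L(alpha, thetas):
    # Beta(alpha, alpha) likelihood of the sample
    m = len(thetas)
    return (math.gamma(2 * alpha) / math.gamma(alpha) ** 2) ** m * \
        math.prod(t ** (alpha - 1) * (1 - t) ** (alpha - 1) for t in thetas)

sample = [0.5, 0.48, 0.52]  # made-up observations near 1/2
m = len(sample)
k = kappa(sample)

# L(2) > L(1/2) exactly when kappa^(3/2) > 1 / (6*pi)^m
assert (k ** 1.5 > 1 / (6 * math.pi) ** m) == (L(2, sample) > L(0.5, sample))
```

The same equivalence holds for samples near the endpoints (where α = 1/2 wins), which is exactly Namch's observation above.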
Woah, when 1/2 is the argument of the gamma function we get sqrt(pi). I didn't know :smile:
Original post by Namch


Yes, \Gamma(1/2) = \sqrt{\pi}
