Estimators of population parameters

Chittesh14
#1
P.S. If anyone helping wants to rename a variable and refer to it by another name to make things easier and avoid the LaTeX, feel free to do so. It gave me a really tough time typing this up, so I'd understand if others have the same problem.

1. "Let {X_1, . . . , X_n} be a random sample from a population with mean \mu = E(X_i). Find an estimator of \mu."

For this question, I was wondering: why did they specifically say \mu = E(X_i), and also not include what the variance is?
Is this because, usually, if {X_1, ..., X_n} is a random sample with mean \mu and variance \sigma^2, we assume each observation X_i is independent and identically distributed (as far as I have currently learnt)? So, in this case, does the questioner not want us to assume that they are independent and identically distributed? The question goes on to talk about the sample mean \bar{X} being a natural estimator for \mu.

I know this question doesn't make a lot of sense without more information, but I was just wondering if anyone had any idea from previous experience.

2. "The variance of an estimator, denoted Var(\hat{\theta}), is obtained directly from the estimator's sampling distribution.
For the sample mean, \bar{X}, we have Var(\bar{X}) = \frac{\sigma^2}{n}."

Is it correct that this holds only for a population that is normally distributed, or approximately (for nearly every population with a finite variance) by the Central Limit Theorem? I was just wondering, since it wasn't specified and was just assumed to be true.

3. Suppose that \hat{\theta} is an estimator of the parameter, \theta. Then, is \hat{\theta} - \theta the error of the estimator, and |\hat{\theta} - \theta| the absolute error of the estimator?
If so, is the bias of an estimator: E(\hat{\theta}) - \theta = E(\hat{\theta}) - E(\theta) (since \theta is a constant) = E(\hat{\theta} - \theta), i.e. the expected value of the error of the estimator?

4. Also, is the mean absolute deviation (MAD) = E(|\hat{\theta} - \theta|) the same as the mean absolute error? This links to Q3, if the absolute error is what I think it may be.

5. If E(\hat{\theta}^2) < \infty, it holds that: MSE(\hat{\theta}) = Var(\hat{\theta}) + [Bias(\hat{\theta})]^2, where Bias(\hat{\theta}) = E(\hat{\theta}) - \theta.
Why is it a requirement that E(\hat{\theta}^2) < \infty?
Is it because Var(\hat{\theta}) = E(\hat{\theta}^2) - (E(\hat{\theta}))^2, and hence, if E(\hat{\theta}^2) is finite, then the variance and therefore the MSE are finite, which I suppose might be a requirement?
Chittesh14
#2
6. "Intuitively, MAD is a more appropriate measure for the error in estimation. However, it is technically less convenient since the function h(x) = |x| is not differentiable at x = 0."
How does this link? Is it that, if we try to differentiate in order to find the value that minimises the MAD, we cannot do so because of the non-differentiable point at x = 0?
Gregorius
#3
(Original post by Chittesh14)

1. "Let {X_1, . . . , X_n} be a random sample from a population with mean \mu = E(X_i). Find an estimator of \mu."

For this question, I was wondering: why did they specifically say \mu = E(X_i), and also not include what the variance is?
Why should they? You're interested in estimating the population mean - you can do this without any mention of any other population parameter.

Is this because, usually, if {X_1, ..., X_n} is a random sample with mean \mu and variance \sigma^2, we assume each observation X_i is independent and identically distributed (as far as I have currently learnt)? So, in this case, does the questioner not want us to assume that they are independent and identically distributed? The question goes on to talk about the sample mean \bar{X} being a natural estimator for \mu.
No. Provided the population is of infinite size, the very notion of "random sample" implies independent and identically distributed observations. I think you're simply over-reading things here - the question doesn't mention something because it's irrelevant.
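To make this concrete, here's a minimal simulation sketch (my own illustration, not from any course material; it assumes Python with NumPy, and the Gamma population with mean 6 is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(42)

# An arbitrary population: Gamma(shape=3, scale=2) has mean mu = 3 * 2 = 6.
# (Hypothetical choice - any distribution with a finite mean works.)
mu = 6.0
sample = rng.gamma(shape=3.0, scale=2.0, size=1000)  # random sample X_1, ..., X_n

# The natural estimator of mu is the sample mean X-bar.
x_bar = sample.mean()
print(f"estimate = {x_bar:.3f}, true mu = {mu}")  # estimate should be close to 6
```

Note that nothing about the variance (or any other parameter) is needed to compute the estimate.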

2. "The variance of an estimator, denoted Var(\hat{\theta}), is obtained directly from the estimator's sampling distribution.
For the sample mean, \bar{X}, we have Var(\bar{X}) = \frac{\sigma^2}{n}."

Is it correct that this holds only for a population that is normally distributed, or approximately (for nearly every population with a finite variance) by the Central Limit Theorem? I was just wondering, since it wasn't specified and was just assumed to be true.
No. This is true in general: for independent, identically distributed observations with finite variance, Var(\bar{X}) = \sigma^2 / n whatever the population's distribution. Normality (or the CLT) only matters for the shape of the sampling distribution, not for its variance.
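If it helps, here's a quick empirical check (again a sketch assuming NumPy; the exponential population is picked deliberately because it is not normal):

```python
import numpy as np

rng = np.random.default_rng(0)

# Exponential population with mean 2, so variance sigma^2 = 4 - clearly not normal.
sigma2, n, reps = 4.0, 25, 200_000

# Draw many independent samples of size n and compute each sample mean.
means = rng.exponential(scale=2.0, size=(reps, n)).mean(axis=1)

print(f"empirical Var(X-bar) = {means.var():.4f}")  # ~ 0.16
print(f"sigma^2 / n          = {sigma2 / n:.4f}")   # = 0.16
```

Only independence and the finite variance are used; normality never enters.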

3. Suppose that \hat{\theta} is an estimator of the parameter, \theta. Then, is \hat{\theta} - \theta the error of the estimator, and |\hat{\theta} - \theta| the absolute error of the estimator?
If so, is the bias of an estimator: E(\hat{\theta}) - \theta = E(\hat{\theta}) - E(\theta) (since \theta is a constant) = E(\hat{\theta} - \theta), i.e. the expected value of the error of the estimator?
Useful material here.
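As a concrete example of bias (a sketch of my own, assuming NumPy; the classic case is the sample variance computed with divisor n rather than n - 1):

```python
import numpy as np

rng = np.random.default_rng(7)

# Normal population with known variance theta = sigma^2 = 9 (hypothetical choice).
theta, n, reps = 9.0, 10, 200_000
samples = rng.normal(loc=0.0, scale=3.0, size=(reps, n))

# Estimator with divisor n (ddof=0): biased, with E(theta-hat) - theta = -theta/n.
theta_hat = samples.var(axis=1, ddof=0)

print(f"empirical bias   = {theta_hat.mean() - theta:.3f}")  # ~ -0.9
print(f"theory: -theta/n = {-theta / n:.3f}")                # = -0.9
```

The empirical bias matches E(\hat{\theta}) - \theta, exactly as the identity in Q3 predicts.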

5. If E(\hat{\theta}^2) < \infty, it holds that: MSE(\hat{\theta}) = Var(\hat{\theta}) + [Bias(\hat{\theta})]^2, where Bias(\hat{\theta}) = E(\hat{\theta}) - \theta.
Why is it a requirement that E(\hat{\theta}^2) < \infty?
Is it because Var(\hat{\theta}) = E(\hat{\theta}^2) - (E(\hat{\theta}))^2, and hence, if E(\hat{\theta}^2) is finite, then the variance and therefore the MSE are finite, which I suppose might be a requirement?
Well, how are you going to deal with infinite quantities?
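For reference, here's the standard derivation written out - it just adds and subtracts E(\hat{\theta}) inside the square, and the condition E(\hat{\theta}^2) < \infty guarantees that every expectation below is finite, so the rearrangement is legitimate:

```latex
\begin{align*}
\operatorname{MSE}(\hat{\theta})
  &= E\!\left[(\hat{\theta} - \theta)^2\right] \\
  &= E\!\left[\big((\hat{\theta} - E(\hat{\theta})) + (E(\hat{\theta}) - \theta)\big)^2\right] \\
  &= E\!\left[(\hat{\theta} - E(\hat{\theta}))^2\right]
     + 2\big(E(\hat{\theta}) - \theta\big)\,E\!\left[\hat{\theta} - E(\hat{\theta})\right]
     + \big(E(\hat{\theta}) - \theta\big)^2 \\
  % the cross term vanishes because E[theta-hat - E(theta-hat)] = 0
  &= \operatorname{Var}(\hat{\theta}) + \big[\operatorname{Bias}(\hat{\theta})\big]^2.
\end{align*}
```

The cross term vanishes because E[\hat{\theta} - E(\hat{\theta})] = 0.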
Gregorius
#4
(Original post by Chittesh14)
6. "Intuitively, MAD is a more appropriate measure for the error in estimation. However, it is technically less convenient since the function h(x) = |x| is not differentiable at x = 0."
How does this link? Is it that, if we try to differentiate in order to find the value that minimises the MAD, we cannot do so because of the non-differentiable point at x = 0?
Yes.
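To see why (a sketch of my own, assuming NumPy): squared error is smooth, so setting its derivative to zero hands you the minimiser (the mean), whereas absolute error has a kink at every data point, so its minimiser (the median) cannot be found by setting a derivative to zero:

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.exponential(scale=2.0, size=201)  # arbitrary skewed data

# Evaluate both losses on a grid of candidate values c.
grid = np.linspace(x.min(), x.max(), 4001)
sq_loss = ((x[None, :] - grid[:, None]) ** 2).mean(axis=1)   # differentiable in c
abs_loss = np.abs(x[None, :] - grid[:, None]).mean(axis=1)   # kinked at each x_i

print(grid[sq_loss.argmin()], x.mean())       # minimiser of squared loss ~ mean
print(grid[abs_loss.argmin()], np.median(x))  # minimiser of absolute loss ~ median
```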
Chittesh14
#5
(Original post by Gregorius)
Why should they? You're interested in estimating the population mean - you can do this without any mention of any other population parameter.
OK, got it, thank you. Since estimating the mean doesn't require the variance or any other population parameter, they simply aren't mentioned. It has nothing to do with independence or identical distribution.

No. Provided the population is of infinite size, the very notion of "random sample" implies independent and identically distributed. I think you're simply over-reading things here - the question doesn't mention something here because it's irrelevant.
That helps a lot, thank you - I was genuinely confused about whether every random sample implies IID variables.

No. This is true in general.
OK, got it - I made a mistake. If the variables are independent and identically distributed, then those are the values for the mean and the variance of \bar{X}. Independence is needed specifically for the variance, so that the covariance between any two variables drops out.
But, in addition to that: if the population is normal, the sample mean is exactly normally distributed with mean \mu and variance Var(\bar{X}) = \sigma^2 / n; if the population is not normal, the sample mean is approximately normally distributed with mean \mu and variance \sigma^2 / n, provided the sample size n is large.
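As a quick numerical check of the approximate normality (a sketch, assuming NumPy; the exponential population is a deliberately non-normal example):

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n, reps = 2.0, 2.0, 100, 100_000  # Exp(mean 2) has sd 2 as well

# Sampling distribution of X-bar from a skewed (non-normal) population.
means = rng.exponential(scale=mu, size=(reps, n)).mean(axis=1)

# If X-bar is approximately N(mu, sigma^2/n), about 95% of the sample means
# should land within mu +/- 1.96 * sigma / sqrt(n).
inside = np.abs(means - mu) < 1.96 * sigma / np.sqrt(n)
print(f"within 1.96 SE: {inside.mean():.3f} (normal theory predicts 0.950)")
```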

Useful material here.
Thank you, that answers most of my questions.

Well, how are you going to deal with infinite quantities?
Thank you, understood.