Taking the average mean & variance of two sample distributions

Bugzy
#1
Say one distribution has n = 50, mean = 2.5, standard dev = 1.5,
and a second distribution has n = 25, mean = 3.0, standard dev = 1.8.

I need to find the mean of the two distributions combined. So the mean of the means is (2.5 + 3)/2 = 2.75.

But how would I do this for the standard dev?

(1.5^2/\sqrt{50} + 1.8^2/\sqrt{25})/2 to find the average variance?
the bear
#2
I don't think they want you to take the average of the two means (that only works if the two groups have equal numbers of items).

Instead, you find the two group totals, 2.5 × 50 and 3.0 × 25, add them together to get the grand total, then divide by the total number of items in the two groups.
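
Working through the numbers in the question, that gives

\bar{x} = \frac{2.5 \times 50 + 3.0 \times 25}{50 + 25} = \frac{200}{75} \approx 2.67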

For the standard deviations you have to figure out the two values of \Sigma x^2 from the standard deviation formula: rearranging s^2 = \Sigma x^2 / n - \bar{x}^2 gives \Sigma x^2 = n(s^2 + \bar{x}^2). Then add the two values together and put the result back into the formula, together with the combined values of \Sigma x and n.
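
A minimal Python sketch of this pooling calculation, assuming the divide-by-n (population) form of the standard deviation formula, s = \sqrt{\Sigma x^2 / n - \bar{x}^2} (swap in n - 1 if your course uses the sample form); the combine helper is just an illustrative name:

from math import sqrt

def combine(n1, mean1, sd1, n2, mean2, sd2):
    # Illustrative helper, not from the thread.
    # Recover each group's totals \Sigma x and \Sigma x^2 from its summary:
    # rearranging sd^2 = \Sigma x^2 / n - mean^2 gives \Sigma x^2 = n(sd^2 + mean^2).
    sum1, sum2 = n1 * mean1, n2 * mean2
    sumsq1 = n1 * (sd1 ** 2 + mean1 ** 2)
    sumsq2 = n2 * (sd2 ** 2 + mean2 ** 2)

    # Pool the totals and put them back into the same formulas.
    n = n1 + n2
    mean = (sum1 + sum2) / n
    sd = sqrt((sumsq1 + sumsq2) / n - mean ** 2)
    return mean, sd

mean, sd = combine(50, 2.5, 1.5, 25, 3.0, 1.8)
print(mean, sd)  # about 2.667 and 1.62

Note the pooled mean (about 2.67) is not the 2.75 you get from averaging the two means, because the groups are different sizes.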

the bear
