Combining Random Variables

https://isaacphysics.org/questions/combining_random_variables?board=652c2446-1ff1-41c7-894a-abe140f545bd&stage=further_a

for part a, I tried to implement the hint given but I don't really know how to proceed from here - any help greatly appreciated.
Working:
It's a linear transformation, so
S = 3/4 R + 1/2
or
R = ...
Take expectations (mean values):
E(R) = ....E(S)....
What will happen if you add/subtract ...., and what will happen if you multiply by ...?

You can derive it from the basic definition, as per hint 2 and your working. You want E(R), so write that down and express it in terms of the s_n, then split the sum into the s_n part and the constant part and pull the multiplier outside the s_n part. Then reason about the probabilities.

For the last part, you're flipping the random variable around, and it would be clearer to write
p(R = r_n) and p(S = s_n), say.
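That split-the-sum derivation is easy to sanity-check numerically. Here is a minimal sketch, assuming a made-up discrete distribution for S (the values and probabilities below are hypothetical, not the ones from the question) and the rearrangement R = 4S/3 - 2/3:

```python
# Numerical check of E(aS + b) = a*E(S) + b for a discrete distribution.
# The values/probabilities are hypothetical, chosen only for illustration.
values = [-4.0, -2.0, 1.0]
probs = [0.25, 0.5, 0.25]

a, b = 4 / 3, -2 / 3  # R = (4/3)S - 2/3, from rearranging S = (3/4)R + 1/2

E_S = sum(p * s for p, s in zip(probs, values))

# E(R) computed directly from the transformed outcomes r_n = a*s_n + b ...
E_R_direct = sum(p * (a * s + b) for p, s in zip(probs, values))
# ... and via the linear rule
E_R_linear = a * E_S + b

print(E_R_direct, E_R_linear)
```

The direct sum and the linear rule agree, which is exactly what pulling the multiplier and the constant out of the sum shows in general.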
Oh I see, so it's just a linear relationship between the two.

so for working out E(R^2) I had a couple of ideas - squaring it wouldn't work, since Var(R) would then = 0; I also thought about substituting (E(S))^2 = (-2)^2 = 4 for E(S^2), but that didn't work either.

I am also not sure why variance does not follow a linear relationship either - I initially just thought it would be 4/3 * Var(S) (since adding or subtracting doesn't change it), but this wasn't right either.
Original post by mosaurlodon

For the expectation E(R) it is just a linear relationship. It makes sense: if you double the value of the random variable, you'll double the mean, and if you add 1 to the random variable, you'll add 1 to the mean. But it would be good to be able to prove it properly.

For var(R), though: if you add one to the random variable, how is the (squared) width affected (you're correct in the previous post that it's unchanged)? If you double the random variable, how is the (squared) width affected? Multiplying by the gain can't be correct, as var(R) is positive but the gain could be negative. Again, having the insight is important, as is being able to prove it properly.
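To make the width intuition concrete, here is a small sketch with a hypothetical distribution (not the one from the question), checking that a constant shift leaves the variance alone while a gain a - even a negative one - scales it by a^2:

```python
# Hypothetical discrete distribution, just for illustration.
values = [-4.0, -2.0, 1.0]
probs = [0.25, 0.5, 0.25]

def mean(vals):
    return sum(p * v for p, v in zip(probs, vals))

def var(vals):
    # var(X) = E((X - mu)^2)
    m = mean(vals)
    return sum(p * (v - m) ** 2 for p, v in zip(probs, vals))

a, b = -2.0, 5.0  # negative gain on purpose: variance still scales by a^2 = 4
shifted = [v + b for v in values]  # add a constant
scaled = [a * v for v in values]   # multiply by the gain

print(var(values), var(shifted), var(scaled))  # 3.1875 3.1875 12.75
```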
Ok I think I managed to somewhat prove the result?

For E(R^2) I'm assuming adding/subtracting does nothing, but multiplying by a value k makes the width increase by a factor of k^2 (as a side note, is this why the notation is written as sigma^2?)

edit: I just got it! I did Var(S)*(4/3)^2 = 16, which was correct, but I'm confused as to why the hint said to use Var(X) = E(X^2) − (E(X))^2 - isn't this more complicated/impossible to work out?
Original post by mosaurlodon

I'm presuming you're asking about b)? If so, the hint suggests using
var(R) = E(R^2) - (E(R))^2
var(S) = E(S^2) - (E(S))^2
It's a well-known result, and easy to derive from the usual variance definition
var(R) = E((R - mu)^2)
for instance.

The hint suggests working out E(S^2) from the given info, as you have the mean (squared) and the variance. Then use that to get var(R) via
var(R) = E(R^2) - (E(R))^2
You know (E(R))^2 from part a), and by squaring the linear relationship between R and S you can get E(R^2), and hence var(R).

However, the quicker way if you know the result is just to use
var(R) = gain^2 var(S)

As for part a), it's good to try and prove these results (google the proofs).
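Both routes can be sanity-checked numerically. A quick sketch, assuming the values used in this thread, E(S) = -2 and Var(S) = 9, with R = 4S/3 - 2/3:

```python
# Two routes to var(R), assuming E(S) = -2 and Var(S) = 9 as in this thread.
a, b = 4 / 3, -2 / 3       # R = (4/3)S - 2/3

E_S, var_S = -2.0, 9.0
E_S2 = var_S + E_S ** 2    # E(S^2) = Var(S) + (E(S))^2 = 13

# Route 1: expand the square, then use var(R) = E(R^2) - (E(R))^2
E_R = a * E_S + b                        # part a): E(R) = -10/3
E_R2 = a**2 * E_S2 + 2*a*b*E_S + b**2    # = 244/9
var_R_long = E_R2 - E_R ** 2

# Route 2: the shortcut var(R) = gain^2 var(S)
var_R_short = a**2 * var_S

print(var_R_long, var_R_short)
```

Both come out at 16 (up to floating-point rounding), matching the answer above.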
Thank you!
I managed to do the rest of the question, as it's relatively straightforward once you have a foothold - I will make sure to watch the proofs as well.

As a final question, about the second method for part b: I got E(S^2) as 13, so I did 4^2/3^2 * 13 and then added (-2)^2 = +4, which got me 244/9 and gave me the correct variance - but I still don't really get how you 'square' a linear relationship.

if you rewrite S = 3/4 R + 1/2
as R = 4S/3 - 2/3,
then how would you go about 'squaring' this relationship, and what is the specific order you do it in?

e.g. why is it (4^2 * E(S^2))/3^2 + 4
and not (4^2 * E(S^2) + 4)/3^2?
@mosaurlodon The variance is a measure of how "spread out" the distribution is. Adding a constant doesn't change how spread out it is.

That's the "intuitive" reason why you ignore the constant.

You can also prove it mathematically; either by the linearity of the expectation operator (i.e. that E[aX+b] = aE[X]+b) or go all the way back to first principles using the summation operator.
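For completeness, the first-principles proof of the scaling rule, writing mu = E[X], is only a couple of lines (taking a = 1 recovers the constant-shift case):

```latex
\operatorname{Var}(aX+b)
  = \mathbb{E}\big[(aX+b-\mathbb{E}[aX+b])^2\big]
  = \mathbb{E}\big[(aX - a\mu)^2\big]
  = a^2\,\mathbb{E}\big[(X-\mu)^2\big]
  = a^2\operatorname{Var}(X).
```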
Original post by mosaurlodon

E(R^2) = E((aS+b)^2) = E(a^2 S^2 + 2abS + b^2) = a^2 E(S^2) + 2ab E(S) + b^2 ...
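The expansion can be checked exactly with the numbers from this thread (E(S) = -2, E(S^2) = 13, a = 4/3, b = -2/3). Note that the middle and last terms together, 2ab E(S) + b^2 = 32/9 + 4/9, are exactly the '+4' that appeared in the earlier calculation:

```python
# Exact check of E(R^2) = a^2 E(S^2) + 2ab E(S) + b^2 with this thread's numbers.
from fractions import Fraction

a, b = Fraction(4, 3), Fraction(-2, 3)
E_S, E_S2 = Fraction(-2), Fraction(13)

E_R2 = a**2 * E_S2 + 2*a*b*E_S + b**2   # 208/9 + 32/9 + 4/9
cross = 2*a*b*E_S + b**2                # the "+4" term

print(E_R2, cross)  # 244/9 4
```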
oh yeah that makes sooooo much more sense
Again, thank you very much