Marcos:
Can we express E(AB) in terms of E(A) and E(B) when A and B aren't independent? So far I've managed to show (although I don't know if they're correct):

E(A + B) = E(A) + E(B), for any A and B
E(AB) = E(A)E(B), for independent A and B

Basically I'm trying to express Var(A + B) in terms of Var(A), Var(B), E(A) and/or E(B) only. I've gotten as far as showing that:

Var(A + B) = Var(A) + Var(B) + 2[E(AB) - E(A)E(B)]

[This expression reduces to Var(A + B) = Var(A) + Var(B) for independent A and B.]

Thanks in advance,
Marcos

P.S. I'm talking about discrete random variables, although I suspect these formulas should be applicable to continuous ones as well.
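Marcos's identity is easy to sanity-check numerically. Here is a minimal sketch using NumPy; the particular dependent pair (A, B) below is an arbitrary illustrative choice, not anything from the thread:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000

    # Two dependent discrete random variables: A is a fair die roll,
    # and B is built from A, so they are certainly not independent.
    A = rng.integers(1, 7, size=n)
    B = A % 3 + rng.integers(0, 2, size=n)

    lhs = np.var(A + B)
    rhs = np.var(A) + np.var(B) + 2 * (np.mean(A * B) - np.mean(A) * np.mean(B))
    print(lhs, rhs)  # equal up to floating-point rounding: the identity
                     # holds exactly for the empirical moments as well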
David Loeffler:
There is a quantity called the covariance, which is defined by Cov(X,Y) = E(XY) - E(X)E(Y). Note that Cov(X,X) = Var(X), and Cov(X,Y) = 0 if X, Y are independent (but the converse isn't true); so the covariance in some sense measures the strength of the relationship between X and Y. We can then define the correlation coefficient

ρ(X,Y) = Cov(X,Y) / √(Var(X) Var(Y)).

Exercise: show that -1 ≤ ρ ≤ 1.

David
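David's parenthetical remark, that the converse isn't true, has a standard concrete example: take X uniform on {-1, 0, 1} and Y = X², so Y is completely determined by X and yet Cov(X,Y) = 0. A small check (again with NumPy; the names are just for illustration):

    import numpy as np

    X = np.array([-1, 0, 1])
    p = np.full(3, 1/3)      # uniform distribution on {-1, 0, 1}
    Y = X**2                 # Y is a function of X, so X and Y are dependent

    EX = np.sum(p * X)       # E(X)  = 0
    EY = np.sum(p * Y)       # E(Y)  = 2/3
    EXY = np.sum(p * X * Y)  # E(XY) = E(X^3) = 0
    print(EXY - EX * EY)     # Cov(X, Y) = 0.0 despite total dependence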
Tristan Marshall:
To emphasise what David's just said: you've found that Var(A+B) = Var(A) + Var(B) + 2Cov(A,B), and this is as far as you can go unless you know more about A and B.
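To see how much room that covariance term leaves, compare the two extreme cases B = A and B = -A (a quick sketch, again with NumPy):

    import numpy as np

    A = np.array([1, 2, 3, 4, 5, 6])  # a fair die, say
    print(np.var(A + A))   # 4*Var(A): perfectly correlated, rho = +1
    print(np.var(A - A))   # 0:        perfectly anti-correlated, rho = -1
    print(2 * np.var(A))   # what independence (Cov = 0) would have given

Since |Cov(A,B)| ≤ √(Var(A) Var(B)) (the exercise above), Var(A+B) can land anywhere between Var(A) + Var(B) - 2√(Var(A)Var(B)) and Var(A) + Var(B) + 2√(Var(A)Var(B)), which is exactly why more information about the relationship between A and B is needed.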