| Marcos |
Can we express E(AB) in terms of E(A) and E(B) when A and B aren't independent? So far I've managed to show (although I don't know if they're correct):

E(A + B) = E(A) + E(B), for any A and B
E(AB) = E(A)E(B), for independent A and B

Basically I'm trying to express Var(A + B) in terms of Var(A), Var(B), E(A) and/or E(B) only. I've gotten as far as showing that:

Var(A + B) = Var(A) + Var(B) + 2[E(AB) - E(A)E(B)]

[This expression reduces to Var(A + B) = Var(A) + Var(B) for independent A and B.]

Thanks in advance,
Marcos

P.S. I'm talking about discrete random variables, although I suspect these formulas should be applicable to continuous ones as well. |
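(For anyone following along, here is the standard expansion behind that last identity, using only linearity of E and the definition Var(X) = E(X^2) - E(X)^2:

Var(A + B) = E[(A + B)^2] - [E(A + B)]^2
           = E(A^2) + 2E(AB) + E(B^2) - [E(A) + E(B)]^2
           = [E(A^2) - E(A)^2] + [E(B^2) - E(B)^2] + 2[E(AB) - E(A)E(B)]
           = Var(A) + Var(B) + 2[E(AB) - E(A)E(B)].)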
||
| David Loeffler |
There is a quantity called the covariance, which is defined by Cov(A,B) = E(AB) - E(A)E(B). Note that Cov(A,A) = Var(A), and Cov(A,B) = 0 if A, B are independent (but the converse isn't true); so the covariance in some sense measures the strength of the relationship between A and B. We can then define the correlation coefficient rho(A,B) = Cov(A,B) / sqrt(Var(A) Var(B)). Exercise: -1 <= rho(A,B) <= 1.

David |
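(Rearranging the definition above answers the original question directly: in general the best one can say is E(AB) = E(A)E(B) + Cov(A,B), which collapses to E(AB) = E(A)E(B) exactly when Cov(A,B) = 0, e.g. for independent A and B. As a quick illustration with the extreme dependent case B = A, this gives E(A^2) = E(A)^2 + Cov(A,A) = E(A)^2 + Var(A), which is just the definition of variance rearranged.)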
||
| Tristan Marshall |
To emphasise what David's just said: you've found that Var(A+B) = Var(A) + Var(B) + 2Cov(A,B), and this is as far as you can go unless you know more about A and B. |
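(A quick numerical sanity check of that formula, using a deliberately dependent pair chosen for illustration: let A be a fair coin taking values 0 and 1, and let B = A. Then E(A) = 1/2, Var(A) = Var(B) = 1/4, and E(AB) = E(A^2) = 1/2, so Cov(A,B) = 1/2 - (1/2)(1/2) = 1/4. The formula gives Var(A + B) = 1/4 + 1/4 + 2(1/4) = 1, which matches the direct computation: A + B = 2A takes values 0 and 2 with probability 1/2 each, so Var(A + B) = 4 Var(A) = 1. Note that the independent-case formula Var(A) + Var(B) = 1/2 would be wrong here.)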