X \ Y     0      1      Total
0         0      1/4    1/4
1         1/4    1/4    2/4
2         1/4    0      1/4
Total     2/4    2/4    1
The conditional distribution of X when Y = 0 is obtained by dividing each entry of the Y = 0 column by P(Y = 0) = 2/4, and is given by
X               0     1      2      Total
P(X | Y = 0)    0     1/2    1/2    1
Also, the marginal distribution of X is given by
X       0      1      2      Total
P_i     1/4    1/2    1/4    1
Since the conditional and the marginal distributions are different, X and Y are not independent
random variables.
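As a quick numerical check, here is a minimal Python sketch (not part of the text; the dictionary `p` and all variable names are illustrative) that hard-codes the joint probabilities from the table above, recomputes the marginal and conditional distributions, and tests the factorisation P(X = x, Y = y) = P(X = x) P(Y = y):

```python
from fractions import Fraction as F

# Joint distribution: p[x][y] = P(X = x, Y = y), hard-coded from the table above.
p = {
    0: {0: F(0), 1: F(1, 4)},
    1: {0: F(1, 4), 1: F(1, 4)},
    2: {0: F(1, 4), 1: F(0)},
}

# Marginals: P(X = x) sums over y; P(Y = y) sums over x.
px = {x: sum(row.values()) for x, row in p.items()}
py = {y: sum(p[x][y] for x in p) for y in (0, 1)}

# Conditional distribution of X given Y = 0: divide the Y = 0 column by P(Y = 0).
cond = {x: p[x][0] / py[0] for x in p}
print("P(X | Y = 0):", cond)   # values 0, 1/2, 1/2 -- matches the table
print("Marginal of X:", px)    # values 1/4, 1/2, 1/4 -- a different distribution

# Independence would require P(X = x, Y = y) = P(X = x) * P(Y = y) in every cell.
print("Independent?", all(p[x][y] == px[x] * py[y] for x in p for y in (0, 1)))  # False
```

The first cell already fails the test: P(X = 0, Y = 0) = 0, while P(X = 0) P(Y = 0) = 1/4 · 1/2 = 1/8.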
10.2.3 Expectation of the Sum or Product of Two Random Variables
Theorem 1.
If X and Y are two random variables, then E(X + Y) = E(X) + E(Y).
Proof.
Let the random variable X take values $X_1, X_2, \ldots, X_m$ and the random variable Y take values $Y_1, Y_2, \ldots, Y_n$, such that $P(X = X_i \text{ and } Y = Y_j) = p_{ij}$ (i = 1 to m, j = 1 to n).
By definition of expectation, we can write
$$
E(X + Y) = \sum_{i=1}^{m}\sum_{j=1}^{n} (X_i + Y_j)\, p_{ij}
         = \sum_{i=1}^{m}\sum_{j=1}^{n} X_i\, p_{ij} + \sum_{i=1}^{m}\sum_{j=1}^{n} Y_j\, p_{ij}
         = \sum_{i=1}^{m} X_i \sum_{j=1}^{n} p_{ij} + \sum_{j=1}^{n} Y_j \sum_{i=1}^{m} p_{ij}
$$

$$
= \sum_{i=1}^{m} X_i P_i + \sum_{j=1}^{n} Y_j P_j
\qquad \left(\text{Here } \sum_{j=1}^{n} p_{ij} = P_i \text{ and } \sum_{i=1}^{m} p_{ij} = P_j\right)
$$

$$
= E(X) + E(Y)
$$
The above result can be generalised: if there are k random variables $X_1, X_2, \ldots, X_k$, then

$$
E(X_1 + X_2 + \cdots + X_k) = E(X_1) + E(X_2) + \cdots + E(X_k).
$$

Remarks: The above result holds irrespective of whether $X_1, X_2, \ldots, X_k$ are independent or not.
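To illustrate the remark, the following sketch (again illustrative, reusing the same hard-coded joint table as the earlier snippet) verifies numerically that E(X + Y) = E(X) + E(Y) for the pair (X, Y) of the example above, even though that pair was shown not to be independent:

```python
from fractions import Fraction as F

# Same joint distribution as in the example: X and Y are dependent.
p = {
    0: {0: F(0), 1: F(1, 4)},
    1: {0: F(1, 4), 1: F(1, 4)},
    2: {0: F(1, 4), 1: F(0)},
}

# E(X + Y): sum of (x + y) * P(X = x, Y = y) over all cells of the table.
e_sum = sum((x + y) * p[x][y] for x in p for y in (0, 1))

# E(X) and E(Y) computed from the marginal distributions.
e_x = sum(x * sum(p[x].values()) for x in p)
e_y = sum(y * sum(p[x][y] for x in p) for y in (0, 1))

print(e_sum, "=", e_x, "+", e_y)   # prints: 3/2 = 1 + 1/2
assert e_sum == e_x + e_y          # additivity holds despite the dependence
```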