U = AY. Furthermore, by the definition of the variance-covariance matrix of the
random vector Y:

Var(Y) = E([Y - E(Y)][Y - E(Y)]')

The matrix product [Y - E(Y)][Y - E(Y)]' is of size n x n, with diagonal elements
(Yi - E(Yi))² and off-diagonal elements (Yi - E(Yi))(Yj - E(Yj)). The expected
values of these two groups of elements are the variances and the covariances,
respectively.
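As a numerical illustration (not part of the original text), the following NumPy sketch draws many realizations of a hypothetical 3-dimensional random vector Y with a chosen covariance matrix Sigma, estimates Var(Y) by averaging the outer products [Y - E(Y)][Y - E(Y)]', and confirms that the diagonal entries are variances and the off-diagonal entries covariances:

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical mean vector and covariance matrix (illustration only)
mu = np.array([1.0, 2.0, 3.0])
Sigma = np.array([[2.0, 0.5, 0.0],
                  [0.5, 1.0, 0.3],
                  [0.0, 0.3, 1.5]])

# Draw many realizations of Y (one per row) and center them
Y = rng.multivariate_normal(mu, Sigma, size=100_000)
Yc = Y - Y.mean(axis=0)

# Var(Y) = E([Y - E(Y)][Y - E(Y)]'), estimated by averaging the outer products
var_Y = (Yc.T @ Yc) / (Y.shape[0] - 1)
print(np.round(var_Y, 2))   # close to Sigma: diagonal = variances, off-diagonal = covariances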
If the definition of the variance-covariance matrix is applied to U = A Y,
then
Var(U) = E([U - E(U)][U - E(U)]')
       = E([AY - E(AY)][AY - E(AY)]')
       = E(A[Y - E(Y)][Y - E(Y)]'A')
       = A E([Y - E(Y)][Y - E(Y)]') A'
       = A [Var(Y)] A'                                          (G.8)
For Var(Y) = σ²I,

Var(U) = A[σ²I]A'
       = σ² AA'                                                 (G.9)
Description: the i-th diagonal element of AA' is the sum of squares of the
coefficients of the i-th linear function, so multiplying it by σ² gives the
variance of that linear function. The off-diagonal element (i, j) is the cross
product of the coefficients of the i-th and j-th linear functions, so multiplying
it by σ² gives the covariance between those two linear functions.
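A small simulation (my own sketch, with an arbitrarily chosen 2 x 3 coefficient matrix A and σ² = 4) can check (G.8) and (G.9): when Var(Y) = σ²I, the empirical covariance matrix of U = AY should be close to σ²AA':

import numpy as np

rng = np.random.default_rng(1)

sigma2 = 4.0                          # assumed common variance σ²
A = np.array([[1.0, 2.0, -1.0],       # coefficients of two linear functions of Y
              [0.5, 0.0,  3.0]])

# Y has Var(Y) = σ² I: independent components with equal variance
n_draws, n = 200_000, A.shape[1]
Y = rng.normal(loc=0.0, scale=np.sqrt(sigma2), size=(n_draws, n))

U = Y @ A.T                           # each row is one realization of U = A Y

print(np.round(np.cov(U, rowvar=False), 2))   # empirical Var(U)
print(np.round(sigma2 * A @ A.T, 2))          # theoretical σ² A A' from (G.9)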
Variance of each statistic β̂, Ŷ, and e:

β̂ = [(X'X)⁻¹X']Y, so that

Var(β̂) = [(X'X)⁻¹X'] [Var(Y)] [(X'X)⁻¹X']'
        = (X'X)⁻¹X' [σ²I] X(X'X)⁻¹
        = σ² (X'X)⁻¹(X'X)(X'X)⁻¹
        = σ² (X'X)⁻¹                                            (G.10)
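To illustrate (G.10) numerically, the sketch below (with an assumed design matrix, true coefficients, and σ², none of which come from the text) recomputes β̂ = (X'X)⁻¹X'Y over many simulated samples and compares the empirical covariance matrix of β̂ with σ²(X'X)⁻¹:

import numpy as np

rng = np.random.default_rng(2)

# Hypothetical fixed design: intercept column plus one regressor (illustration only)
n = 50
X = np.column_stack([np.ones(n), np.linspace(0.0, 10.0, n)])
beta_true = np.array([2.0, 0.7])
sigma2 = 1.5

XtX_inv = np.linalg.inv(X.T @ X)

# Repeatedly draw Y = X beta + e with Var(e) = σ² I and compute beta_hat
betas = []
for _ in range(20_000):
    Y = X @ beta_true + rng.normal(scale=np.sqrt(sigma2), size=n)
    betas.append(XtX_inv @ X.T @ Y)           # β̂ = (X'X)⁻¹ X' Y
betas = np.array(betas)

print(np.round(np.cov(betas, rowvar=False), 4))   # empirical Var(β̂)
print(np.round(sigma2 * XtX_inv, 4))              # σ² (X'X)⁻¹ from (G.10)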