SS(Mean) = β̂'(X*)'Y = (1/n)(1'Y)'(1'Y)
         = (1/n) Y'(1 1')Y                                   (G.17)
Since 1'Y = Σ Yᵢ = nȲ, it follows that SS(Mean) = (1/n)(nȲ)² = nȲ².
If we let 1 1' = J, where J is an n×n matrix with all elements equal to 1, then
SS(Reg) = SS(Model) - SS(Mean)
        = Y'HY - Y'(J/n)Y
        = Y'(H - J/n)Y                                       (G.18)
The degrees of freedom for SS(Mean) is 1 and for SS(Model) is p + 1, so the degrees of freedom for SS(Reg) is p.
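As a numerical check of (G.17) and (G.18), the sketch below computes SS(Mean) and SS(Reg) directly from the matrix forms. The simulated data, the choice of n and p, and the use of NumPy are assumptions made for illustration only, not part of the text.

    import numpy as np

    # Minimal check of (G.17)-(G.18) on an assumed simulated data set
    # with n = 20 observations and p = 2 predictors (illustrative only).
    rng = np.random.default_rng(0)
    n, p = 20, 2
    X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])   # design matrix with intercept
    Y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(size=n)

    ones = np.ones((n, 1))
    J = ones @ ones.T                         # n x n matrix of ones, J = 1 1'
    H = X @ np.linalg.inv(X.T @ X) @ X.T      # hat matrix H = X (X'X)^{-1} X'

    SS_mean = Y @ (J / n) @ Y                 # (1/n) Y'(1 1')Y, equation (G.17)
    SS_reg = Y @ (H - J / n) @ Y              # Y'(H - J/n)Y, equation (G.18)

    # SS(Mean) equals n*Ybar^2, and SS(Reg) = SS(Model) - SS(Mean)
    print(np.isclose(SS_mean, n * Y.mean() ** 2))        # True
    print(np.isclose(SS_reg, Y @ H @ Y - SS_mean))       # True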
The partition of the sum of squares in multiple linear regression is shown in the following table.
Table 5.8 Analysis of variance summary for regression analysis.

Source of            Degrees of     Sum of Squares        Computational
variation            Freedom        Formula               Formula
----------------------------------------------------------------------
Total (corrected)    n - 1          Y'(I - J/n)Y          Y'Y - nȲ²
Model                p + 1          Y'HY                  β̂'X'Y
Mean                 1              (1/n) Y'(1 1')Y       nȲ²
Regression           p              Y'[H - J/n]Y          β̂'X'Y - nȲ²
Residual             n - (p + 1)    Y'[I - H]Y            Y'Y - β̂'X'Y
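The entries of Table 5.8 can be computed directly from the computational formulas. The sketch below does so for an assumed simulated data set and checks that the partition is additive; the data and dimensions are illustrative assumptions.

    import numpy as np

    # Sketch of the ANOVA decomposition in Table 5.8 for an assumed
    # simulated data set (n = 30, p = 3); data are illustrative only.
    rng = np.random.default_rng(1)
    n, p = 30, 3
    X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])
    Y = X @ rng.normal(size=p + 1) + rng.normal(size=n)

    beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)     # least-squares estimate
    Ybar = Y.mean()

    SS_model = beta_hat @ X.T @ Y                    # Model:      p+1 df
    SS_mean = n * Ybar ** 2                          # Mean:       1 df
    SS_reg = SS_model - SS_mean                      # Regression: p df
    SS_res = Y @ Y - SS_model                        # Residual:   n-(p+1) df
    SS_total_corr = Y @ Y - SS_mean                  # Total (corrected): n-1 df

    # The partition is additive, as in Table 5.8.
    print(np.isclose(SS_total_corr, SS_reg + SS_res))    # True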
To test the significance of the regression model, that is, whether the group of independent variables provides information about the variation of Y around its mean value, we formulate the following hypothesis test:
H0 : β1 = β2 = … = βp = 0
H1 : βj ≠ 0 for at least one j, j = 1, 2, …, p               (G.19)
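Hypothesis (G.19) is commonly tested with the overall F statistic F = [SS(Reg)/p] / [SS(Res)/(n - p - 1)]. The sketch below computes this statistic from the quantities in Table 5.8; the simulated data and the use of scipy.stats are assumptions for illustration, not the text's own example.

    import numpy as np
    from scipy import stats

    # Sketch of the overall F test for hypothesis (G.19), using the standard
    # statistic F = [SS(Reg)/p] / [SS(Res)/(n-p-1)]; data are simulated.
    rng = np.random.default_rng(2)
    n, p = 40, 2
    X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])
    Y = X @ np.array([0.5, 1.5, -1.0]) + rng.normal(size=n)

    beta_hat = np.linalg.solve(X.T @ X, X.T @ Y)
    SS_model = beta_hat @ X.T @ Y
    SS_reg = SS_model - n * Y.mean() ** 2
    SS_res = Y @ Y - SS_model

    F = (SS_reg / p) / (SS_res / (n - p - 1))
    p_value = stats.f.sf(F, p, n - p - 1)        # upper-tail probability
    print(F, p_value)                            # reject H0 when p_value is small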