if it were adjusted for, say, only variable 1. In other words, the contribution of a
regressor variable often depends on which other regressor variables are in the model with it.
The partial sums of squares used in the partial tests just discussed do not
contribute additively to SS(Reg):
SS(1, 2, …, p | 0) ≠ SS(1 | 0, 2, …, p) + SS(2 | 0, 1, 3, …, p) + …
+ SS(j | 0, 1, …, j−1, j+1, …, p) + … + SS(p | 0, 1, 2, …, p−1).
The sequential sums of squares, by contrast, do form an additive partition of SS(Reg), namely:
SS(1, 2, …, p | 0) = SS(1 | 0) + SS(2 | 0, 1) + SS(3 | 0, 1, 2)
+ … + SS(p | 0, 1, 2, …, p−1). (5.24)
The notation SS(· | ·) reads "regression sum of squares explained by …", with the
vertical bar denoting "in the presence of …". For example, SS(2 | 0, 1) is the
increase in the regression sum of squares when the regressor X2 is added to a
model involving only X1 and the constant term.
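As a minimal sketch of these quantities (not from the text; the data and variable names are hypothetical), the following Python snippet computes the sequential sums of squares by fitting a growing sequence of models and verifies the additive partition of equation (5.24):

import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 3                                   # illustrative sizes, assumed
X = rng.normal(size=(n, p))
y = X @ np.array([1.5, -0.8, 0.3]) + rng.normal(size=n)

def reg_ss(y, X_sub):
    """Regression SS for a model with a constant term plus the columns of X_sub."""
    Z = np.column_stack([np.ones(len(y)), X_sub])
    fitted = Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return np.sum((fitted - y.mean()) ** 2)

# Sequential SS: the increase in regression SS as each regressor
# enters in the order X1, X2, ..., Xp, as in equation (5.24).
sequential = []
prev = 0.0
for j in range(1, p + 1):
    ss = reg_ss(y, X[:, :j])
    sequential.append(ss - prev)               # SS(j | 0, 1, ..., j-1)
    prev = ss

print("sequential SS:", np.round(sequential, 3))
print("their sum    :", round(sum(sequential), 3))
print("SS(Reg), full:", round(reg_ss(y, X), 3))  # equal: the partition is additive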
The sequential and partial sums of squares for Worked Example 5.4 are
shown in Table 5.13 below.
Table 5.13 Sequential and partial sums of squares

Sequential                         Partial
SS(1 | 0)             =  99.145    SS(1 | 0, 2, 3, 4, 5) = 0.299
SS(2 | 0, 1)          =   0.127    SS(2 | 0, 1, 3, 4, 5) = 0.869
SS(3 | 0, 1, 2)       =   4.120    SS(3 | 0, 1, 2, 4, 5) = 0.078
SS(4 | 0, 1, 2, 3)    =   0.263    SS(4 | 0, 1, 2, 3, 5) = 0.983
SS(5 | 0, 1, 2, 3, 4) =   4.352    SS(5 | 0, 1, 2, 3, 4) = 4.352
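Note that the sequential and partial values for X5 coincide: the last regressor in the sequential ordering is, by construction, already adjusted for all the others. Continuing the same hypothetical sketch, the partial sums of squares are obtained by dropping each regressor from the full model; their total will generally differ from SS(Reg), illustrating the non-additivity noted above:

# Partial SS for regressor j: the increase in regression SS when Xj
# is added to a model already containing all the other regressors,
# i.e. SS(j | 0, 1, ..., j-1, j+1, ..., p).
full = reg_ss(y, X)
partial = [full - reg_ss(y, np.delete(X, j, axis=1)) for j in range(p)]

print("partial SS:", np.round(partial, 3))
print("their sum :", round(sum(partial), 3))   # generally != SS(Reg)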
The sequential partitioning of the regression sum of squares is very useful when we need
information about the worth of a subset of regressors. For example, for a
regression model with p = 4 regressors, we can write