Page 88 - Basic Statistics
                       SS(Total)uncorr = SS(Model) + SS(Res)                                        ( R6 )

                             The total sum of squares is partitioned into the explained sum of squares
                     (SS(Model)) and the unexplained sum of squares (SS(Res)). The sums of squares on
                     both sides of this partition are uncorrected. The partition can be converted into a
                     corrected sum-of-squares partition by subtracting the correction factor nȲ² from
                     both sides of the equation.

                           ΣYj² − nȲ²  =  (ΣŶj² − nȲ²)  +  Σ(Yj − Ŷj)²

                           SS(Total)   =  SS(Reg) + SS(Res)                                        ( R7 )

                     If the estimated value β̂₁ is used, the regression sum of squares in this partition
                     can be written as follows.

                           SS(Total)   =  SS(Reg) + SS(Res)

                           ΣYj² − nȲ²  =  (ΣŶj² − nȲ²)  +  Σ(Yj − Ŷj)²

                           ΣYj² − nȲ²  =  β̂₁² Σ(Xj − X̄)²  +  Σ(Yj − Ŷj)²                         ( R8 )
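The partition above can be checked numerically. The sketch below uses a small illustrative data set (not taken from the text), fits a simple regression by ordinary least squares, and verifies that the corrected total sum of squares equals SS(Reg) written as β̂₁² Σ(Xj − X̄)² plus SS(Res).

```python
# Numerical check of the sum-of-squares partition in (R7)-(R8).
# The data are illustrative only, not from the text.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
n = len(x)

x_bar = sum(x) / n
y_bar = sum(y) / n

# OLS estimates for the simple regression Y = b0 + b1*X
sxx = sum((xi - x_bar) ** 2 for xi in x)
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / sxx
b0 = y_bar - b1 * x_bar
y_hat = [b0 + b1 * xi for xi in x]

# Corrected total sum of squares: sum(Yj^2) - n*Ybar^2
ss_total = sum(yi ** 2 for yi in y) - n * y_bar ** 2
# SS(Reg) in the form of (R8): b1^2 * sum((Xj - Xbar)^2)
ss_reg = b1 ** 2 * sxx
# SS(Res): sum((Yj - Yhat_j)^2)
ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))

# The partition SS(Total) = SS(Reg) + SS(Res) holds up to rounding error.
assert abs(ss_total - (ss_reg + ss_res)) < 1e-9
```

The identity holds exactly for least-squares estimates because the residuals are orthogonal to the fitted values; for any other choice of b0, b1 the cross term would not vanish.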

                             The degrees of freedom associated with each sum of squares are determined
                     by the sample size (n) and the number of parameters in the model (p). The degrees
                     of freedom of a corrected sum of squares are always reduced by 1 as a result of the
                     correction factor. The degrees of freedom associated with SS(Total) are n − 1. The
                     degrees of freedom associated with SS(Reg) are the degrees of freedom of SS(Model)
                     minus one. The degrees of freedom associated with SS(Model) equal the number of
                     parameters in the regression model, that is p = 2. Thus the degrees of freedom
                     associated with SS(Reg) are p − 1 = 2 − 1 = 1, and the degrees of freedom associated
                     with SS(Res) are n − p = n − 2.
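The degrees-of-freedom bookkeeping can be summarized in a few lines. The sample size n below is an arbitrary illustration; p = 2 follows the simple regression model in the text (intercept and slope).

```python
# Degrees of freedom for the corrected partition in simple regression.
n = 20          # illustrative sample size (any n > 2 works)
p = 2           # parameters in the model: intercept and slope

df_total = n - 1   # SS(Total), reduced by 1 for the correction factor
df_reg = p - 1     # SS(Reg) = df of SS(Model) minus one
df_res = n - p     # SS(Res)

# Degrees of freedom partition in parallel with the sums of squares.
assert df_total == df_reg + df_res
```

Note that the degrees of freedom add up exactly as the sums of squares do: (n − 1) = (p − 1) + (n − p).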









                                         ~~* CHAPTER 5   LINEAR REGRESSION MODEL *~~