Page 113 - Basic Statistics
                     5.2.5    PARTITIONING OF THE SUM OF SQUARES

To match the linear additive model, the vector of observations for the dependent variable Y is partitioned into the vector of fitted values Ŷ plus the residual vector e, namely

Y = Ŷ + e

A similar partition applies to the sum of squares of the dependent variable Y:

Y'Y = (Ŷ + e)'(Ŷ + e)

    = Ŷ'Ŷ + Ŷ'e + e'Ŷ + e'e

Substituting Ŷ = H Y and e = [I - H] Y gives

                     Y’ Y  = (H Y)’(H Y) + (H Y)’ ([I - H] Y) + ([I - H] Y)’(H Y) + ([I - H] Y)’( [I - H] Y)

                          = Y’ H’ H Y + Y’ (H’ [I - H] )Y + Y’ ( [I - H]’H ) Y +  Y’ ( [I - H]’ [I - H] ) Y


Both H and [I - H] are symmetric and idempotent, so that H'H = H and [I - H]'[I - H] = [I - H]. The two middle terms are zero because the two quadratic forms are orthogonal to each other, i.e., H'[I - H] = H - H = 0. Furthermore,
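These properties of the hat matrix can be checked numerically. The sketch below builds H = X(X'X)⁻¹X' for a small, made-up design matrix (the values of X are assumptions for illustration, not from the text) and verifies symmetry, idempotency, and the orthogonality H[I - H] = 0:

```python
import numpy as np

# Hypothetical design matrix: intercept column plus one regressor,
# n = 5 observations (values assumed purely for demonstration).
X = np.array([[1.0, 2.0],
              [1.0, 4.0],
              [1.0, 5.0],
              [1.0, 7.0],
              [1.0, 9.0]])

# Hat (projection) matrix H = X (X'X)^{-1} X'
H = X @ np.linalg.inv(X.T @ X) @ X.T
I = np.eye(len(X))

print(np.allclose(H, H.T))                    # H is symmetric      -> True
print(np.allclose(H @ H, H))                  # H is idempotent     -> True
print(np.allclose((I - H) @ (I - H), I - H))  # I - H is idempotent -> True
print(np.allclose(H @ (I - H), 0))            # H and I - H are orthogonal -> True
```

Because H is symmetric and idempotent, H'H reduces to H, which is exactly why the cross terms in the expansion vanish.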

Y'Y = Y'HY + Y'[I - H]Y = Ŷ'Ŷ + e'e                              (G.14)

SS(Total) = SS(Model) + SS(Res)                                   (G.15)
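The additive partition in (G.14)–(G.15) can be verified directly on data. The sketch below uses a hypothetical design matrix X and response vector Y (the numbers are assumptions for illustration) and confirms that the uncorrected total sum of squares Y'Y equals SS(Model) + SS(Res):

```python
import numpy as np

# Hypothetical data: n = 5 observations, intercept plus one regressor.
X = np.array([[1.0, 2.0],
              [1.0, 4.0],
              [1.0, 5.0],
              [1.0, 7.0],
              [1.0, 9.0]])
Y = np.array([3.1, 4.9, 6.2, 7.8, 10.1])

H = X @ np.linalg.inv(X.T @ X) @ X.T
Y_hat = H @ Y          # fitted values, Y^ = H Y
e = Y - Y_hat          # residuals,     e  = [I - H] Y

ss_total = Y @ Y             # Y'Y           = SS(Total)
ss_model = Y_hat @ Y_hat     # Y^'Y^ = Y'HY  = SS(Model)
ss_res = e @ e               # e'e = Y'[I-H]Y = SS(Res)

# Equations (G.14)/(G.15): SS(Total) = SS(Model) + SS(Res)
print(np.isclose(ss_total, ss_model + ss_res))  # True
```

Note that SS(Total) here is the uncorrected sum of squares Y'Y, as in the text; many software packages instead report the mean-corrected total sum of squares.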

The total sum of squares is thus partitioned into two sums of squares: the model sum of squares and the residual sum of squares. These represent, respectively, the explained and unexplained components of the model. SS(Model) = Ŷ'Ŷ = Y'HY has defining matrix H, and SS(Res) = e'e = Y'[I - H]Y has defining matrix [I - H].
If the two defining matrices are multiplied, H[I - H] = 0, so the two sums of squares are mutually orthogonal and form an additive partition. The degrees of freedom for each sum of squares are determined by the rank of









                                   ~~* CHAPTER 5   THE MULTIPLE LINEAR REGRESSION MODEL *~~