Page 148 - Data Science Algorithms in a Week

132                 Alfonso T. Sarmiento and Edgar Gutierrez

                       Step 7) Neighborhood best updating: Determine the neighborhood best position ŷ_i(k)
                               visited so far by the particles in the neighborhood B_i of particle i, using
                               the formula

                               J(ŷ_i(k)) = min{J(y_j(k))},  j ∈ B_i

                       Step 8) Global best updating: Determine the global best position g(k) visited so far
                               by the whole swarm by using the formula

                               J(g(k)) = min{J(y_i(k))},  i = 1,..,N.

                               If J(g(k)) < J(g(k − 1)) then set k' = k

                       Step 9) Stopping criteria: If the maximum number of iterations is achieved then stop;
                               g* = g(k) is the optimal solution. Otherwise, go to Step 2.
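                       Steps 7 and 8 can be sketched in a few lines of Python. The sphere objective, the array shapes, and the particular neighborhood lists below are illustrative assumptions, not taken from the text:

```python
import numpy as np

def sphere(x):
    # Illustrative objective J (an assumption, not from the text).
    return float(np.sum(np.asarray(x) ** 2))

def update_bests(J, personal_best, neighborhoods):
    """Steps 7-8: neighborhood-best and global-best selection.

    personal_best: (N, n) array of personal best positions y_i(k).
    neighborhoods: neighborhoods[i] lists the particle indices in B_i.
    Returns the neighborhood bests yhat_i(k) and the global best g(k).
    """
    costs = np.array([J(y) for y in personal_best])
    # Step 7: best position among the neighbors B_i of each particle i.
    nbr_best = np.array([personal_best[min(B, key=lambda j: costs[j])]
                         for B in neighborhoods])
    # Step 8: best position over the whole swarm.
    g = personal_best[int(np.argmin(costs))]
    return nbr_best, g

# Toy usage: three particles in 2-D with overlapping neighborhoods.
y = np.array([[2.0, 0.0], [0.5, 0.5], [1.0, -1.0]])
B = [[0, 1], [0, 1, 2], [1, 2]]
nbr_best, g = update_bests(sphere, y, B)
# Particle 1 has the lowest cost, so it is the global best.
```

Tracking the iteration k' at which the global best last improved (Step 8) then only requires comparing J(g(k)) against the previous value after each call.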


                       Local Search: Powell Hill-Climbing Algorithm

                          The PHC method uses one-dimensional minimization algorithms to solve multi-
                       dimensional optimization problems. The procedure searches a region by constructing
                       a set of linearly independent, mutually "non-interfering" or conjugate search
                       directions and applies a linear minimization along each direction (Press et al.,
                       1992). The number of conjugate directions coincides with the dimension of the search
                       space, and their linear independence guarantees that the whole search space can be
                       covered. Conjugate directions have the advantage that minimization along one
                       direction is not spoiled by subsequent minimization along another direction, which
                       avoids endless cycling through the set of directions.
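                       As a concrete illustration (a sketch under stated assumptions, not the book's implementation), the fragment below minimizes along one direction at a time and replaces the oldest direction with the normalized net displacement, which is the conjugate-direction update of Powell's method. The quadratic objective, the golden-section line search, and the search bounds are all assumptions:

```python
import numpy as np

def line_min(J, s, u, lo=-5.0, hi=5.0, tol=1e-6):
    # Golden-section search for min over alpha of J(s + alpha * u); a crude
    # stand-in for the one-dimensional minimizers the text refers to.
    phi = (np.sqrt(5.0) - 1.0) / 2.0
    a, b = lo, hi
    c = b - phi * (b - a)
    d = a + phi * (b - a)
    while b - a > tol:
        if J(s + c * u) < J(s + d * u):
            b, d = d, c
            c = b - phi * (b - a)
        else:
            a, c = c, d
            d = a + phi * (b - a)
    return s + 0.5 * (a + b) * u

def powell_sketch(J, z0, iters=20):
    # Direction-set loop: minimize along each direction in turn, then replace
    # the oldest direction with the normalized net displacement.
    z = np.asarray(z0, dtype=float)
    U = np.eye(len(z))                     # u_d = e_d, d = 1,..,np
    for _ in range(iters):
        z_start = z.copy()
        for d in range(len(z)):
            z = line_min(J, z, U[d])
        disp = z - z_start
        norm = np.linalg.norm(disp)
        if norm > 1e-12:
            U = np.vstack([U[1:], disp / norm])
            z = line_min(J, z, U[-1])      # extra search along the new direction
    return z

def quad(z):
    # Assumed test objective: a positive-definite quadratic whose
    # minimizer is (10/7, -6/7).
    return (z[0] - 1.0) ** 2 + 2.0 * (z[1] + 0.5) ** 2 + z[0] * z[1]

z = powell_sketch(quad, [3.0, 3.0])
```

In practice, `scipy.optimize.minimize(fun, x0, method="Powell")` provides a robust, production-grade implementation of the same direction-set idea.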
                          The steps of the algorithm are as follows:

                       Step 1) Initialization:
                               Set iteration k = 0
                               Set the initial search point Z_0 = [z_1, z_2, .., z_np] as the optimal
                               solution of the PSO algorithm, i.e., Z_0 = g*
                               Initialize the directions u_d to the basis vectors, i.e., u_d = e_d,
                               d = 1,..,np, where
                               e_1 = [1, 0, .., 0], e_2 = [0, 1, .., 0], ..., e_np = [0, 0, .., 1]


                       Step 2) Define the iteration start point: Set S_0 = Z_k