Page 172 - Data Science Algorithms in a Week

156                     Djordje Cica and Davorin Kramar


                                                1   N   exp  1  N out  O exp   O out 
                           E  w (1) ,w (2) ,b (1) ,b (2)    exp     out   i  i              (5)
                                                      
                                               N    m  1 N  i  1  O i exp    m 
                                                      

                                                                                  exp
                               out
                       where  N   is  the  number  of  neurons  of  the  output  layer,  N   is  the  number  of
                       experimental  patterns  and  O out    and  O exp  are  the  normalized  predicted  and  measured
                                                 i         i
                       values, respectively.
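As a quick check of the notation, the error of Eq. (5) can be computed directly with NumPy. This is a minimal sketch; the function name and array shapes are illustrative assumptions, not part of the original text:

```python
import numpy as np

def ann_error(o_exp, o_out):
    """Error measure of Eq. (5).

    o_exp, o_out : arrays of shape (N_exp, N_out) holding the
    normalized measured and predicted output values.
    """
    n_exp, n_out = o_exp.shape
    # Inner sum: mean relative deviation over the output neurons of one pattern.
    per_pattern = np.sum((o_exp - o_out) / o_exp, axis=1) / n_out
    # Outer sum: average over all experimental patterns.
    return np.sum(per_pattern) / n_exp
```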
   The error obtained from Eq. (5) is back-propagated through the ANN: moving from the output layer toward the input layer, the synaptic weights and biases are modified so as to minimize the error. Several network configurations with different numbers of hidden layers and different numbers of neurons per hidden layer were tested using a trial-and-error procedure. The best architecture was a typical two-layer feed-forward network with a single hidden layer of 10 neurons, trained with the Levenberg-Marquardt back-propagation algorithm. This ANN architecture is used in the presentation and discussion that follow.
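The selected architecture (one hidden layer of 10 neurons, Levenberg-Marquardt training) can be sketched in Python. This is an illustration under stated assumptions, not the authors' implementation: SciPy's MINPACK-based `least_squares(method="lm")` stands in for the Levenberg-Marquardt back-propagation routine, and the synthetic data, tanh hidden activation, and linear output are assumptions:

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (80, 3))      # 80 illustrative patterns, 3 inputs
y = np.sin(X.sum(axis=1))            # synthetic target values

n_in, n_hid = X.shape[1], 10         # one hidden layer with 10 neurons

def unpack(p):
    """Split the flat parameter vector into weights and biases."""
    i = 0
    w1 = p[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = p[i:i + n_hid]; i += n_hid
    w2 = p[i:i + n_hid]; i += n_hid
    b2 = p[i]
    return w1, b1, w2, b2

def forward(p, X):
    w1, b1, w2, b2 = unpack(p)
    h = np.tanh(X @ w1 + b1)         # hidden layer (tanh assumed)
    return h @ w2 + b2               # single linear output neuron

def residuals(p):
    # Levenberg-Marquardt minimizes the sum of squared residuals.
    return forward(p, X) - y

p0 = rng.normal(scale=0.5, size=n_in * n_hid + n_hid + n_hid + 1)
fit = least_squares(residuals, p0, method="lm")
```

After fitting, `fit.x` holds the trained weights and biases, and the sum of squared residuals at `fit.x` is far below its value at the random initialization `p0`.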


                       Bio-Inspired Artificial Neural Networks

   For feedforward ANN training, the most widely used algorithm is the standard BP algorithm or one of its improved variants. The BP algorithm is fundamentally a gradient-based method, so some inherent problems are frequently encountered in its use, e.g., the risk of becoming trapped in local minima and a very slow convergence rate during training. In addition, many elements affect the convergence of BP learning and must be considered, such as the number of hidden nodes, the learning rate, the momentum rate, the biases, the minimum error, and the activation/transfer function. Therefore, recent research has emphasized the optimal improvement of ANNs trained with the BP method.
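The local-minimum risk of plain gradient descent, the core of BP, can be seen on a toy one-dimensional loss with two minima. The loss function, starting point, and learning rate below are illustrative assumptions:

```python
def f(w):
    # Toy loss with a local minimum near w ≈ 1.13 and a deeper
    # global minimum near w ≈ -1.30.
    return w**4 - 3 * w**2 + w

def grad(w):
    return 4 * w**3 - 6 * w + 1

w = 1.0                      # start inside the basin of the shallower minimum
for _ in range(1000):
    w -= 0.01 * grad(w)      # plain gradient step, as in standard BP
# Gradient descent settles in the nearby local minimum (w ≈ 1.13)
# and never reaches the global one, illustrating the trapping problem.
```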
   The training of ANNs using bio-inspired algorithms has been a topic of much attention in recent years. These algorithms provide universal optimization techniques that require no particular knowledge of the problem structure other than the objective function itself. They are robust and efficient at exploring the entire complex and poorly understood solution space of an optimization problem. Thus, bio-inspired algorithms are able to escape local optima and reach a globally optimal solution. They have been successfully used to perform various tasks, such as architecture design, connection weight training, connection weight initialization, learning rule adaptation, and rule extraction from ANNs. One way to overcome the shortcomings of the BP training algorithm is to formulate an adaptive and global approach to the learning process as the evolution of connection weights in the environment determined by the architecture and the learning task of the ANN. Bio-inspired algorithms can then be used very