
iv.    Random Forest (RF)


According to Islam et al. (2021), Random Forest (RF) is a supervised learning algorithm. It builds a "forest" from a collection of decision trees that are typically trained through a "bagging" procedure. The decision tree (DT) itself is one of the earliest and most well-known machine learning techniques: decision logics, i.e. tests and their corresponding outcomes, are modelled as a tree-like structure that classifies data objects (Uddin et al., 2019). RF is an ensemble learning technique that combines a number of decision trees to produce an effective predictive model. Through bagging, each tree in the forest is trained on a random subset of the training data drawn with replacement, and each node additionally considers only a random sample of the features. This lowers the correlation between trees and increases ensemble diversity, which enhances the model's generalisation ability and lessens overfitting. When making a prediction, RF combines the forecasts of all individual trees into a final prediction, typically by majority vote for classification. Figure 2.12 shows an example of a random forest made up of three distinct decision trees, each trained on a random subset of the training data.


















                                                             Figure 2.12 Random Forest
                                                             (Source: Uddin et al., 2019)
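The bagging and random-feature-selection steps described above can be sketched in a few lines of plain Python. This is a minimal illustration, not a production implementation: the "trees" are depth-1 stumps, and all names (`fit_forest`, `predict_forest`, `Stump`) and the toy dataset are hypothetical, chosen only to show bootstrap sampling, per-split feature subsampling, and majority voting.

```python
import random
from collections import Counter

def majority(labels):
    # Most frequent label in a list
    return Counter(labels).most_common(1)[0][0]

class Stump:
    """A depth-1 decision tree: one feature, one threshold."""
    def fit(self, data, n_feats, rng):
        n_dim = len(data[0][0])
        best = -1
        # Random feature subset considered at this split (RF ingredient #2)
        for f in rng.sample(range(n_dim), n_feats):
            for x, _ in data:
                t = x[f]
                left  = [y for xx, y in data if xx[f] <= t]
                right = [y for xx, y in data if xx[f] >  t]
                # Score = number of points classified correctly by majority label
                score = sum(Counter(s).most_common(1)[0][1]
                            for s in (left, right) if s)
                if score > best:
                    best = score
                    self.f, self.t = f, t
                    self.left_label  = majority(left)  if left  else majority(right)
                    self.right_label = majority(right) if right else majority(left)
        return self

    def predict(self, x):
        return self.left_label if x[self.f] <= self.t else self.right_label

def fit_forest(data, n_trees=25, n_feats=1, seed=0):
    rng = random.Random(seed)
    forest = []
    for _ in range(n_trees):
        # Bagging: bootstrap sample of the training set, drawn with replacement
        sample = [rng.choice(data) for _ in data]
        forest.append(Stump().fit(sample, n_feats, rng))
    return forest

def predict_forest(forest, x):
    # Final prediction = majority vote over all individual trees
    return majority([tree.predict(x) for tree in forest])

# Toy 2-D dataset: class 0 near the origin, class 1 far from it
data = [((0.0, 0.1), 0), ((0.2, 0.0), 0), ((0.1, 0.2), 0),
        ((2.0, 2.1), 1), ((2.2, 2.0), 1), ((2.1, 2.2), 1)]
forest = fit_forest(data)
print(predict_forest(forest, (0.1, 0.1)))
print(predict_forest(forest, (2.1, 2.1)))
```

Because each stump sees a different bootstrap sample and a different feature subset, individual trees disagree on hard points, but the vote across the ensemble remains stable, which is the effect the paragraph above attributes to bagging.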









