Page 48 - FULL REPORT 30012024

iii.   Decision Tree (DT)


A common machine learning approach used for both classification and regression tasks is the decision tree. According to Uddin et al. (2019), it is a supervised learning technique that builds a tree-like model of decisions and their possible outcomes. The tree consists of nodes and branches: each interior node represents a feature or attribute, each branch a decision rule, and each leaf node an outcome or prediction. Starting with the complete dataset at the root node, the decision tree algorithm recursively partitions the data according to the values of selected features. The splits are chosen to maximise information gain or minimise impurity at each node. Impurity measures how mixed the target variable is within a given subset of the data, whereas information gain assesses how much a feature reduces uncertainty about the outcome. Figure 2.11 illustrates a decision tree in which the class outcomes (Class A and Class B) are depicted as rectangles and each variable (C1, C2, and C3) is represented by a circle. Each branch is labelled "True" or "False" according to the result of the test at its predecessor node, so that a sample can be traced down the tree and assigned to the correct class.





















                                                          Figure 2.11 Decision Tree
                                                          (Source: Uddin et al., 2019)








