Page 99 - Data Science Algorithms in a Week

Random Forest


            Playing chess example

            We will use the example from Chapter 2, Naive Bayes, and Chapter 3, Decision Tree,
            again.

             Temperature  Wind    Sunshine  Play
             Cold         Strong  Cloudy    No
             Warm         Strong  Cloudy    No
             Warm         None    Sunny     Yes
             Hot          None    Sunny     No
             Hot          Breeze  Cloudy    Yes
             Warm         Breeze  Sunny     Yes
             Cold         Breeze  Cloudy    No
             Cold         None    Sunny     Yes
             Hot          Strong  Cloudy    Yes
             Warm         None    Cloudy    Yes
             Warm         Strong  Sunny     ?

            This time, however, we would like to use a random forest consisting of four random
            decision trees to find the result of the classification.

            Analysis:
            We are given M=4 variables by which a feature can be classified. Thus, we choose the
            maximum number of variables considered at a node to be
            m=min(M,math.ceil(2*math.sqrt(M)))=min(4,math.ceil(2*math.sqrt(4)))=4.
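The calculation of m above can be verified with a few lines of Python (the variable names M and m follow the text; the formula is the one stated above):

```python
import math

M = 4  # number of variables by which a feature can be classified

# Maximum number of variables considered at a node:
m = min(M, math.ceil(2 * math.sqrt(M)))
print(m)  # 4, since math.ceil(2 * math.sqrt(4)) = 4 and min(4, 4) = 4
```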
            We are given the following features:

                [['Cold', 'Strong', 'Cloudy', 'No'], ['Warm', 'Strong', 'Cloudy', 'No'],
                 ['Warm', 'None', 'Sunny', 'Yes'], ['Hot', 'None', 'Sunny', 'No'],
                 ['Hot', 'Breeze', 'Cloudy', 'Yes'], ['Warm', 'Breeze', 'Sunny', 'Yes'],
                 ['Cold', 'Breeze', 'Cloudy', 'No'], ['Cold', 'None', 'Sunny', 'Yes'],
                 ['Hot', 'Strong', 'Cloudy', 'Yes'], ['Warm', 'None', 'Cloudy', 'Yes']]
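To construct the four random decision trees of the forest, each tree is grown from its own random sample of the data, drawn with replacement (a bootstrap sample). The following sketch shows only this sampling step under that assumption; the `bootstrap_sample` helper is a hypothetical name, not the book's implementation, and the fixed seed is used purely so the sketch is reproducible:

```python
import random

# The ten complete features from the text:
data = [['Cold', 'Strong', 'Cloudy', 'No'], ['Warm', 'Strong', 'Cloudy', 'No'],
        ['Warm', 'None', 'Sunny', 'Yes'], ['Hot', 'None', 'Sunny', 'No'],
        ['Hot', 'Breeze', 'Cloudy', 'Yes'], ['Warm', 'Breeze', 'Sunny', 'Yes'],
        ['Cold', 'Breeze', 'Cloudy', 'No'], ['Cold', 'None', 'Sunny', 'Yes'],
        ['Hot', 'Strong', 'Cloudy', 'Yes'], ['Warm', 'None', 'Cloudy', 'Yes']]

random.seed(0)  # fixed seed only so this sketch is reproducible

def bootstrap_sample(rows):
    """Draw len(rows) features with replacement (a bootstrap sample)."""
    return [random.choice(rows) for _ in range(len(rows))]

# One bootstrap sample per tree in a forest of four random decision trees:
samples = [bootstrap_sample(data) for _ in range(4)]
for i, sample in enumerate(samples):
    print('Tree', i, 'is grown from', len(sample), 'sampled features')
```

Because the samples are drawn with replacement, each tree typically sees some features several times and misses others, which is what makes the four trees differ from one another.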




