Decision Trees
The decision tree algorithm can produce results that differ from those of other algorithms, such as the Naive Bayes algorithm. In the next chapter, we will learn how to combine various algorithms or classifiers into a forest of decision trees (called a random forest) in order to achieve more accurate results.
Problems
1. What is the information entropy of the following multisets? (A short Python sketch for checking entropy calculations follows the table in Problem 3.)
a) {1,2}, b) {1,2,3}, c) {1,2,3,4}, d) {1,1,2,2}, e) {1,1,2,3}
2. What is the information entropy of the probability space induced by a biased coin that shows heads with probability 10% and tails with probability 90%?
3. Let us take another example of playing chess from Chapter 2, Naive Bayes:
a) What is the information gain for each of the non-classifying attributes in the table? (A sketch computing these gains follows the table.)
b) What is the decision tree constructed from the given table?
c) How would you classify the data sample (warm, strong, spring, ?) according to the constructed decision tree?
Temperature  Wind    Season  Play
Cold         Strong  Winter  No
Warm         Strong  Autumn  No
Warm         None    Summer  Yes
Hot          None    Spring  No
Hot          Breeze  Autumn  Yes
Warm         Breeze  Spring  Yes
Cold         Breeze  Winter  No
Cold         None    Spring  Yes
Hot          Strong  Summer  Yes
Warm         None    Autumn  Yes
Warm         Strong  Spring  ?
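
For Problems 1 and 2, recall that the information entropy of a distribution with probabilities p1, ..., pn is -(p1*log2(p1) + ... + pn*log2(pn)), and that a multiset induces such a distribution through the relative frequencies of its elements. The following minimal Python sketch can be used to check your calculations; the function names are illustrative and not taken from the book's code:

import math
from collections import Counter

def entropy(probabilities):
    # -sum of p * log2(p) over the non-zero probabilities
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

def multiset_entropy(multiset):
    # Turn element counts into relative frequencies, then apply entropy.
    counts = Counter(multiset)
    total = len(multiset)
    return entropy(count / total for count in counts.values())

print(multiset_entropy([1, 2]))        # Problem 1a
print(multiset_entropy([1, 1, 2, 3]))  # Problem 1e
print(entropy([0.1, 0.9]))             # Problem 2: the biased coin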
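
For Problem 3a, the information gain of a non-classifying attribute is the entropy of the classifying attribute Play, minus the weighted average of the entropies of Play within the groups of rows that share the same value of that attribute. A sketch under the same assumptions, with the table transcribed as tuples:

import math
from collections import Counter, defaultdict

data = [
    ('Cold', 'Strong', 'Winter', 'No'),
    ('Warm', 'Strong', 'Autumn', 'No'),
    ('Warm', 'None',   'Summer', 'Yes'),
    ('Hot',  'None',   'Spring', 'No'),
    ('Hot',  'Breeze', 'Autumn', 'Yes'),
    ('Warm', 'Breeze', 'Spring', 'Yes'),
    ('Cold', 'Breeze', 'Winter', 'No'),
    ('Cold', 'None',   'Spring', 'Yes'),
    ('Hot',  'Strong', 'Summer', 'Yes'),
    ('Warm', 'None',   'Autumn', 'Yes'),
]
attributes = ['Temperature', 'Wind', 'Season']

def entropy(labels):
    # Entropy of the relative frequencies of the class labels.
    counts = Counter(labels)
    total = len(labels)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

def information_gain(rows, attribute_index):
    # Entropy of Play before the split.
    base = entropy([row[-1] for row in rows])
    # Group the Play labels by the value of the chosen attribute.
    groups = defaultdict(list)
    for row in rows:
        groups[row[attribute_index]].append(row[-1])
    # Weighted average entropy of Play after the split.
    remainder = sum(len(g) / len(rows) * entropy(g) for g in groups.values())
    return base - remainder

for index, name in enumerate(attributes):
    print(name, information_gain(data, index))

The attribute with the highest information gain is the natural choice for the root node when constructing the decision tree in Problem 3b.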