Wednesday, February 29, 2012

Classification with Decision Trees

Feb. 27 was Tree Day in our class. Decision trees are one of the more exciting DM algorithms.



I started by describing splitting rules and went over the Entropy and Gini examples. Other details, such as leaf purity, English rules, and pruning, were discussed during the activity time.
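For anyone who wants to check the split calculations outside of Excel, here is a minimal Python sketch of the two impurity measures. The class counts in it are made up purely for illustration and are not the book's example.

```python
import math

def entropy(counts):
    """Entropy of a node, given a list of class counts, e.g. [6, 4]."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

def gini(counts):
    """Gini impurity of a node, given a list of class counts."""
    total = sum(counts)
    return 1 - sum((c / total) ** 2 for c in counts)

def split_impurity(children, measure):
    """Impurity of a split: average of the child impurities, weighted by size."""
    total = sum(sum(child) for child in children)
    return sum(sum(child) / total * measure(child) for child in children)

# Hypothetical parent node of 10 observations split into two children.
parent = [6, 4]
children = [[5, 1], [1, 3]]

print("Parent entropy:   ", entropy(parent))
print("Split entropy:    ", split_impurity(children, entropy))
print("Information gain: ", entropy(parent) - split_impurity(children, entropy))
print("Parent Gini:      ", gini(parent))
print("Split Gini:       ", split_impurity(children, gini))
```

The same functions can be pointed at the counts from the book example to reproduce the Excel sheet's numbers.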


Here are the slides and here is the class activity. To help students understand the calculations for Entropy and Gini trees, I recreated the book example in Excel. You may find it useful for your students too.

It is very important to remind students to pay attention to the number of observations in the leaves. The number of observations in each leaf indicates the extent to which the corresponding English rule can be trusted to generalize. Generalizability and pruning should be adequately discussed in the decision tree lecture; a small sketch of this point follows below.
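As a sketch of that point, here is one way to enforce a minimum leaf size and inspect the observation counts behind each rule using scikit-learn. This was not part of the original class materials (which used slides and Excel); the dataset and parameter values are assumptions for illustration only.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)

# Requiring a minimum number of observations per leaf is one simple guard
# against English rules that are backed by only a handful of cases.
clf = DecisionTreeClassifier(criterion="entropy", min_samples_leaf=10, random_state=0)
clf.fit(X, y)

# Print the tree as "English rules"; each line corresponds to a split or leaf.
print(export_text(clf, feature_names=load_iris().feature_names))

# Count how many training observations fall into each leaf.
leaf_ids = clf.apply(X)
for leaf in np.unique(leaf_ids):
    print(f"leaf {leaf}: {np.sum(leaf_ids == leaf)} training observations")
```

Leaves with very few observations are the ones whose rules deserve skepticism, and they are the natural candidates for pruning.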








Note: I have changed the location of my files. If you need a file and couldn't find it at the link below, write to me: elahe dot j dot marmarchi at gmail dot com
