Reading Questions for Lecture 2
Decision Trees (Ch. 9.1-9.4)
- Why do we say that decision tree learning is greedy?
- Why is entropy useful for calculating node purity? (A small worked example follows this list.)
- How do you calculate the "rule support" for an IF-THEN rule obtained from a decision tree?
- (Lewis) How is pre-pruning done?
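To make the entropy and greediness questions concrete, here is a minimal sketch. The class counts are made up for illustration, and it assumes a two-class problem with entropy measured in bits; the textbook's exact notation may differ.

```python
import math

def entropy(counts):
    """Entropy (in bits) of a node, given the class counts at that node."""
    total = sum(counts)
    ent = 0.0
    for c in counts:
        if c > 0:
            p = c / total
            ent -= p * math.log2(p)
    return ent

def information_gain(parent_counts, child_counts_list):
    """Gain = parent entropy minus the size-weighted entropy of the children."""
    n = sum(parent_counts)
    weighted = sum(sum(c) / n * entropy(c) for c in child_counts_list)
    return entropy(parent_counts) - weighted

# A pure node has entropy 0; a maximally mixed two-class node has entropy 1 bit.
print(entropy([10, 0]))  # 0.0   -> pure
print(entropy([5, 5]))   # 1.0   -> maximally impure
print(entropy([8, 2]))   # ~0.72 -> in between

# Greedy tree growing picks, at each node, the single split with the highest
# gain right now, without looking ahead to later splits.
print(information_gain([8, 2], [[8, 0], [0, 2]]))  # perfect split: gain equals the parent entropy
```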
Evaluation (Ch. 14.1-14.3)
- In K-fold cross-validation, how many folds will any two training sets share? (See the cross-validation sketch after this list.)
- What are the advantages and drawbacks of using a large value of K for K-fold cross-validation?
- (TK) What is stratification?
- (TK) What is leave-one-out cross-validation (LOOCV)?
- (TK) Does LOOCV permit you to do stratification?
- (Natalia) How can you get different classifiers from the same training data, and why would you want to do that?
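The cross-validation questions can also be explored with a short sketch using scikit-learn's splitters. The dataset size and class proportions below are arbitrary assumptions for illustration.

```python
import numpy as np
from sklearn.model_selection import KFold, StratifiedKFold, LeaveOneOut

# Toy data: 12 examples, 8 of class 0 and 4 of class 1 (arbitrary numbers).
X = np.arange(12).reshape(-1, 1)
y = np.array([0] * 8 + [1] * 4)

# K-fold: each fold is held out exactly once, so any two training sets
# share K - 2 of the K folds.
kf = KFold(n_splits=4, shuffle=True, random_state=0)
for train_idx, test_idx in kf.split(X):
    print("train:", train_idx, "test:", test_idx)

# Stratified K-fold keeps (roughly) the 2:1 class ratio inside every fold.
skf = StratifiedKFold(n_splits=4, shuffle=True, random_state=0)
for train_idx, test_idx in skf.split(X, y):
    print("test fold class counts:", np.bincount(y[test_idx]))

# LOOCV is K-fold with K = N; each test set is a single example, so there is
# no way to preserve class proportions inside a test "fold".
loo = LeaveOneOut()
print("number of LOOCV splits:", loo.get_n_splits(X))
```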