1 Briefly outline the major steps of decision tree classification.
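As one possible illustration of those major steps (attribute selection, partitioning the tuples, recursing on each partition, and stopping at pure or exhausted nodes), here is a minimal sketch of greedy tree growing. The toy data, the dict-based tree representation, and the use of information gain as the selection measure are assumptions made for the example, not part of the question.

# Minimal sketch of greedy decision tree induction (illustrative only;
# attribute selection here uses information gain, one common choice).
import math
from collections import Counter

def entropy(labels):
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def best_attribute(rows, attrs):
    # Pick the attribute whose split yields the largest information gain.
    base = entropy([label for _, label in rows])
    def gain(a):
        groups = {}
        for features, label in rows:
            groups.setdefault(features[a], []).append(label)
        remainder = sum(len(g) / len(rows) * entropy(g) for g in groups.values())
        return base - remainder
    return max(attrs, key=gain)

def grow_tree(rows, attrs):
    labels = [label for _, label in rows]
    # Stopping criteria: pure node or no attributes left -> majority-class leaf.
    if len(set(labels)) == 1 or not attrs:
        return Counter(labels).most_common(1)[0][0]
    a = best_attribute(rows, attrs)
    node = {"attribute": a, "branches": {}}
    # Partition the tuples on the chosen attribute and recurse on each branch.
    partitions = {}
    for features, label in rows:
        partitions.setdefault(features[a], []).append((features, label))
    for value, subset in partitions.items():
        node["branches"][value] = grow_tree(subset, [x for x in attrs if x != a])
    return node

# Toy example: each tuple is (attribute dict, class label).
data = [({"outlook": "sunny", "windy": "no"}, "play"),
        ({"outlook": "rainy", "windy": "yes"}, "stay"),
        ({"outlook": "sunny", "windy": "yes"}, "play"),
        ({"outlook": "rainy", "windy": "no"}, "stay")]
print(grow_tree(data, ["outlook", "windy"]))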
2 Why is tree pruning useful in decision tree induction? What is a drawback of using a separate set of tuples to evaluate pruning?
3 Given a decision tree, you have the option of (a) converting the decision tree to rules and then pruning the resulting rules, or (b) pruning the decision tree and then converting the pruned tree to rules. What advantage does (a) have over (b)?
4 It is important to calculate the worst-case computational complexity of the decision tree algorithm. Given a data set D, the number of attributes n, and the number of training tuples |D|, show that the computational cost of growing a tree is at most n × |D| × log(|D|).
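A sketch of the standard counting argument, assuming the splits roughly halve the tuples so the tree depth is bounded by log|D|:

% Counting argument sketch (assumes each split roughly halves the tuples,
% so the tree depth is at most log|D|).
\begin{align*}
\text{work per level} &\le n \cdot |D| && \text{(each tuple sits at one node per level; up to $n$ attributes evaluated per tuple)}\\
\text{tree depth}     &\le \log|D|     && \text{(tuples per node shrink geometrically)}\\
\text{total cost}     &\le n \cdot |D| \cdot \log|D|.
\end{align*}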
5 Why is naïve Bayesian classification called “naïve”? Briefly outline the major ideas of naïve Bayesian classification.
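A minimal sketch of the core computation: the class prior times a product of per-attribute conditional probabilities, which is exactly where the “naïve” conditional-independence assumption enters. The toy data, function names, and the Laplace-style smoothing are illustrative choices, not prescribed by the question.

# Naive Bayes sketch: score(C|X) is proportional to P(C) * prod_i P(x_i|C),
# treating the attributes as conditionally independent given the class.
from collections import Counter, defaultdict

def train(rows):
    class_counts = Counter(label for _, label in rows)
    cond = defaultdict(Counter)  # (class, attribute) -> counts of attribute values
    for features, label in rows:
        for attr, value in features.items():
            cond[(label, attr)][value] += 1
    return class_counts, cond

def predict(features, class_counts, cond):
    total = sum(class_counts.values())
    scores = {}
    for label, count in class_counts.items():
        score = count / total  # prior P(C)
        for attr, value in features.items():
            # Rough Laplace-style smoothing avoids zero probabilities for unseen values.
            seen = cond[(label, attr)]
            score *= (seen[value] + 1) / (count + len(seen) + 1)
        scores[label] = score
    return max(scores, key=scores.get)

data = [({"outlook": "sunny", "windy": "no"}, "play"),
        ({"outlook": "rainy", "windy": "yes"}, "stay"),
        ({"outlook": "sunny", "windy": "yes"}, "play")]
counts, cond = train(data)
print(predict({"outlook": "rainy", "windy": "no"}, counts, cond))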
6 Show that accuracy is a function of sensitivity and specificity.
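One way to make the relationship explicit, where P and N denote the numbers of positive and negative tuples and TP and TN the correctly classified ones:

% Accuracy as a weighted combination of sensitivity and specificity.
\mathit{sensitivity} = \frac{TP}{P}, \qquad
\mathit{specificity} = \frac{TN}{N}, \qquad
\mathit{accuracy} = \frac{TP + TN}{P + N}
  = \mathit{sensitivity}\cdot\frac{P}{P+N} + \mathit{specificity}\cdot\frac{N}{P+N}.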