
Decision Tree (4)
[R] Tree-Based Methods : Boosting 1. What is Boosting? Boosting is a general approach that can be applied to many statistical learning methods for regression or classification. Boosting grows trees sequentially, and the approach learns slowly: given the current model, we fit a decision tree to the residuals from that model, then add this new tree into the fitted function in order to update the residuals. Each of t..
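The post itself uses R (typically the gbm package), but the residual-fitting loop it describes can be sketched in plain Python. The depth-1 "stump" learner, the toy data, and the shrinkage value below are illustrative assumptions, not the post's code.

```python
# Sketch of boosting for regression: repeatedly fit a weak learner
# (here a depth-1 stump) to the current residuals, then add a shrunken
# copy of it to the ensemble so the model "learns slowly".

def fit_stump(x, r):
    """Find the single split on x that best reduces squared error of residuals r."""
    best = None
    for s in sorted(set(x)):
        left = [ri for xi, ri in zip(x, r) if xi <= s]
        right = [ri for xi, ri in zip(x, r) if xi > s]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        sse = sum((ri - lm) ** 2 for ri in left) + sum((ri - rm) ** 2 for ri in right)
        if best is None or sse < best[0]:
            best = (sse, s, lm, rm)
    _, s, lm, rm = best
    return lambda xi: lm if xi <= s else rm

def boost(x, y, n_trees=50, shrinkage=0.1):
    stumps = []
    residuals = list(y)          # start with the raw response as residuals
    for _ in range(n_trees):
        stump = fit_stump(x, residuals)
        stumps.append(stump)
        # update residuals with a shrunken version of the new tree
        residuals = [ri - shrinkage * stump(xi) for xi, ri in zip(x, residuals)]
    return lambda xi: shrinkage * sum(st(xi) for st in stumps)

x = [1, 2, 3, 4, 5, 6]
y = [1.0, 1.2, 0.9, 3.1, 3.0, 2.9]   # toy step-like target
model = boost(x, y)
```

The shrinkage parameter slows the learning down: each tree only partially corrects the residuals, which is why boosting typically needs many small trees.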
[R] Tree-Based Methods : Classification Decision Tree 1. What is Classification Decision Tree? Predict a qualitative response rather than a quantitative one. Predict that each observation belongs to the most commonly occurring class. Use recursive binary splitting to grow a classification tree. Use the classification error rate (misclassification rate) as the evaluation metric. Splitting metrics The classification error rate : \(Error = 1 - max_{k}(\hat{p}..
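The excerpt's formula is cut off, but in this setting the classification error rate of a node is the fraction of its training observations that do not belong to the node's most common class. A minimal sketch in Python (the post itself uses R; the toy class counts here are illustrative):

```python
from collections import Counter

def classification_error_rate(labels):
    """Error = 1 - max_k(p_hat_k): the share of observations in a node
    that do not belong to the node's most commonly occurring class."""
    counts = Counter(labels)
    return 1 - max(counts.values()) / len(labels)

# Toy node with 6 "yes" and 2 "no" observations (illustrative data)
node = ["yes"] * 6 + ["no"] * 2
print(classification_error_rate(node))  # 1 - 6/8 = 0.25
```

A pure node (all one class) scores 0, and the metric grows as the node becomes more mixed, which is why it can serve as a splitting or pruning criterion.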
[R] Tree-Based Methods : Regression Decision Tree 1. Regression Decision Tree 1.1 [Ex] Finding the optimal value of \(\alpha\) using CV # Import library and dataset library(tree) data(Hitters) # Training models miss
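The \(\alpha\) being tuned here is the cost-complexity penalty: for each candidate subtree \(T\), one minimizes \(RSS(T) + \alpha|T|\), where \(|T|\) is the number of terminal nodes, and cross-validation picks the \(\alpha\) that generalizes best. A tiny Python sketch of the criterion itself (the candidate subtree sizes and RSS values below are made-up numbers, not results from the Hitters data):

```python
# Candidate subtrees as (number_of_leaves, training_RSS) pairs.
# Larger trees fit training data better but are penalized by alpha * |T|.
candidates = [(1, 120.0), (2, 80.0), (3, 62.0), (5, 55.0), (8, 52.0)]

def best_subtree(alpha, candidates):
    """Return the (leaves, RSS) pair minimizing RSS + alpha * leaves."""
    return min(candidates, key=lambda t: t[1] + alpha * t[0])

print(best_subtree(0.0, candidates))   # no penalty: the largest tree wins
print(best_subtree(10.0, candidates))  # a heavier penalty favors a smaller tree
```

In the R workflow, `cv.tree()` effectively evaluates this trade-off with cross-validated error instead of training RSS.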
[R] Tree-Based Methods : Decision Tree 1. Tree-Based Methods Tree-based methods for regression and classification involve stratifying or segmenting the predictor space into a number of simple regions. The set of splitting rules used to segment the predictor space can be summarized in a tree. Tree-based methods are simple and useful for interpretation. Bagging and Random Forests grow multiple trees, which are combined to yield a single ..
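The idea that a tree is just a set of splitting rules partitioning the predictor space can be made concrete with a hand-written tree. The sketch below is in Python (the posts use R), and the split points and leaf predictions are illustrative values in the style of ISLR's Hitters example (predicting log salary from Years and Hits), not fitted output.

```python
# A fitted tree as nested splitting rules: each internal node tests one
# predictor against a threshold; each leaf holds a constant prediction
# (the mean response of the training observations in that region).
tree = {
    "split": ("Years", 4.5),
    "left": {"pred": 5.11},                  # Years < 4.5
    "right": {
        "split": ("Hits", 117.5),
        "left": {"pred": 6.00},              # Years >= 4.5, Hits < 117.5
        "right": {"pred": 6.74},             # Years >= 4.5, Hits >= 117.5
    },
}

def predict(node, x):
    """Walk the splitting rules from the root to a leaf region."""
    while "pred" not in node:
        feat, thresh = node["split"]
        node = node["left"] if x[feat] < thresh else node["right"]
    return node["pred"]

print(predict(tree, {"Years": 3, "Hits": 100}))   # 5.11
print(predict(tree, {"Years": 6, "Hits": 150}))   # 6.74
```

Bagging and random forests then average the predictions of many such trees, trading some of this single-tree interpretability for accuracy.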