1. Bias vs. Variance
Let's review the underfitting and overfitting problems. Underfitting happens when the model uses only low-degree polynomial terms, so it is too simple to capture the structure of the data. Overfitting happens when the model uses too many high-degree polynomial terms, so it fits the training data too closely and generalizes poorly.

So, when we plot the training error and the cross-validation error against the polynomial degree d, the training error J_train keeps decreasing as d grows, while the cross-validation error J_cv first decreases and then increases again. On the low-d side both J_train and J_cv are high; on the high-d side J_train is low but J_cv is much higher than J_train.
So we can divide the cases into high bias (underfitting) and high variance (overfitting).
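The post itself does not include code, but a minimal sketch of this diagnostic might look like the following. The synthetic data, the degree range, and the scikit-learn estimators are my own illustrative assumptions, not part of the original notes:

```python
# Sketch: compare training vs. cross-validation error across polynomial degrees
# to diagnose high bias (both errors high) vs. high variance (large gap).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(100, 1))                      # illustrative data
y = np.sin(X).ravel() + rng.normal(scale=0.3, size=100)

X_train, X_cv, y_train, y_cv = train_test_split(X, y, test_size=0.4, random_state=0)

for degree in range(1, 11):
    model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
    model.fit(X_train, y_train)
    j_train = mean_squared_error(y_train, model.predict(X_train))
    j_cv = mean_squared_error(y_cv, model.predict(X_cv))
    # High bias: both errors high.  High variance: j_train low, j_cv much higher.
    print(f"degree={degree:2d}  J_train={j_train:.3f}  J_cv={j_cv:.3f}")
```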
2. Regularization
To balance bias and variance, we use the regularization parameter lambda. Let's set a list of candidate values (for example lambda = 0, 0.01, 0.02, 0.04, ..., roughly doubling up to about 10), train a model for each value, evaluate each model on the cross-validation set, and pick the lambda with the lowest cross-validation error. A large lambda shrinks the parameters too much and causes high bias (underfitting), while a very small lambda barely constrains the model and leaves high variance (overfitting).
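A minimal sketch of this selection loop is shown below, assuming Ridge regression as the regularized model and reusing the same synthetic-data idea as above; the specific lambda list and estimator are illustrative assumptions:

```python
# Sketch: pick the regularization strength with the lowest cross-validation error.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(100, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.3, size=100)
X_train, X_cv, y_train, y_cv = train_test_split(X, y, test_size=0.4, random_state=0)

lambdas = [0.01, 0.02, 0.04, 0.08, 0.16, 0.32, 0.64, 1.28, 2.56, 5.12, 10.24]
best_lambda, best_cv = None, float("inf")
for lam in lambdas:
    # scikit-learn's `alpha` plays the role of lambda here
    model = make_pipeline(PolynomialFeatures(degree=8), Ridge(alpha=lam))
    model.fit(X_train, y_train)
    j_cv = mean_squared_error(y_cv, model.predict(X_cv))
    if j_cv < best_cv:
        best_lambda, best_cv = lam, j_cv
print(f"best lambda = {best_lambda}, J_cv = {best_cv:.3f}")
```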
3. Learning Curves
Learning curves plot the training error and the cross-validation error as functions of the training set size m. In a high-bias model, a small training set size makes J_train low and J_cv high; as m grows, the two errors converge to a similarly high value, so collecting more training data does not help much.
In a high-variance model, a small training set size also makes J_train low and J_cv high, but as m grows, J_train rises slowly while J_cv keeps decreasing and a large gap remains between them, so collecting more training data is likely to help.
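As a sketch of how such curves can be produced, the snippet below uses scikit-learn's learning_curve utility on synthetic data with a deliberately simple (high-bias) model; the data, model, and parameters are assumptions for illustration only:

```python
# Sketch: compute a learning curve (error vs. training set size).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import learning_curve

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + rng.normal(scale=0.3, size=200)

sizes, train_scores, cv_scores = learning_curve(
    make_pipeline(PolynomialFeatures(degree=1), LinearRegression()),  # high-bias model
    X, y, cv=5,
    train_sizes=np.linspace(0.1, 1.0, 8),
    scoring="neg_mean_squared_error",
)
for m, tr, cv in zip(sizes, -train_scores.mean(axis=1), -cv_scores.mean(axis=1)):
    # High bias: J_train and J_cv converge to a similarly high error as m grows.
    # High variance: a persistent gap would remain between J_train and J_cv.
    print(f"m={m:3d}  J_train={tr:.3f}  J_cv={cv:.3f}")
```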
4. Summary
After building a machine learning algorithm, we can debug it with the options below; each option mainly addresses either high variance or high bias:
- Get more training examples (fixes high variance)
- Try smaller sets of features (fixes high variance)
- Try getting additional features (fixes high bias)
- Try adding polynomial features (fixes high bias)
- Try decreasing lambda (fixes high bias)
- Try increasing lambda (fixes high variance)