1. Regularization of Linear Regression
Because we don't know which parameters $\theta_j$ cause overfitting, we shrink all of them by penalizing their magnitude.
To fix the overfitting problem, we need to change both the cost function and the gradient-descent update. The regularized cost function adds a penalty term:

$$J(\theta) = \frac{1}{2m}\left[\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)^2 + \lambda\sum_{j=1}^{n}\theta_j^2\right]$$

and the gradient-descent update becomes (by convention, the intercept $\theta_0$ is not penalized):

$$\theta_j := \theta_j\left(1 - \alpha\frac{\lambda}{m}\right) - \alpha\frac{1}{m}\sum_{i=1}^{m}\left(h_\theta(x^{(i)}) - y^{(i)}\right)x_j^{(i)}$$

Because $1 - \alpha\frac{\lambda}{m} < 1$, every iteration shrinks $\theta_j$ a little before applying the usual gradient correction.
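The update above can be sketched in code. This is a minimal NumPy version, assuming `X` already carries a leading column of ones for the intercept; the function name and hyperparameter defaults are illustrative, not from the original post.

```python
import numpy as np

def gradient_descent_step(theta, X, y, alpha=0.01, lam=1.0):
    """One regularized gradient-descent step for linear regression.

    Assumes X has a leading column of ones; theta[0] (the intercept)
    is conventionally not regularized, so its penalty is zeroed out.
    """
    m = len(y)
    error = X @ theta - y              # h_theta(x) - y for every example
    grad = (X.T @ error) / m           # unregularized gradient
    reg = (lam / m) * theta            # shrinkage term lambda/m * theta_j
    reg[0] = 0.0                       # do not shrink the intercept
    return theta - alpha * (grad + reg)
```

With `lam=0` this reduces to ordinary gradient descent; larger `lam` pulls all non-intercept parameters toward zero.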
In the normal equation, the formula is the same as the original, except that we add another term inside the parentheses:

$$\theta = \left(X^TX + \lambda L\right)^{-1}X^Ty, \qquad L = \begin{bmatrix} 0 & & & \\ & 1 & & \\ & & \ddots & \\ & & & 1 \end{bmatrix}$$

This one change addresses both problems: it regularizes against overfitting, and for $\lambda > 0$ the matrix $X^TX + \lambda L$ becomes invertible even when $X^TX$ itself is singular (for example, when $m \le n$ or features are redundant).
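A short sketch of the regularized normal equation, with a deliberately singular $X^TX$ (a duplicated feature column) to show the non-invertibility fix; the function name is illustrative.

```python
import numpy as np

def regularized_normal_equation(X, y, lam=1.0):
    """Closed-form theta = (X^T X + lambda * L)^(-1) X^T y, where L is
    the identity matrix with its top-left entry zeroed so the intercept
    is not penalized. For lambda > 0 the regularized matrix is
    invertible even when X^T X alone is singular."""
    n = X.shape[1]
    L = np.eye(n)
    L[0, 0] = 0.0                      # leave theta_0 unpenalized
    return np.linalg.solve(X.T @ X + lam * L, X.T @ y)
```

Using `np.linalg.solve` instead of explicitly inverting the matrix is both faster and numerically safer.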
2. Regularization of Logistic Regression
We can regularize logistic regression in a similar way to linear regression, by adding a penalty term to the end of the cost function:

$$J(\theta) = -\frac{1}{m}\sum_{i=1}^{m}\left[y^{(i)}\log h_\theta(x^{(i)}) + \left(1-y^{(i)}\right)\log\left(1 - h_\theta(x^{(i)})\right)\right] + \frac{\lambda}{2m}\sum_{j=1}^{n}\theta_j^2$$

The gradient-descent update for regularized logistic regression looks identical to that of linear regression; the only difference is that $h_\theta(x) = \frac{1}{1 + e^{-\theta^Tx}}$ instead of $\theta^Tx$.
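To make the "identical update, different hypothesis" point concrete, here is a minimal sketch of one regularized step for logistic regression; names and hyperparameters are illustrative.

```python
import numpy as np

def sigmoid(z):
    """Logistic function 1 / (1 + e^(-z))."""
    return 1.0 / (1.0 + np.exp(-z))

def logistic_gradient_step(theta, X, y, alpha=0.1, lam=1.0):
    """One regularized gradient step for logistic regression: the same
    form as the linear-regression update, but h_theta is the sigmoid of
    the linear score rather than the score itself."""
    m = len(y)
    error = sigmoid(X @ theta) - y     # h_theta(x) - y
    grad = (X.T @ error) / m
    reg = (lam / m) * theta
    reg[0] = 0.0                       # intercept is not regularized
    return theta - alpha * (grad + reg)
```

Without the penalty, perfectly separable data would drive the weights toward infinity; the regularization term keeps them bounded.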