[Theorem] Regularization

1. Regularization of Linear Regression

Because we don't know which parameters \(\theta_j\) cause overfitting, we shrink all of them by adding a penalty on their magnitudes to the cost function:

$$ J(\theta)=\frac{1}{2m}\left[\sum_{i=1}^{m}\left(h_{\theta}(x^{(i)})-y^{(i)}\right)^2+\lambda\sum_{j=1}^{n}\theta_j^2\right] $$

\(\lambda\) is called the regularization parameter, and it controls a trade-off between two different goals. The first goal is that we would like to fit the training data well (the first term); the second is that we would like to keep the parameters small so the hypothesis stays simple and avoids overfitting (the second term).
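The cost above can be sketched in NumPy. This is a minimal illustration, not a library API: the name `regularized_cost` is made up here, and by convention the bias term \(\theta_0\) is left out of the penalty.

```python
import numpy as np

def regularized_cost(theta, X, y, lam):
    """Regularized linear-regression cost J(theta).

    X   : (m, n+1) design matrix with a leading column of ones
    y   : (m,) target vector
    lam : regularization parameter (lambda)
    theta[0] (the bias) is conventionally not penalized.
    """
    m = len(y)
    residuals = X @ theta - y                 # h_theta(x^(i)) - y^(i)
    fit_term = residuals @ residuals          # sum of squared errors
    penalty = lam * np.sum(theta[1:] ** 2)    # lambda * sum_j theta_j^2, j >= 1
    return (fit_term + penalty) / (2 * m)

# Tiny example: theta = [0, 1] fits y = x exactly, so with lam = 0
# the cost is zero; raising lambda makes the same theta more costly.
X = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
y = np.array([0.0, 1.0, 2.0])
theta = np.array([0.0, 1.0])
print(regularized_cost(theta, X, y, lam=0.0))   # 0.0 (perfect fit, no penalty)
print(regularized_cost(theta, X, y, lam=6.0))   # 1.0 (penalty 6*1 / (2*3))
```

As \(\lambda\) grows, the penalty term dominates and large parameter values become expensive, which pushes the fit toward a simpler hypothesis.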