Data Science/Neural Network (8)

[Tensorflow] Binary Classification
1. Binary Classification
Classification into one of two classes is a common machine learning problem. We might want to predict whether or not a customer is likely to make a purchase, whether or not a credit card transaction was fraudulent, whether deep space signals show evidence of a new planet, or whether a medical test shows evidence of a disease. These are all binary classification problems. In our raw data..

[Tensorflow] Dropout and Batch Normalization
1. Dropout
A dropout layer can help correct overfitting. Overfitting is caused by the network learning spurious patterns in the training data. To recognize these spurious patterns, a network will often rely on very specific combinations of weights, a kind of "conspiracy" of weights. Being so specific, they tend to be fragile: remove one and the conspiracy falls apart. This is the idea behind dropout. To ..

[Tensorflow] Overfitting and Underfitting
1. Interpreting the Learning Curves
We might think about the information in the training data as being of two kinds: signal and noise. The signal is the part that generalizes, the part that can help our model make predictions from new data. The noise is the part that is only true of the training data; the noise is all of the random fluctuation that comes from data in the real world or all of the in..

[Tensorflow] Stochastic Gradient Descent
1. The Loss Function
The loss function measures the disparity between the target's true value and the value the model predicts. Different problems call for different loss functions. We've been looking at regression problems, where the task is to predict some numerical value. A common loss function for regression problems is the mean absolute error, or MAE. For each prediction y_pred, MAE measures the..

[Tensorflow] Deep Neural Networks
1. Layers
Neural networks typically organize their neurons into layers.
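The layer idea above can be sketched in plain Python: a dense layer is a collection of linear units that share the same inputs, each computing a weighted sum of those inputs plus a bias. The weights, biases, and inputs below are illustrative assumptions, not values from the post:

```python
# A dense layer as a collection of linear units sharing the same inputs:
# each unit computes a weighted sum of the inputs plus its own bias.
def dense_forward(inputs, weights, biases):
    # weights: one row of input-weights per unit; biases: one bias per unit.
    return [sum(w * x for w, x in zip(unit_w, inputs)) + b
            for unit_w, b in zip(weights, biases)]

# Two units over three shared inputs (numbers made up for illustration).
out = dense_forward([1.0, 2.0, 3.0],
                    [[0.1, 0.2, 0.3], [0.0, -1.0, 1.0]],
                    [0.5, 0.0])
print(out)  # one output per unit
```

In Keras this roughly corresponds to `layers.Dense(units=2)` applied to a three-feature input, where each unit owns one row of the layer's weight matrix.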
When we collect together linear units having a common set of inputs, we get a dense layer. We could think of each layer in a neural network as performing some kind of relatively simple transformation; a stack of such layers can transform its input in more and more complex ways. In a well-trained neural network, each layer is a transformation getting us a little bit closer to a..

[Tensorflow] A Single Neuron
1. What is Deep Learning?
Some of the most impressive advances in artificial intelligence in recent years have been in the field of deep learning. Natural language translation, image recognition, and game playing are all tasks where deep learning models have neared or even exceeded human-level performance. So what is deep learning? Deep learning is an approach to machine learning characterized b..

[Theorem] Optimizing Neural Network
1. Unrolling Parameters
In neural networks, advanced optimization works differently. In logistic regression, our parameter \(\theta\) is a vector with only one column. But in a neural network, the parameters are matrices rather than a single vector, so if we want to do back propagation, we need to unroll the parameters.
2. Gradient Checking
One property of back propagation is that there are many ways to..

[Theorem] Neural Network
1. What is a Neural Network
With polynomial terms in linear regression and logistic regression, we end up with a heavy set of features for the hypothesis. For example, if we have \(50 \times 50\) pixel images, then the total number of pixels is 2500, so the total number of features for logistic regression becomes \(n = 2500 + \alpha\) (very big when applying polynomial terms). If we have too many features, we can have an overfitting problem and ..
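As a companion to the loss-function discussion in the Stochastic Gradient Descent post above, here is a minimal sketch of MAE in plain Python; the sample targets and predictions are made up for illustration:

```python
# Mean absolute error (MAE), a common regression loss: the average
# absolute difference between the true targets and the predictions.
def mean_absolute_error(y_true, y_pred):
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

y_true = [3.0, -0.5, 2.0]  # targets' true values
y_pred = [2.5, 0.0, 2.0]   # model predictions
print(mean_absolute_error(y_true, y_pred))  # mean of 0.5, 0.5, 0.0
```

In TensorFlow the equivalent built-in loss can be selected with `model.compile(loss="mae", ...)`.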