Overfitting is a problem in which a model memorizes the data it was trained on instead of learning general patterns. An overfit model doesn't work well on new data, and in the real world you encounter new data every day.
Solution
One way to avoid overfitting is to collect more data. With a large dataset, the model cannot simply memorize every example, so it is forced to find a simpler pattern that still produces correct answers.
Dropout
You can also use dropout, which drops neurons by setting their outputs to zero during training. You define the probability with which each neuron is dropped. Since a neural network learns by adjusting the weights of its neurons, randomly dropping neurons makes it harder for the network to memorize the data, so the model learns the underlying pattern instead.
You can also drop a whole block of layers, such as a residual block, or drop individual connections into a layer.
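As a minimal sketch of the neuron-dropping idea described above, here is "inverted" dropout written with NumPy. The function name, signature, and the drop probability are illustrative, not from any particular library; surviving activations are scaled by 1/(1-p) so the expected output stays the same at test time.

```python
import numpy as np

def dropout(x, p, training=True, rng=None):
    """Zero each unit with probability p; scale survivors by 1/(1-p)."""
    if not training or p == 0.0:
        return x  # dropout is disabled at inference time
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)

x = np.ones((4, 5))
out = dropout(x, p=0.5, rng=np.random.default_rng(0))
# roughly half the entries are 0, the rest are scaled up to 2.0
```

Frameworks such as PyTorch and TensorFlow provide this as a built-in layer, so in practice you would rarely write it by hand.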
Regularization
Regularization is a technique that adds a penalty to the loss of a neural network based on the magnitude of its weights (for example, the sum of squared weights in L2 regularization). The outcome is that the weights are smoothed out: they stay close to each other, with less variance among them.
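To make the loss modification concrete, here is a small sketch of an L2 penalty added to a data loss. The function name, the lambda value, and the example weights are assumptions for illustration only.

```python
import numpy as np

def l2_penalty(weights, lam):
    """lambda times the sum of squared weights across all layers."""
    return lam * sum(np.sum(w ** 2) for w in weights)

# toy weights for two layers (illustrative values)
weights = [np.array([1.0, -2.0]), np.array([[0.5, 0.5]])]
data_loss = 0.3                                  # loss from the data alone
total_loss = data_loss + l2_penalty(weights, lam=0.01)
# penalty = 0.01 * (1 + 4 + 0.25 + 0.25) = 0.055
```

Because the penalty grows with large weights, gradient descent on the total loss pushes the weights toward smaller, more uniform values.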
Normalization
Normalization is a method for rescaling values so they fall within a certain range, such as between -1 and 1. Similarly, you can normalize the outputs of a neural network layer using batch normalization or instance normalization.
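The rescaling idea can be sketched as min-max scaling into [-1, 1]. The function name and the constant-input fallback are my own choices, not a standard API.

```python
import numpy as np

def scale_to_range(x, lo=-1.0, hi=1.0):
    """Min-max scale the values of x into the range [lo, hi]."""
    x = np.asarray(x, dtype=float)
    span = x.max() - x.min()
    if span == 0:
        # all values identical: map everything to the midpoint
        return np.full_like(x, (lo + hi) / 2)
    return lo + (x - x.min()) * (hi - lo) / span

scale_to_range([0.0, 5.0, 10.0])  # → [-1.0, 0.0, 1.0]
```

Batch and instance normalization work differently: they standardize activations using the mean and variance computed over a batch or a single sample, rather than the min and max.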