You built a DL model, and while training it, you noticed that after a certain number of epochs, the accuracy is decreasing. What’s the problem and how to fix it? The answer should be around overfitting.
Machine Learning
It sounds like the model is memorizing the exact characteristics of the training set rather than learning features that generalize — this is called overfitting. The model is probably too complex for the dataset, meaning it has more layers and neurons than needed.
Depending on the situation, there are several ways to fix an overfitting model. The most common are early stopping and dropout regularization.
Early stopping is exactly what it sounds like: stop training as soon as the validation accuracy starts to drop. Dropout regularization randomly zeroes out a fraction of a layer's output nodes on each training step. Because a different subset of nodes survives each time, the remaining nodes get different weights and have to do extra work to capture the data's characteristics on their own, which prevents co-adaptation.
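To make the two techniques concrete, here is a minimal plain-Python sketch (no framework assumed; in practice you would use your library's built-in dropout layer and early-stopping callback). The `train_one_epoch` callback, the `patience` parameter, and the inverted-dropout scaling convention are illustrative assumptions, not something from the question itself:

```python
import random

def dropout(activations, rate=0.5, training=True):
    """Inverted dropout: zero each unit with probability `rate` during
    training and rescale the survivors by 1/(1-rate), so that the
    expected activation stays the same at inference time."""
    if not training or rate == 0.0:
        return list(activations)
    keep = 1.0 - rate
    return [a / keep if random.random() < keep else 0.0 for a in activations]

def train_with_early_stopping(train_one_epoch, patience=3, max_epochs=100):
    """Run `train_one_epoch(epoch)` (a hypothetical callback returning
    validation accuracy) until accuracy has not improved for `patience`
    consecutive epochs, then stop and report the best epoch."""
    best_acc, best_epoch = float("-inf"), 0
    for epoch in range(1, max_epochs + 1):
        val_acc = train_one_epoch(epoch)
        if val_acc > best_acc:
            best_acc, best_epoch = val_acc, epoch       # new best: keep going
        elif epoch - best_epoch >= patience:
            break                                       # accuracy stalled: stop early
    return best_epoch, best_acc

# Example: validation accuracy rises, then degrades — training stops
# shortly after the peak instead of running all 100 epochs.
history = [0.60, 0.70, 0.80, 0.79, 0.78, 0.77, 0.50]
best_epoch, best_acc = train_with_early_stopping(
    lambda e: history[e - 1], patience=3, max_epochs=len(history))
```

In a real framework these correspond to a dropout layer placed between hidden layers and an early-stopping callback that monitors the validation metric.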
Sanisha Maharjan
Jan 11, 2022