In our earlier post, we saw how bias and variance cause a model to overfit or underfit.
Now let's dig deeper and see how we can reduce overfitting.
Several common techniques for avoiding overfitting in machine learning are listed below:
- L1 regularization (Lasso)
- L2 regularization (Ridge)
- Reducing the number of features
- Cross-validation (k-fold cross-validation)
- Batch normalization
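To make the first two items in the list concrete, here is a minimal sketch (assuming scikit-learn is installed; the dataset is synthetic and the alpha values are illustrative, not tuned) that compares an unregularized linear model against L1 (Lasso) and L2 (Ridge) penalties, scored with k-fold cross-validation:

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression, Lasso, Ridge
from sklearn.model_selection import cross_val_score

# Synthetic data: many features but few informative ones -> easy to overfit.
X, y = make_regression(n_samples=100, n_features=50, n_informative=5,
                       noise=10.0, random_state=0)

models = {
    "no regularization": LinearRegression(),
    "L1 (Lasso)":        Lasso(alpha=1.0),   # alpha chosen for illustration
    "L2 (Ridge)":        Ridge(alpha=1.0),
}

for name, model in models.items():
    # 5-fold cross-validation gives a less optimistic estimate of
    # generalization than a single train/test split.
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: mean R^2 across folds = {scores.mean():.3f}")
```

On data like this, the regularized models typically score higher across folds than the plain linear model, because the penalty discourages the model from fitting noise in the uninformative features.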
Here’s an example that walks through the concepts of overfitting and underfitting: https://analyticseducator.com/Blog/overfit-vs-underfit.html
The following articles also show ways to handle and reduce overfitting: