Overfitting and Underfitting of a Model
There is a good NPTEL video for understanding this topic.
Underfitting means that the complexity (e.g. polynomial degree) of the function we fit during training is much lower than the complexity of the true function.
Overfitting means that the complexity of the function we fit is much higher than that of the true function; the error over the training examples becomes very low, but the model fails to generalize.
Bias
- the difference between the actual values and the values predicted by the model; a model that is too simple has high bias.
Variance
- the spread of the predicted functions around their average predicted function, when the same model is trained on several different training sets.
- or
- variance is the error that comes from sensitivity to small fluctuations in the training set.
Underfitting
Assume we have our training data as shown in the figure, split into chunks (say 15 chunks of 25 training examples each), and we fit a simple function (a straight line) over each chunk.
- We are fitting lines over our data; the actual function is shown in black. Since our function is too simple, the difference between the actual and predicted values is large (high bias). Hence the model is said to be underfitting the data.
Note that its variance is low: the fitted lines stay close to each other across the different chunks.
Overfitting
Assume we have the same training data, split into chunks (say 3 chunks of 25 training examples each), and we fit a complex function (say a polynomial of degree 25) over each chunk.
- We are fitting this function over our data; the actual function is shown in black. Since our function is very flexible, it closely follows the training points (low bias).
The model is said to be overfitting the data because it tries to memorize the training data rather than learning the underlying pattern.
Note that its variance (the values of the fitted function for different training sets) is high here: the fitted curves differ wildly across the training chunks.
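The opposite extreme can be sketched the same way: give the polynomial enough degrees of freedom to pass through every training point (here 10 points and degree 9, chosen for illustration), so the training error collapses to roughly zero while the error on fresh points from the same sine target stays large:

```python
import numpy as np

rng = np.random.default_rng(2)

def f(x):
    return np.sin(2 * np.pi * x)     # assumed true function

# 10 training points, degree-9 polynomial: enough parameters to
# interpolate every point, i.e. memorize the training set.
x_train = np.sort(rng.uniform(0, 1, 10))
y_train = f(x_train) + rng.normal(0, 0.2, 10)
x_test = rng.uniform(0, 1, 100)
y_test = f(x_test) + rng.normal(0, 0.2, 100)

coefs = np.polyfit(x_train, y_train, 9)
train_mse = np.mean((np.polyval(coefs, x_train) - y_train) ** 2)
test_mse = np.mean((np.polyval(coefs, x_test) - y_test) ** 2)
print(f"train MSE: {train_mse:.6f}   test MSE: {test_mse:.3f}")
```

The near-zero training error together with a much larger test error is exactly the memorization behaviour described above.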
