#lecture2 There are colloquial names for the situations where the loss is high for different reasons. By the [[Bias-Variance Decomposition]], the loss can be high either because the variance is large or because the bias is large. Depending on which of these terms dominates, we call the model overfit or underfit.

|          | Overfitting | Underfitting |
| -------- | ----------- | ------------ |
| Bias     | Low         | High         |
| Variance | High        | Low          |

Informally, overfit models chase the training points, which is what causes their high variance (i.e. their predictions change a lot when the training points change). Underfit models don't react enough to the training data (i.e. they are too inflexible to fit it), which causes their high bias. In the [[Nearby Neighbour Averaging - Cannonball Example]], overfitting and underfitting correspond to a window that is too small and a window that is too large, respectively.
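
A minimal sketch of the window-size effect, assuming the cannonball data is a noisy parabolic trajectory (the data-generating function and the `window_average` helper are illustrative stand-ins, not the note's actual example):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the cannonball data: a noisy parabolic trajectory.
x_train = np.sort(rng.uniform(0, 1, 40))
y_train = 4 * x_train * (1 - x_train) + rng.normal(0, 0.1, x_train.size)

def window_average(x_query, x_train, y_train, width):
    """Predict by averaging training targets within +/- width of each query point."""
    preds = []
    for x in x_query:
        mask = np.abs(x_train - x) <= width
        if mask.any():
            preds.append(y_train[mask].mean())
        else:
            # Empty window: fall back to the single nearest neighbour.
            preds.append(y_train[np.argmin(np.abs(x_train - x))])
    return np.array(preds)

x_query = np.linspace(0, 1, 200)
# Tiny window: chases individual noisy points -> high variance (overfitting).
y_overfit = window_average(x_query, x_train, y_train, width=0.02)
# Huge window: averages nearly everything -> almost flat, high bias (underfitting).
y_underfit = window_average(x_query, x_train, y_train, width=0.5)
```

Re-drawing `y_train` with a new seed moves `y_overfit` around substantially while `y_underfit` barely changes, which is the variance/bias contrast in the table above.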