The coefficient of determination ($R^2$) is the proportion of observed variation in $y$ explained by the model, equal to one minus the ratio of the [[residual sum of squares|RSS]] to the [[total sum of squares|TSS]]:
$R^2 = 1 - \frac{RSS}{TSS}$
Note that $0 \le R^2 \le 1$ for a least-squares fit that includes an intercept.
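A minimal sketch of the calculation, assuming NumPy (the `r_squared` helper and the toy data are illustrative, not from any particular library):

```python
import numpy as np

def r_squared(y, y_hat):
    """Coefficient of determination: 1 - RSS/TSS."""
    rss = np.sum((y - y_hat) ** 2)        # residual sum of squares
    tss = np.sum((y - np.mean(y)) ** 2)   # total sum of squares
    return 1 - rss / tss

# toy example: fit a line by least squares and score it
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2 * x + 1 + rng.normal(scale=2, size=x.size)
slope, intercept = np.polyfit(x, y, deg=1)
print(r_squared(y, slope * x + intercept))
```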
Warnings about $R^2$:
- $R^2$ can be close to $1$ even when the model is the wrong fit for the data
- $R^2$ can be close to 0 even when the model is the correct fit for the data
- $R^2$ should not be used to compare models with different numbers of predictors. Adding a predictor to a model never decreases $R^2$ (and typically increases it), even if the predictor is pure noise. Use [[Adjusted R-squared]] instead; see the sketch after this list.
- $R^2$ says nothing about the causal relationship between the predictors and the response (neither does linear regression!).
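A sketch, assuming NumPy, of why the adjusted statistic helps: it penalizes the predictor count via $\bar{R}^2 = 1 - (1 - R^2)\frac{n-1}{n-p-1}$, where $n$ is the number of observations and $p$ the number of predictors. The `ols_r2` and `adjusted_r_squared` helpers and the noise predictor below are illustrative, not a particular library's API.

```python
import numpy as np

def adjusted_r_squared(r2, n, p):
    """Adjusted R^2 for n observations and p predictors (excluding the intercept)."""
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

def ols_r2(X, y):
    """Fit OLS with an intercept and return plain R^2."""
    X = np.column_stack([np.ones(len(y)), X])       # prepend intercept column
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    tss = np.sum((y - y.mean()) ** 2)
    return 1 - rss / tss

# adding a pure-noise predictor: plain R^2 never drops, adjusted R^2 can
rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
noise_col = rng.normal(size=n)                      # irrelevant predictor
y = 3 * x1 + rng.normal(size=n)

r2_small = ols_r2(x1.reshape(-1, 1), y)
r2_big = ols_r2(np.column_stack([x1, noise_col]), y)
print(r2_small, adjusted_r_squared(r2_small, n, 1))
print(r2_big, adjusted_r_squared(r2_big, n, 2))     # R^2 up, adjusted R^2 may go down
```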