Least Absolute Shrinkage and Selection Operator (LASSO) reduces overfitting in a linear model by introducing a [[regularization]] term to the regression equation, similar to [[ridge regression]]. The regularization (or penalty) term is the sum of the absolute values of the coefficients, multiplied by a tuning parameter $\lambda$; this is known as L1 regularization. The penalty shrinks the coefficient estimates towards zero and can force some of them to become exactly zero. As a result, coefficients are more likely to zero out in lasso regression than in ridge regression, which makes lasso useful for [[feature selection]] and yields a simpler, more interpretable model.

The tuning parameter $\lambda$ controls the amount of shrinkage: a larger $\lambda$ produces more shrinkage, and more coefficients are set to zero. The optimal value of $\lambda$ can be determined using cross-validation.

Lasso regression has several advantages over ordinary linear regression, such as reducing overfitting and improving the interpretability of the model. It also has limitations: when predictors are highly correlated, lasso tends to arbitrarily keep one and drop the others, and when the number of variables is much larger than the number of observations it can select at most as many variables as there are observations.

In [[R]], lasso regression can be fit with the `glmnet` package. Through `tidymodels`, the `mixture` argument of `linear_reg()` specifies the type of regularization: set `mixture = 0` for ridge regression, `mixture = 1` for [[lasso regression]], and something in between to use both L2 and L1 regularization (elastic net). When calling `glmnet` directly, the equivalent argument is `alpha`. The below code requires `tidymodels`.

```R
lasso_spec <- linear_reg(mixture = 1, penalty = 0) %>%
  set_mode("regression") %>%
  set_engine("glmnet")

lasso_fit <- fit(lasso_spec, y ~ x, data = data)

tidy(lasso_fit)
```

For a full workflow, including tuning of the penalty parameter, see the [[ridge regression]] note.
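The penalty described above can be written out explicitly. For $n$ observations and $p$ predictors, the lasso coefficient estimates minimize the penalized residual sum of squares:

$$\hat{\beta}^{\text{lasso}} = \underset{\beta}{\arg\min} \left\{ \sum_{i=1}^{n} \left( y_i - \beta_0 - \sum_{j=1}^{p} x_{ij}\beta_j \right)^2 + \lambda \sum_{j=1}^{p} |\beta_j| \right\}$$

The only difference from [[ridge regression]] is the penalty term: lasso uses the L1 norm $\lambda \sum_j |\beta_j|$, whereas ridge uses the squared L2 norm $\lambda \sum_j \beta_j^2$. The absolute-value penalty is what allows coefficients to hit exactly zero.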
> [!Tip]- Additional Resources
> - [Data Aspirant - How LASSO regression works in machine learning](https://dataaspirant.com/lasso-regression/)
> - [Pluralsight - Linear, Lasso and Ridge Regression with R](https://www.pluralsight.com/resources/blog/guides/linear-lasso-and-ridge-regression-with-r)
> - [R for HR eBook - Chapter 54: Supervised Statistical Learning Using Lasso Regression](https://rforhr.com/lassoregression.html)
> - [datacamp - Regularization in R Tutorial: Ridge, Lasso and Elastic Net](https://www.datacamp.com/tutorial/tutorial-ridge-lasso-elastic-net)