
Using Partial Least Squares to Conduct Relative Importance Analysis in R


Partial Least Squares (PLS) is a popular method for relative importance analysis in fields where the data typically include more predictors than observations. Relative importance analysis is a general term for any technique used to estimate the importance of the predictor variables in a regression model. The output is a set of scores that allow the predictor variables to be ranked by how strongly each influences the outcome variable. There are several approaches to relative importance analysis, including Relative Weights and Shapley Regression, as described here and here. In this blog post I briefly describe how to use an alternative method, Partial Least Squares, in R. Because it effectively compresses the data before the regression, PLS is particularly useful when the number of predictor variables exceeds the number of observations. PLS is a dimension reduction technique with some similarity to principal components analysis.
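
The following is a minimal sketch of this idea using the pls package, on simulated data with more predictors than observations. The data frame name, the simulated variables, and the use of absolute standardized coefficients as rough importance scores are illustrative assumptions, not the exact approach of the original post.

library(pls)

set.seed(123)

# Simulated example: 50 predictors but only 30 observations
n <- 30; p <- 50
X <- matrix(rnorm(n * p), n, p)
y <- X %*% rnorm(p) + rnorm(n)
df <- data.frame(y = y, X)

# Fit a PLS regression with standardized predictors and cross-validation
fit <- plsr(y ~ ., data = df, validation = "CV", scale = TRUE)

# Choose the number of components suggested by cross-validation (at least 1)
ncomp <- max(1, selectNcomp(fit, method = "onesigma"))

# Use absolute coefficients from the standardized fit as rough importance
# scores, then rank the predictors by how strongly each influences the outcome
scores <- abs(coef(fit, ncomp = ncomp))[, 1, 1]
head(sort(scores, decreasing = TRUE), 10)

Other summaries, such as VIP scores, are also commonly used to rank predictors from a PLS fit; the coefficient-based ranking above is just one simple option.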