Partial least squares regression (PLS regression) is a statistical method that bears some relation to principal components regression; instead of finding hyperplanes of maximum variance between the response and independent variables, it finds a linear regression model by projecting both the predicted variables and the observable variables to a new space. Because both the X and Y data are projected to new spaces, the PLS family of methods is known as bilinear factor models. Partial least squares discriminant analysis (PLS-DA) is a variant used when Y is categorical.
PLS is used to find the fundamental relations between two matrices (X and Y), i.e., it takes a latent variable approach to modeling the covariance structures in these two spaces. A PLS model tries to find the multidimensional direction in the X space that explains the maximum multidimensional variance direction in the Y space. PLS regression is particularly suited when the matrix of predictors has more variables than observations, and when there is multicollinearity among the X values. By contrast, standard regression will fail in these cases (unless it is regularized).
Partial least squares was introduced by the Swedish statistician Herman O. A. Wold, who then developed it with his son, Svante Wold. An alternative term for PLS is projection to latent structures,[1][2] but the term partial least squares is still dominant in many areas. Although the original applications were in the social sciences, PLS regression is today most widely used in chemometrics and related areas. It is also used in bioinformatics, sensometrics, neuroscience, and anthropology.
[1] Wold, S.; Sjöström, M.; Eriksson, L. (2001). "PLS-regression: a basic tool of chemometrics". Chemometrics and Intelligent Laboratory Systems. 58 (2): 109–130. doi:10.1016/S0169-7439(01)00155-1. S2CID 11920190.
[2] Abdi, Hervé (2010). "Partial least squares regression and projection on latent structure regression (PLS Regression)". WIREs Computational Statistics. 2: 97–106. doi:10.1002/wics.51. S2CID 122685021.