Method for estimating the unknown parameters in a linear regression model
In statistics, ordinary least squares (OLS) is a type of linear least squares method for estimating the unknown parameters in a linear regression model (with fixed level-one effects of a linear function of a set of explanatory variables) by the principle of least squares: minimizing the sum of the squared differences between the observed values of the dependent variable in the dataset and the values predicted by the linear function of the independent variables.
Geometrically, this is seen as the sum of the squared distances, parallel to the axis of the dependent variable, between each data point in the set and the corresponding point on the regression surface—the smaller the differences, the better the model fits the data. The resulting estimator can be expressed by a simple formula, especially in the case of a simple linear regression, in which there is a single regressor on the right side of the regression equation.
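As a brief illustration of that formula, writing the model in the usual matrix notation y = Xβ + ε (a standard convention, not defined elsewhere in this excerpt), the OLS estimate is the minimizer of the residual sum of squares and, when XᵀX is invertible, has the closed form

\[
\hat{\boldsymbol\beta}
  = \arg\min_{\boldsymbol\beta}\,\lVert \mathbf{y} - \mathbf{X}\boldsymbol\beta \rVert^{2}
  = (\mathbf{X}^{\mathsf T}\mathbf{X})^{-1}\mathbf{X}^{\mathsf T}\mathbf{y}.
\]

In the simple linear regression case \(y_i = \alpha + \beta x_i + \varepsilon_i\) with a single regressor, this reduces to

\[
\hat\beta = \frac{\sum_i (x_i - \bar x)(y_i - \bar y)}{\sum_i (x_i - \bar x)^{2}},
\qquad
\hat\alpha = \bar y - \hat\beta\,\bar x .
\]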
The OLS estimator is consistent for the level-one fixed effects when the regressors are exogenous and there is no perfect multicollinearity (the rank condition); the associated estimate of the residual variance is consistent when the regressors have finite fourth moments,[1] and, by the Gauss–Markov theorem, the OLS estimator is optimal in the class of linear unbiased estimators when the errors are homoscedastic and serially uncorrelated. Under these conditions, OLS provides minimum-variance mean-unbiased estimation when the errors have finite variances. Under the additional assumption that the errors are normally distributed with zero mean, OLS is the maximum likelihood estimator and outperforms any non-linear unbiased estimator.
^"What is a complete list of the usual assumptions for linear regression?". Cross Validated. Retrieved 2022-09-28.