Ordinary Least Squares Estimator  

The estimator of the regression intercept and slope(s) that minimizes the sum of squared residuals.

(Stock, James H., and Mark W. Watson. 2007. Introduction to Econometrics, 2nd ed. Boston: Pearson/Addison Wesley.)

---------------------------------

The ordinary least squares estimator, commonly known as the OLS estimator, is the estimator used in the most common form of regression analysis. One reason OLS is so popular is that, under fairly general conditions (in particular, errors that have conditional mean zero, are homoskedastic, and are uncorrelated with one another), the OLS estimator is the best linear unbiased estimator of the regression coefficients conditional on the values of the regressors: among all estimators that are linear in the outcomes and unbiased, it has the smallest variance. This mathematical result is known as the "Gauss-Markov Theorem."
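
For reference, the estimator can be stated compactly in matrix notation (a standard formulation added to this entry, not part of the original text). With outcome vector y and regressor matrix X, whose first column is ones for the intercept, the OLS estimator solves the least squares problem:

    \hat{\beta}_{OLS} = \arg\min_{\beta} \, (y - X\beta)'(y - X\beta) = (X'X)^{-1} X'y

The closed form exists whenever X'X is invertible, that is, whenever no regressor is a perfect linear combination of the others.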

Usually, a major objective of OLS regression analysis is to produce an unbiased estimate of the causal relationship between two variables. Conceptually, if the OLS estimator is unbiased, this means that if one were to draw a very large number of samples of a given size, estimate the OLS parameters for each of these samples, and then average these estimates, the average would equal the true parameter values.
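
This repeated-sampling idea is easy to check numerically. The short simulation below is an illustrative sketch added to this entry (the data generating process, parameter values, and sample sizes are arbitrary assumptions): it draws many samples from a known model, computes the OLS slope in each, and compares the average estimate with the true slope.

    import numpy as np

    rng = np.random.default_rng(0)

    true_intercept, true_slope = 2.0, 0.5  # assumed data generating process
    n_samples, n_obs = 10_000, 50          # number of samples, sample size

    slope_estimates = np.empty(n_samples)
    for s in range(n_samples):
        x = rng.normal(size=n_obs)                                    # regressor
        y = true_intercept + true_slope * x + rng.normal(size=n_obs)  # outcome
        # OLS slope in a bivariate regression: cov(x, y) / var(x)
        x_dev = x - x.mean()
        slope_estimates[s] = (x_dev @ (y - y.mean())) / (x_dev @ x_dev)

    print(f"true slope: {true_slope}, average estimate: {slope_estimates.mean():.4f}")

With 10,000 samples the average estimate lands very close to the true slope of 0.5, which is what unbiasedness predicts; any single sample's estimate can still be well off the mark.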

Generally speaking, we can say that an OLS estimator will be unbiased if the following two conditions are met:

1) The linear regression model of the outcome variable is consistent with the underlying data generating process;

2) The linear regression model does not include lagged values of the outcome variable as a covariate (the simulation sketch after this list shows the bias that arises when it does).
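
To see why the second condition matters, the following sketch (added to this entry; the AR(1) model and all parameter values are assumptions chosen purely for illustration) regresses an outcome on its own lagged value and averages the OLS estimates across many small samples.

    import numpy as np

    rng = np.random.default_rng(1)

    true_coef = 0.5                # coefficient on the lagged outcome
    n_samples, n_obs = 10_000, 20  # small samples make the bias visible

    estimates = np.empty(n_samples)
    for s in range(n_samples):
        y = np.zeros(n_obs)
        for t in range(1, n_obs):
            y[t] = true_coef * y[t - 1] + rng.normal()  # AR(1) process
        y_lag, y_now = y[:-1], y[1:]
        # OLS slope in a regression of y_t on y_{t-1} (intercept included)
        x_dev = y_lag - y_lag.mean()
        estimates[s] = (x_dev @ (y_now - y_now.mean())) / (x_dev @ x_dev)

    print(f"true coefficient: {true_coef}, average OLS estimate: {estimates.mean():.3f}")

The average estimate comes out noticeably below the true coefficient: because the lagged outcome is correlated with past error terms, OLS is biased here in finite samples, although the bias does shrink as the sample size grows.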

(Paul Grootendorst, University of Toronto, PPG2010)

     
