(Ebenge Usip's Glossary of Statistical Terms)
A statistical technique for measuring and quantifying the relationship among variables. One variable is the Dependent Variable (DV) while the others are the Independent Variables (IVs). The analysis is called Simple Regression if there is only one IV in the model; it is called Multiple Regression if there are two or more IVs in the model. A regression model, whether in the Simple or Multiple form, can be used for prediction as well as for testing existing economic theories, among other purposes.
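As a minimal sketch of the multiple-regression case described above, the following fits a model with one DV and two IVs by least squares. The data here are synthetic and purely illustrative (the true coefficients 2.0, 1.5, and -0.7 are assumptions made up for the example), and NumPy's `lstsq` is used as one of several possible estimation routines:

```python
import numpy as np

# Hypothetical data: y is the DV; x1 and x2 are the IVs (multiple regression).
rng = np.random.default_rng(0)
x1 = rng.uniform(0.0, 10.0, size=50)
x2 = rng.uniform(0.0, 10.0, size=50)
y = 2.0 + 1.5 * x1 - 0.7 * x2 + rng.normal(0.0, 0.5, size=50)

# Design matrix with a leading column of ones for the intercept.
X = np.column_stack([np.ones_like(x1), x1, x2])

# Least-squares estimates of (intercept, slope on x1, slope on x2).
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Once estimated, the model can be used for prediction:
y_hat = X @ coef
```

With only one IV column in `X`, the same code would perform simple regression; the estimated coefficients should lie close to the true values used to generate the data.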
The term regression was introduced by Francis Galton (1886) in his famous paper. In this paper, he found that although there was a tendency for tall parents to have tall children and for short parents to have short children, the average height of children born to parents of a given height tended to move, or "regress," toward the average height in the population as a whole. Galton's law of universal regression, as it later came to be known, was confirmed by his friend Karl Pearson (1903), who used more than a thousand records of heights of members of family groups.
In regression analysis, the linear approximation of the conditional expectation function is referred to as the regression line. For the case of two variables, Y and X, regressing Y on X provides a linear approximation to E(Yi | Xi) by minimizing the sum of squared errors (residuals) around the line.
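The two-variable case has a well-known closed form: the slope equals the sample covariance of X and Y divided by the sample variance of X, and the intercept makes the line pass through the point of means. A small sketch, using made-up data points chosen only for illustration:

```python
# Regress Y on X by minimizing the sum of squared residuals (OLS, two-variable case).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(x)
x_bar = sum(x) / n
y_bar = sum(y) / n

# Slope: b1 = sum((Xi - mean(X)) * (Yi - mean(Y))) / sum((Xi - mean(X))^2)
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) \
     / sum((xi - x_bar) ** 2 for xi in x)

# Intercept: the fitted line passes through (mean(X), mean(Y)).
b0 = y_bar - b1 * x_bar

# Fitted values approximate E(Yi | Xi) at each observed Xi.
fitted = [b0 + b1 * xi for xi in x]
sse = sum((yi - fi) ** 2 for yi, fi in zip(y, fitted))
```

Any other choice of slope and intercept yields a larger sum of squared residuals than the fitted line, which is what "minimizing the sum of squared errors around the line" means in practice.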