Regression Analysis
Regression analysis is a mathematical measure of the average relationship between two or more variables, expressed in terms of the original units of the data.

Types of Regression:
(i) Simple regression (two variables at a time)
(ii) Multiple regression (more than two variables at a time)

Linear Regression: If the regression curve is a straight line, then there is a linear regression between the variables.
Non-linear Regression/ Curvilinear Regression: If the regression curve is not a straight line then there is a non-linear regression between the variables.
The two-variable linear regression model is

Yt = α + βXt + εt

where α is the intercept, β is the slope coefficient, and εt is the error term.

Importance of the error term εt:
(i) It captures the effect on the dependent variable of all variables not included in the model.
(ii) It captures any specification error related to the assumed linear functional form.
(iii) It captures the effects of unpredictable random components present in the dependent variable.
Example: Given ΣYt = 309 and ΣXt = 36 for n = 7 observations,

Ȳ = 309/7 = 44.1428 and X̄ = 36/7 = 5.1428.

Working with deviations yt = Yt − Ȳ and xt = Xt − X̄, the data give Σxtyt = 157.37 and Σxt² = 33.354. The least-squares estimates are

β̂ = Σxtyt / Σxt² = 157.37 / 33.354 = 4.717
α̂ = Ȳ − β̂X̄ = 44.1428 − 4.717 × 5.1428 = 19.882

so the fitted regression line is

Ŷt = 19.882 + 4.717 Xt
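The arithmetic of the worked example can be checked with a minimal script. The summary values (n = 7, ΣYt = 309, ΣXt = 36, Σxtyt = 157.37, Σxt² = 33.354) are taken from the example; small differences from the quoted 19.882 and 4.717 are rounding.

```python
# Least-squares estimates from the summary statistics of the example above.
n = 7
sum_Y, sum_X = 309, 36    # totals of the dependent and independent series
sum_xy = 157.37           # sum of cross-products of deviations, sum(x_t * y_t)
sum_x2 = 33.354           # sum of squared deviations of X, sum(x_t ** 2)

Y_bar = sum_Y / n         # 44.1428...
X_bar = sum_X / n         # 5.1428...

beta = sum_xy / sum_x2        # slope estimate
alpha = Y_bar - beta * X_bar  # intercept estimate

print(f"Y_t = {alpha:.3f} + {beta:.3f} X_t")
```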
The first-order partial correlation between Y and X, holding W constant, is

rYX.W = (rYX − rYW rXW) / √[(1 − rYW²)(1 − rXW²)]

and the second-order coefficient, holding both W and O constant, is

rYX.WO = (rYX.O − rYW.O rXW.O) / √[(1 − rYW.O²)(1 − rXW.O²)]
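As a sketch, the first- and second-order formulas can be written as small functions; the function names and the zero-order correlations used below are illustrative, not from the text.

```python
from math import sqrt

def partial_r(r_yx, r_yw, r_xw):
    """First-order partial correlation r_YX.W from zero-order coefficients."""
    return (r_yx - r_yw * r_xw) / sqrt((1 - r_yw**2) * (1 - r_xw**2))

def partial_r_2nd(r_yx, r_yw, r_xw, r_yo, r_xo, r_wo):
    """Second-order r_YX.WO: the same formula applied to first-order
    coefficients that each already hold O constant."""
    r_yx_o = partial_r(r_yx, r_yo, r_xo)
    r_yw_o = partial_r(r_yw, r_yo, r_wo)
    r_xw_o = partial_r(r_xw, r_xo, r_wo)
    return partial_r(r_yx_o, r_yw_o, r_xw_o)

# Illustrative zero-order correlations (made-up values):
r1 = partial_r(0.8, 0.5, 0.4)
r2 = partial_r_2nd(0.8, 0.5, 0.4, 0.4, 0.3, 0.2)
print(round(r1, 4), round(r2, 4))
```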
Partial Correlation
Remarks:
1. Partial correlation coefficients lie between −1 and +1.
2. They are calculated on the basis of the zero-order coefficients, i.e. the simple correlations in which no variable is kept constant.

Limitations:
1. The calculation of partial correlation coefficients presumes a linear relationship between the variables. In real situations this condition is sometimes not met.
2. The reliability of a partial correlation coefficient decreases as its order goes up; second-order coefficients are not as dependable as first-order ones. It is therefore necessary that the number of items entering the gross correlations be large.
3. It involves a great deal of computation, and its analysis is not easy.
Example: From the following data, calculate r12.3:

x1: 4 0 1 1 1 3 4 1
x2: 2 0 2 4 2 3 3 0
x3: 1 4 2 2 3 0 4 0

Solution:
ΣX1 = 16, X̄1 = 2; ΣX2 = 16, X̄2 = 2; ΣX3 = 16, X̄3 = 2
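A sketch that carries this example through in Python. The table above is a reconstruction from a garbled extraction, so the exact data are an assumption; the `corr` helper is ad hoc.

```python
from math import sqrt

def corr(a, b):
    """Zero-order (simple) correlation via deviations from the mean."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    sxy = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sxx = sum((x - ma) ** 2 for x in a)
    syy = sum((y - mb) ** 2 for y in b)
    return sxy / sqrt(sxx * syy)

# Data as reconstructed from the example (an assumption):
x1 = [4, 0, 1, 1, 1, 3, 4, 1]
x2 = [2, 0, 2, 4, 2, 3, 3, 0]
x3 = [1, 4, 2, 2, 3, 0, 4, 0]

r12, r13, r23 = corr(x1, x2), corr(x1, x3), corr(x2, x3)

# First-order partial correlation, holding x3 constant:
r12_3 = (r12 - r13 * r23) / sqrt((1 - r13**2) * (1 - r23**2))
print(round(r12_3, 4))
```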
Multiple Correlation
The fluctuations in a given series are not usually dependent upon a single factor or cause. For example, wheat yield depends not only on rainfall but also on the fertilizer used, sunshine, and so on. The association between such a series and the several variables causing its fluctuations is known as multiple correlation; it is also defined as the correlation between several variables.

Coefficient of Multiple Correlation: Let there be three variables X1, X2 and X3, with X1 the dependent variable, depending upon the independent variables X2 and X3. The multiple correlation coefficients are defined as follows:
R1.23 = multiple correlation with X1 as the dependent variable and X2, X3 as independent variables
R2.13 = multiple correlation with X2 as the dependent variable and X1, X3 as independent variables
R3.12 = multiple correlation with X3 as the dependent variable and X1, X2 as independent variables
Remarks:
1. The multiple correlation coefficient is non-negative: its value ranges between 0 and 1, and it cannot take a negative value.
2. If R1.23 = 0, then r12 = 0 and r13 = 0.
3. R1.23 ≥ r12 and R1.23 ≥ r13.
4. R1.23 is the same as R1.32.
5. (R1.23)² is the coefficient of multiple determination.

If there are two independent variables and one dependent variable, the multiple correlation coefficient is

R1.23 = √[(r12² + r13² − 2 r12 r13 r23) / (1 − r23²)]
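A minimal numerical sketch of this formula; the zero-order correlations below are illustrative values, not taken from the text.

```python
from math import sqrt

def multiple_r(r12, r13, r23):
    """Multiple correlation R1.23 of X1 on X2 and X3, from zero-order coefficients."""
    return sqrt((r12**2 + r13**2 - 2 * r12 * r13 * r23) / (1 - r23**2))

# Illustrative zero-order correlations:
R = multiple_r(0.6, 0.7, 0.5)
print(round(R, 4))

# Per the remarks, R1.23 is never smaller than the simple correlations with X1:
assert R >= 0.6 and R >= 0.7
```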
Limitation
Example
Given the following data:

X1: 3 5 6 8 12 14
X2: 16 10 7 4 3 2
X3: 90 72 54 42 30 12
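Assuming the flattened table reads as above (a reconstruction from the extraction), a short script computes the zero-order correlations and then R1.23; the `corr` helper is ad hoc.

```python
from math import sqrt

def corr(a, b):
    """Zero-order (simple) correlation via deviations from the mean."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    sxy = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sxx = sum((x - ma) ** 2 for x in a)
    syy = sum((y - mb) ** 2 for y in b)
    return sxy / sqrt(sxx * syy)

# Table as read from the flattened extraction (an assumption):
X1 = [3, 5, 6, 8, 12, 14]
X2 = [16, 10, 7, 4, 3, 2]
X3 = [90, 72, 54, 42, 30, 12]

r12, r13, r23 = corr(X1, X2), corr(X1, X3), corr(X2, X3)
R1_23 = sqrt((r12**2 + r13**2 - 2 * r12 * r13 * r23) / (1 - r23**2))
print(round(R1_23, 4))
```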
r12.3 is the correlation between variables 1 and 2 with the effect of variable 3 removed from both. To illustrate this, run separate regressions using X3 as the independent variable, with X1 and X2 in turn as the dependent variable. Next, compute the residuals from each regression...