Independent Variables


The term regression means ‘stepping back towards the average’.

Regression analysis is a mathematical measure of the average relationship between two or more variables in terms of the original units of the data.
It is a set of statistical methods used for the estimation of relationships between a dependent variable
and one or more independent variables. It can be utilized to assess the strength of the relationship
between variables and for modeling the future relationship between them. The variable whose value is
influenced or is to be predicted is called the dependent variable, whereas the variable which influences
or is used to predict that value is called the independent variable. The dependent variable is also known
as the regressed or explained variable, while the independent variable is called the regressor, predictor,
or explanatory variable. For example: within certain limits, the yield of a crop depends upon the
amount of fertilizer used. Here, yield is the dependent variable and fertilizer is the independent
variable. This general process of predicting the value of one variable on the basis of the known values
of other variables is known as regression analysis. Thus regression analysis studies the statistical
relationship between variables.
The main objective of regression analysis is to predict or estimate the value of dependent
variable corresponding to a given value of independent variables.

Regression analysis includes several variations, such as linear, multiple linear, and nonlinear. The most
common models are simple linear and multiple linear. Nonlinear regression analysis is commonly used
for more complicated data sets in which the dependent and independent variables show a nonlinear
relationship.
Regression Analysis – Linear model assumptions
Linear regression analysis is based on six fundamental assumptions:
1. The dependent and independent variables show a linear relationship.
2. The independent variable is not random.
3. The mean value of the residual (error) is zero.
4. The variance of the residual (error) is constant across all observations (homoscedasticity).
5. The value of the residual (error) is not correlated across all observations.
6. The residual (error) values follow the normal distribution.
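Several of these residual assumptions can be checked numerically once a line has been fitted. Below is a minimal sketch using NumPy on synthetic data (the data values and thresholds are illustrative assumptions, not from the text): it fits a line by least squares, then checks that the residuals average out near zero (assumption 3) and have similar spread in the two halves of the sample (assumption 4).

```python
import numpy as np

# Synthetic data from a true linear model with Gaussian noise
# (illustrative values; not from the source text).
rng = np.random.default_rng(0)
X = np.linspace(0, 10, 200)
Y = 3.0 + 2.0 * X + rng.normal(0, 1, size=X.size)

# Least-squares fit of Y = a + bX.
b, a = np.polyfit(X, Y, 1)
residuals = Y - (a + b * X)

# Assumption 3: residuals should average to (approximately) zero.
mean_ok = abs(residuals.mean()) < 0.1

# Assumption 4: residual spread should be roughly constant;
# here, compared between the first and second half of the sample.
spread_ok = abs(residuals[:100].std() - residuals[100:].std()) < 0.5

print(mean_ok, spread_ok)
```

With an intercept in the model, least squares forces the residual mean to be essentially zero by construction, so the first check mainly guards against fitting without an intercept.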

Regression Analysis – Simple linear regression


When regression analysis is used to study and measure the statistical relationship between two
variables only, the process is known as simple linear regression analysis. The simple linear model
is expressed using the following equation:
Y = a + bX + ϵ
 
Where:
 Y – Dependent variable
 X – Independent (explanatory) variable
 a – Intercept
 b – Slope
 ϵ – Residual (error)
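The intercept a and slope b can be estimated by least squares. Here is a minimal sketch with NumPy, using the crop-yield example from above; the data values are invented for illustration (fertilizer amounts X and yields Y are assumptions, not from the text).

```python
import numpy as np

# Hypothetical data: fertilizer amount (X) and crop yield (Y).
X = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
Y = np.array([2.1, 4.0, 6.2, 7.9, 10.1])

# Least-squares estimates of slope b and intercept a for Y = a + bX.
b, a = np.polyfit(X, Y, 1)

# Predicted values and residuals (the ϵ term of the model).
predicted = a + b * X
residuals = Y - predicted

print(round(a, 2), round(b, 2))
```

Once a and b are known, a yield can be predicted for any fertilizer amount within the observed range via `a + b * x_new`.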

Regression Analysis – Multiple linear regression


Multiple linear regression analysis is essentially similar to the simple linear model, with the exception
that multiple independent variables are used in the model. The mathematical representation of multiple
linear regression is:
Y = a + bX1 + cX2 + dX3 + ϵ
 
Where:
 Y – Dependent variable
 X1, X2, X3 – Independent (explanatory) variables
 a – Intercept
 b, c, d – Slopes
 ϵ – Residual (error)
Multiple linear regression follows the same conditions as the simple linear model. However, since there
are several independent variables in multiple linear analysis, there is another mandatory condition for
the model:
 Non-collinearity: Independent variables should show minimal correlation with each
other. If the independent variables are highly correlated with each other, it is difficult to
assess the true relationships between the dependent and independent variables.
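The multiple linear model can be estimated by least squares on a design matrix whose columns are a constant (for the intercept) and the predictors, and the non-collinearity condition can be inspected through the pairwise correlations of the predictors. A minimal sketch with NumPy, on synthetic data with assumed true coefficients a = 1.0, b = 2.0, c = 3.0, d = -1.5 (illustrative values, not from the text):

```python
import numpy as np

# Synthetic data: three weakly correlated predictors plus noise
# (illustrative values; not from the source text).
rng = np.random.default_rng(1)
n = 100
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
X3 = rng.normal(size=n)
Y = 1.0 + 2.0 * X1 + 3.0 * X2 - 1.5 * X3 + rng.normal(0, 0.1, size=n)

# Design matrix: column of ones for the intercept a, then X1, X2, X3.
design = np.column_stack([np.ones(n), X1, X2, X3])

# Least-squares solution for [a, b, c, d] in Y = a + bX1 + cX2 + dX3.
coef, *_ = np.linalg.lstsq(design, Y, rcond=None)
a, b, c, d = coef

# Non-collinearity check: pairwise correlations among the predictors
# (off-diagonal entries near zero indicate little collinearity).
corr = np.corrcoef(np.vstack([X1, X2, X3]))

print(np.round(coef, 1))
```

Because the predictors here are generated independently, their correlations are small and the estimated coefficients recover the assumed true values closely; with highly correlated predictors the same fit would give unstable coefficient estimates.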
