
Simple Linear Regression and Correlation Analysis
Dr Richa Chaudhary
IESMCRC

Correlation means association - more precisely it is a measure of the
extent to which two variables are related.
There are three possible results of a correlational study:
• Positive correlation
• Negative correlation
• Zero correlation.

Positive correlation is a relationship between two variables in which both
variables move in the same direction: as one variable increases, the other
increases, and as one decreases, the other decreases.
An example of positive correlation would be Income and Expenditure.

Negative correlation is a relationship between two variables in which an
increase in one variable is associated with a decrease in the other.
An example of negative correlation would be Sales and Price.

Zero correlation exists when there is no relationship between two variables.
For example, there is no relationship between the amount of tea drunk and
level of intelligence.

Degree of Correlation

Degree     Positive correlation   Negative correlation
Perfect    +1                     -1
High       0.75 to 1              -0.75 to -1
Moderate   0.5 to 0.75            -0.5 to -0.75
Low        0 to 0.5               0 to -0.5
Zero       0                      0
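The table above can be expressed as a small helper function. This is a sketch; the function name is my own, and the handling of boundary values (which bucket 0.5 and 0.75 fall into) is an assumption the slides do not specify:

```python
def degree_of_correlation(r):
    """Classify a correlation coefficient r per the degree-of-correlation table.

    Boundary handling (whether 0.5 and 0.75 belong to the higher bucket)
    is an assumption; the slides do not specify it.
    """
    if not -1 <= r <= 1:
        raise ValueError("r must lie in [-1, 1]")
    sign = "positive" if r > 0 else "negative"
    a = abs(r)
    if a == 0:
        return "zero"
    if a == 1:
        return f"perfect {sign}"
    if a >= 0.75:
        return f"high {sign}"
    if a >= 0.5:
        return f"moderate {sign}"
    return f"low {sign}"

print(degree_of_correlation(0.82))   # → high positive
print(degree_of_correlation(-0.3))   # → low negative
```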

Scatter Plots and Correlation
• A scatter plot (or scatter diagram) is used to show the
relationship between two variables
• Correlation analysis is used to measure strength of the
association (linear relationship) between two variables
• Only concerned with strength of the relationship
• No causal effect is implied

Scatter Plot Examples
[Figure: panels of scatter plots illustrating linear relationships, curvilinear
relationships, strong relationships, weak relationships, and no relationship]
Correlation Coefficient

• Correlation measures the strength of the linear association between two
  variables
• The sample correlation coefficient r is a measure of the strength of the
  linear relationship between two variables, based on sample observations
Features of r
• Unit free
• Range between -1 and 1
• The closer to -1, the stronger the negative linear
relationship
• The closer to 1, the stronger the positive linear
relationship
• The closer to 0, the weaker the linear relationship

Examples of Approximate r Values
[Figure: five scatter plots illustrating r = -1, r = -0.6, r = 0, r = +0.3,
and r = +1]
Calculating the Correlation Coefficient
Sample correlation coefficient:

r = Σ(x − x̄)(y − ȳ) / √[Σ(x − x̄)² · Σ(y − ȳ)²]

or the algebraic equivalent:

r = [nΣxy − (Σx)(Σy)] / √{[nΣx² − (Σx)²][nΣy² − (Σy)²]}

where:
r = Sample correlation coefficient
n = Sample size
x = Value of the independent variable
y = Value of the dependent variable
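Both forms give the same value of r. A minimal standard-library Python sketch of the two formulas, checked against each other on small illustrative data (the x and y values are my own, not from the slides):

```python
import math

def corr_deviation_form(x, y):
    """r = Σ(x-x̄)(y-ȳ) / sqrt(Σ(x-x̄)² · Σ(y-ȳ)²)"""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    sxx = sum((xi - xbar) ** 2 for xi in x)
    syy = sum((yi - ybar) ** 2 for yi in y)
    return sxy / math.sqrt(sxx * syy)

def corr_computational_form(x, y):
    """r = [nΣxy − ΣxΣy] / sqrt([nΣx² − (Σx)²][nΣy² − (Σy)²])"""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    sxx = sum(xi * xi for xi in x)
    syy = sum(yi * yi for yi in y)
    return (n * sxy - sx * sy) / math.sqrt((n * sxx - sx ** 2) * (n * syy - sy ** 2))

x = [1, 2, 3, 4, 5]   # illustrative data, not from the slides
y = [2, 4, 5, 4, 5]
r1 = corr_deviation_form(x, y)
r2 = corr_computational_form(x, y)
```

The two forms agree to floating-point precision; the computational form avoids computing the means first, which is why it was preferred for hand calculation.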
Introduction to
Regression Analysis
• Regression analysis is used to:
• Predict the value of a dependent variable based on the
value of at least one independent variable
• Explain the impact of changes in an independent
variable on the dependent variable
Dependent variable: the variable we wish to explain
Independent variable: the variable used to explain
the dependent variable

Simple Linear Regression Model

• Only one independent variable, x
• Relationship between x and y is described by a linear function
• Changes in y are assumed to be caused by changes in x

Types of Regression Models
Positive Linear Relationship Relationship NOT Linear

Negative Linear Relationship No Relationship

Population Linear Regression

The population regression model:

y = β0 + β1x + ε

where:
y  = dependent variable
β0 = population y-intercept
β1 = population slope coefficient
x  = independent variable
ε  = random error term (residual)

The model consists of a linear component (β0 + β1x) and a random error
component (ε).
Estimated Regression Model
The sample regression line provides an estimate of the population
regression line:

ŷi = b0 + b1x

where:
ŷi = estimated (or predicted) y value
b0 = estimate of the regression intercept
b1 = estimate of the regression slope
x  = independent variable

The individual random error terms ei have a mean of zero.
Interpretation of the Slope and the Intercept

• Intercept: b0 is the estimated average value of y when the value of x is zero
• Slope: b1 is the estimated change in the average value of y as a result of a
  one-unit change in x

Finding the Least Squares Equation

• The coefficients b0 and b1 will usually be found using computer software,
  such as Excel or Minitab
• Other regression measures will also be computed as part of computer-based
  regression analysis

Simple Linear Regression Example

• A real estate agent wishes to examine the relationship between the selling
  price of a home and its size (measured in square feet)
• A random sample of 10 houses is selected
• Dependent variable (y) = house price in $1000s
• Independent variable (x) = square feet

Sample Data for House Price Model

House Price in $1000s (y)   Square Feet (x)
245                         1400
312                         1600
279                         1700
308                         1875
199                         1100
219                         1550
405                         2350
324                         2450
319                         1425
255                         1700
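The least-squares coefficients for this sample can be reproduced with a short standard-library Python sketch using the computational formula for b1; the printed values match the Excel output shown on the following slide:

```python
# Least-squares fit of house price (y, in $1000s) on square feet (x)
x = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
y = [245, 312, 279, 308, 199, 219, 405, 324, 319, 255]

n = len(x)
sx, sy = sum(x), sum(y)
sxy = sum(xi * yi for xi, yi in zip(x, y))
sxx = sum(xi * xi for xi in x)

# b1 = [nΣxy − ΣxΣy] / [nΣx² − (Σx)²],   b0 = ȳ − b1·x̄
b1 = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
b0 = sy / n - b1 * sx / n

print(f"house price = {b0:.5f} + {b1:.5f} (square feet)")
# → house price = 98.24833 + 0.10977 (square feet)
```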

Regression Using Excel
• Data / Data Analysis / Regression

Excel Output

Regression Statistics
Multiple R          0.76211
R Square            0.58082
Adjusted R Square   0.52842
Standard Error      41.33032
Observations        10

The regression equation is:
house price = 98.24833 + 0.10977 (square feet)

ANOVA
             df   SS           MS           F         Significance F
Regression    1   18934.9348   18934.9348   11.0848   0.01039
Residual      8   13665.5652    1708.1957
Total         9   32600.5000

              Coefficients   Standard Error   t Stat    P-value   Lower 95%   Upper 95%
Intercept     98.24833       58.03348         1.69296   0.12892   -35.57720   232.07386
Square Feet   0.10977        0.03297          3.32938   0.01039     0.03374     0.18580

Graphical Presentation
• House price model: scatter plot and regression line
[Figure: scatter plot of House Price ($1000s) against Square Feet with the
fitted line; intercept = 98.248, slope = 0.10977]

house price = 98.24833 + 0.10977 (square feet)
Interpretation of the Intercept, b0

house price = 98.24833 + 0.10977 (square feet)

• b0 is the estimated average value of Y when the value of X is zero (if x = 0
  is in the range of observed x values)
• Here, no houses had 0 square feet, so b0 = 98.24833 just indicates that, for
  houses within the range of sizes observed, $98,248.33 is the portion of the
  house price not explained by square feet

Interpretation of the Slope Coefficient, b1

house price = 98.24833 + 0.10977 (square feet)

• b1 measures the estimated change in the average value of Y as a result of a
  one-unit change in X
• Here, b1 = .10977 tells us that the average value of a house increases by
  .10977($1000) = $109.77, on average, for each additional one square foot of
  size
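As a worked example, the fitted line can be used to predict the price of a hypothetical 2,000-square-foot house (an illustrative input of my own; this prediction does not appear in the slides):

```python
# Prediction from the fitted line for a hypothetical house size
b0, b1 = 98.24833, 0.10977   # coefficient estimates from the Excel output
sqft = 2000                  # illustrative input, not from the slides

predicted = b0 + b1 * sqft   # predicted price in $1000s
print(f"predicted price = ${predicted * 1000:,.2f}")
# → predicted price = $317,788.33
```

Note that predictions are only reliable for x values within the range of the observed data (here, roughly 1100 to 2450 square feet).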

Coefficient of Determination, R²

• The coefficient of determination is the portion of the total variation in the
  dependent variable that is explained by variation in the independent variable
• The coefficient of determination is also called R-squared and is denoted as R²

0 ≤ R² ≤ 1
Coefficient of Determination, R² (continued)

Coefficient of determination:

R² = SSR / SST = (sum of squares explained by regression) / (total sum of squares)

Note: In the single independent variable case, the coefficient of
determination is

R² = r²

where:
R² = Coefficient of determination
r = Simple correlation coefficient
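For the house-price sample, the identity R² = r² can be checked numerically. This standard-library Python sketch computes SSR as Σ(ŷ − ȳ)² and SST as Σ(y − ȳ)² (standard definitions the slides do not spell out):

```python
import math

x = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
y = [245, 312, 279, 308, 199, 219, 405, 324, 319, 255]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n

# least-squares fit
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
sxx = sum((xi - xbar) ** 2 for xi in x)
b1 = sxy / sxx
b0 = ybar - b1 * xbar

sst = sum((yi - ybar) ** 2 for yi in y)            # total sum of squares
ssr = sum((b0 + b1 * xi - ybar) ** 2 for xi in x)  # explained by regression
r = sxy / math.sqrt(sxx * sst)                     # sample correlation

print(round(ssr / sst, 5), round(r ** 2, 5))
# → 0.58082 0.58082
```

Both values match the R Square figure in the Excel output, and SST = 32600.5 matches the Total line of the ANOVA table.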
Examples of Approximate R² Values

R² = 1
[Figure: scatter plots with every point falling exactly on the regression line]
Perfect linear relationship between x and y: 100% of the variation in y is
explained by variation in x
Examples of Approximate R² Values (continued)

0 < R² < 1
[Figure: scatter plots with points scattered around the regression line]
Weaker linear relationship between x and y: some but not all of the variation
in y is explained by variation in x
Examples of Approximate R² Values (continued)

R² = 0
[Figure: scatter plot with no pattern; the fitted line is flat]
No linear relationship between x and y: the value of y does not depend on x
(none of the variation in y is explained by variation in x)
THANKS
