
Applied Quantitative Analysis and Practices
LECTURE#22

By
Dr. Osman Sadiq Paracha
Previous Lecture Summary
• Application in SPSS for factor analysis stages
• Interpretation of factor matrix
• Validation of factor analysis
• Factor Scores
Simple Linear Regression
Correlation vs. Regression
DCOVA
• A scatter plot can be used to show the relationship between two variables
• Correlation analysis is used to measure the strength of the association (linear relationship) between two variables
• Correlation is only concerned with the strength of the relationship
• No causal effect is implied by correlation (see the sketch below)
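As a brief illustrative sketch (assuming Python with NumPy; not part of the original lecture), the Pearson correlation coefficient can be computed for the house-price data used later in this lecture. Note that r measures only the strength and direction of the linear association; it says nothing about causation.

import numpy as np

# House-price data from the worked example later in this lecture
square_feet = np.array([1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700])
price = np.array([245, 312, 279, 308, 199, 219, 405, 324, 319, 255])   # in $1000s

# Pearson correlation: strength/direction of the linear association only
r = np.corrcoef(square_feet, price)[0, 1]
print(f"r = {r:.5f}")   # about 0.762, matching the Multiple R in the output below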
Types of Relationships
DCOVA

[Figure: example scatter plots of Y against X showing linear relationships and curvilinear relationships]
Types of Relationships
DCOVA
(continued)
[Figure: example scatter plots of Y against X showing strong relationships and weak relationships]
Types of Relationships
DCOVA
(continued)
[Figure: scatter plot of Y against X showing no relationship]
Introduction to
Regression Analysis
DCOVA
• Regression analysis is used to:
  • Predict the value of a dependent variable based on the value of at least one independent variable
  • Explain the impact of changes in an independent variable on the dependent variable
Dependent variable: the variable we wish to predict or explain
Independent variable: the variable used to predict or explain the dependent variable
Simple Linear Regression Model
DCOVA
• Only one independent variable, X
• The relationship between X and Y is described by a linear function
• Changes in Y are assumed to be related to changes in X
Simple Linear Regression Model
DCOVA

Yi = β0 + β1Xi + εi

where
  Yi = dependent variable
  β0 = population Y intercept
  β1 = population slope coefficient
  Xi = independent variable
  εi = random error term

β0 + β1Xi is the linear component; εi is the random error component.
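As an illustrative sketch (not from the lecture; the parameter values below are hypothetical), data can be simulated from this population model to see how the linear component and the random error combine:

import numpy as np

rng = np.random.default_rng(42)

# Hypothetical population parameters, chosen only for illustration
beta0, beta1, sigma = 50.0, 0.1, 40.0

Xi = rng.uniform(1000, 2500, size=100)   # independent variable
eps = rng.normal(0.0, sigma, size=100)   # random error term
Yi = beta0 + beta1 * Xi + eps            # Yi = β0 + β1·Xi + εi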
Simple Linear Regression Model (continued)
DCOVA

[Figure: the population regression line Yi = β0 + β1Xi + εi plotted in the X–Y plane, showing the intercept β0, the slope β1, the predicted value of Y for Xi on the line, and the random error εi for this Xi value as the vertical distance from the observed value of Y for Xi to the line]
Simple Linear Regression Equation (Prediction Line)
DCOVA
The simple linear regression equation provides an estimate of the population regression line:

Ŷi = b0 + b1Xi

where
  Ŷi = estimated (or predicted) Y value for observation i
  b0 = estimate of the regression intercept
  b1 = estimate of the regression slope
  Xi = value of X for observation i
The Least Squares Method

b0 and b1 are obtained by finding the values that minimize the sum of the squared differences between Yi and Ŷi:

min Σ(Yi − Ŷi)² = min Σ(Yi − (b0 + b1Xi))²
Finding the Least Squares Equation

• The coefficients b0 and b1 can be found from the following formulas:

  b1 = Σ(Xi − X̄)(Yi − Ȳ) / Σ(Xi − X̄)²

  b0 = Ȳ − b1X̄
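A minimal sketch (assuming Python with NumPy; the function name is illustrative) of how these formulas translate into code:

import numpy as np

def least_squares(x, y):
    """Return (b0, b1) for the line y_hat = b0 + b1*x using the formulas above."""
    x_bar, y_bar = x.mean(), y.mean()
    b1 = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
    b0 = y_bar - b1 * x_bar
    return b0, b1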
Interpretation of the Slope and the Intercept

• b0 is the estimated average value of Y when the value of X is zero
• b1 is the estimated change in the average value of Y as a result of a one-unit increase in X
Simple Linear Regression
Example
• A real estate agent wishes to examine the relationship between the selling price of a home and its size (measured in square feet)
• A random sample of 10 houses is selected
• Dependent variable (Y) = house price in $1000s
• Independent variable (X) = square feet
Simple Linear Regression
Example: Data
House Price in $1000s (Y)    Square Feet (X)
245                          1400
312                          1600
279                          1700
308                          1875
199                          1100
219                          1550
405                          2350
324                          2450
319                          1425
255                          1700
Simple Linear Regression
Example: Scatter Plot

[Figure: House price model scatter plot, with House Price ($1000s) on the vertical axis and Square Feet (0–3000) on the horizontal axis, for the 10 sampled houses]
Simple Linear Regression Example
Regression Statistics
  Multiple R           0.76211
  R Square             0.58082
  Adjusted R Square    0.52842
  Standard Error       41.33032
  Observations         10

The regression equation is:
  house price = 98.24833 + 0.10977 (square feet)

ANOVA
              df   SS           MS           F         Significance F
  Regression   1   18934.9348   18934.9348   11.0848   0.01039
  Residual     8   13665.5652   1708.1957
  Total        9   32600.5000

               Coefficients   Standard Error   t Stat    P-value   Lower 95%   Upper 95%
  Intercept    98.24833       58.03348         1.69296   0.12892   -35.57720   232.07386
  Square Feet  0.10977        0.03297          3.32938   0.01039   0.03374     0.18580
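As a sketch (assuming Python with NumPy; not part of the original output), the intercept and slope above can be reproduced directly from the sample data using the least squares formulas:

import numpy as np

square_feet = np.array([1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700])
price = np.array([245, 312, 279, 308, 199, 219, 405, 324, 319, 255])   # in $1000s

# Slope and intercept via the least squares formulas
x_bar, y_bar = square_feet.mean(), price.mean()
b1 = np.sum((square_feet - x_bar) * (price - y_bar)) / np.sum((square_feet - x_bar) ** 2)
b0 = y_bar - b1 * x_bar

print(f"b0 = {b0:.5f}, b1 = {b1:.5f}")   # roughly 98.24833 and 0.10977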
Simple Linear Regression Example:
Graphical Representation

House price model: Scatter Plot and Prediction Line


450
400
House Price ($1000s)

350
Slope
300
250
= 0.10977
200
150
100
50
Intercept 0
= 98.248 0 500 1000 1500 2000 2500 3000
Square Feet

house price  98.24833  0.10977 (square feet)


Simple Linear Regression Example: Interpretation of b0

house price = 98.24833 + 0.10977 (square feet)

• b0 is the estimated average value of Y when the value of X is zero (if X = 0 is in the range of observed X values)
• Because a house cannot have a square footage of 0, b0 has no practical application
Simple Linear Regression Example: Interpreting b1

house price = 98.24833 + 0.10977 (square feet)

• b1 estimates the change in the average value of Y as a result of a one-unit increase in X
• Here, b1 = 0.10977 tells us that the mean value of a house increases by 0.10977 × ($1000) = $109.77, on average, for each additional square foot of size
Simple Linear Regression
Example: Making Predictions
Predict the price for a house with 2000 square feet:

house price = 98.25 + 0.1098 (sq. ft.)
            = 98.25 + 0.1098 (2000)
            = 317.85

The predicted price for a house with 2000 square feet is 317.85 ($1000s) = $317,850
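A short usage sketch of the same calculation (variable names are illustrative; coefficients are the rounded values above):

b0, b1 = 98.25, 0.1098              # rounded coefficients from the fitted model
sq_ft = 2000
predicted_price = b0 + b1 * sq_ft   # 317.85, i.e. $317,850 since prices are in $1000s
print(predicted_price)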
Simple Linear Regression
Example: Making Predictions
• When using a regression model for prediction, only predict within the relevant range of data
• Do not try to extrapolate beyond the range of observed X values

[Figure: the house-price scatter plot with the prediction line, indicating the relevant range of observed square footage for interpolation]
Measures of Variation
• Total variation is made up of two parts:

  SST = SSR + SSE

  where SST is the Total Sum of Squares, SSR the Regression Sum of Squares, and SSE the Error Sum of Squares:

  SST = Σ(Yi − Ȳ)²      SSR = Σ(Ŷi − Ȳ)²      SSE = Σ(Yi − Ŷi)²

  Ȳ  = mean value of the dependent variable
  Yi = observed value of the dependent variable
  Ŷi = predicted value of Y for the given Xi value
Measures of Variation (continued)

• SST = total sum of squares (Total Variation)
  • Measures the variation of the Yi values around their mean Ȳ
• SSR = regression sum of squares (Explained Variation)
  • Variation attributable to the relationship between X and Y
• SSE = error sum of squares (Unexplained Variation)
  • Variation in Y attributable to factors other than X
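A hedged sketch (Python with NumPy, using the fitted coefficients from the example; results are approximate due to rounding) showing how these three sums of squares are computed and how they relate:

import numpy as np

square_feet = np.array([1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700])
price = np.array([245, 312, 279, 308, 199, 219, 405, 324, 319, 255])   # in $1000s

y_hat = 98.24833 + 0.10977 * square_feet       # predicted values from the fitted line

sst = np.sum((price - price.mean()) ** 2)      # total variation
ssr = np.sum((y_hat - price.mean()) ** 2)      # explained variation
sse = np.sum((price - y_hat) ** 2)             # unexplained variation

print(sst, ssr, sse)   # approximately 32600.5, 18934.9, 13665.6, with SST ≈ SSR + SSE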
Lecture Summary
• Simple Linear Regression
• Correlation vs. Regression
• Introduction to Simple Linear Regression
• Simple Linear Regression Model
• Least Squares Method
• Interpretation of the Model
• Measures of Variation
