
Advanced Quants Decision Making – Lecture 1 – January 8

Agenda:
 Single linear regression models
 Inference for regression (hypothesis test, and confidence interval for the regression
slope)
 ANOVA; F-statistic; R-squared
 Software outputs

Introduction
 Response variable or dependent: variable being predicted.
 Predictor variables or independent variables: variables being used to predict the value of
the dependent variable.
 Simple (single) linear regression: a regression analysis involving one independent
variable, x, and one dependent variable, y, in which the relationship between the
variables is approximated by a straight line.
 Multiple linear regression: a regression analysis involving two or more independent
variables

Correlation

r = Σ(z_x z_y) / (n − 1)

where z_x = (x − x̄) / s_x and z_y = (y − ȳ) / s_y
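The z-score formula for r can be sketched in Python. The data points below are made-up illustrative values, not from the lecture:

```python
# Sketch: Pearson correlation from z-scores, r = sum(z_x * z_y) / (n - 1).
# The x and y values are illustrative data, not from the lecture.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(x)
mean_x = sum(x) / n
mean_y = sum(y) / n

# Sample standard deviations (n - 1 in the denominator)
s_x = (sum((xi - mean_x) ** 2 for xi in x) / (n - 1)) ** 0.5
s_y = (sum((yi - mean_y) ** 2 for yi in y) / (n - 1)) ** 0.5

# Standardize each observation, then average the products of the z-scores
z_x = [(xi - mean_x) / s_x for xi in x]
z_y = [(yi - mean_y) / s_y for yi in y]
r = sum(zx * zy for zx, zy in zip(z_x, z_y)) / (n - 1)

print(round(r, 3))  # near 1 for a strong positive linear relationship
```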

Simple Linear Regression Model


 Regression Model:

o The equation that describes how y is related to x and an error term:
y = β₀ + β₁x + ε

o Parameters: the characteristics of the population, β₀ and β₁

o Random variable: the error term, ε
o The error term accounts for the variability in y that cannot be explained by the
linear relationship between x and y.
 Estimated Regression Equation
o The parameter values are usually not known and must be estimated using
sample data

 Substituting the values of the sample statistics b₀ and b₁ for β₀ and β₁ in the regression
equation and dropping the error term, we obtain the estimated regression equation for
simple linear regression: ŷ = b₀ + b₁x
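As an aside (not shown in the lecture), a quick way to obtain the estimated regression equation in Python is numpy.polyfit; the data below is illustrative:

```python
# Sketch: estimating y-hat = b0 + b1*x with numpy's least-squares polyfit.
# Illustrative data, not from the lecture.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# deg=1 fits a straight line and returns [slope, intercept]
b1, b0 = np.polyfit(x, y, deg=1)

y_hat = b0 + b1 * x  # fitted values from the estimated regression equation
print(b0, b1)
```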

Least Squares Method


 Error or Residual
o Denoted as e_i = y_i − ŷ_i
o Hence, SSE = Σ e_i² = Σ (y_i − ŷ_i)²

o We are finding the regression line that minimizes the sum of squared errors.
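To make the minimization concrete, a small sketch (with illustrative data) compares the SSE of the least-squares line against a line with a slightly different slope:

```python
# Sketch: the least-squares line minimizes SSE = sum of squared residuals.
# Illustrative data; b0 = 0.05, b1 = 1.99 are the least-squares estimates
# for this particular data set.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]

def sse(b0, b1):
    """Sum of squared errors for the line y-hat = b0 + b1*x."""
    return sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))

fitted = sse(0.05, 1.99)     # SSE at the least-squares estimates
perturbed = sse(0.05, 2.10)  # any other slope gives a larger SSE
print(fitted < perturbed)
```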

Estimating b₀ and b₁
 Straight lines can be written as ŷ = b₀ + b₁x
 We can find the slope of the least squares line using the correlation and the standard
deviations: b₁ = r (s_y / s_x)

 To find the intercept of our line, we use the means. If our line estimates
the data, then it should predict ȳ for the x-value x̄, so b₀ = ȳ − b₁x̄.
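The slope and intercept formulas above can be sketched directly in Python (illustrative data; r, s_x, and s_y computed from the sample):

```python
# Sketch: slope b1 = r * (s_y / s_x), intercept b0 = y-bar - b1 * x-bar.
# Illustrative data, not from the lecture.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]

n = len(x)
mean_x = sum(x) / n
mean_y = sum(y) / n
s_x = (sum((xi - mean_x) ** 2 for xi in x) / (n - 1)) ** 0.5
s_y = (sum((yi - mean_y) ** 2 for yi in y) / (n - 1)) ** 0.5
r = sum((xi - mean_x) * (yi - mean_y)
        for xi, yi in zip(x, y)) / ((n - 1) * s_x * s_y)

b1 = r * (s_y / s_x)       # slope from correlation and standard deviations
b0 = mean_y - b1 * mean_x  # the line passes through (x-bar, y-bar)
print(b0, b1)
```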
