Student Chapter 14 - Regression Without SE - Tagged
• Linear Regression
– Predictor variable is used to predict a case’s
score on another variable, and the prediction
equation takes the form of a straight line.
– Regression equation helps one arrive at
better decisions overall.
– Simple linear regression (only one predictor
variable) should be used only with a
statistically significant Pearson r.
• What was Sigmund Freud’s favorite
statistical procedure?
– Regression
Using a Regression Line for Prediction
• Relationship between Fahrenheit and Celsius
– Perfect correlation, r = 1.00.
– All points in the scatterplot fall on a straight line.
– Six data points are known.
– What about all the other possible Fahrenheit values?
• If an object’s temperature is measured and found to be 86° Fahrenheit, what would it be in Celsius?
[Figure: Temperature Measured in Fahrenheit and Celsius]
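Because the Fahrenheit–Celsius relationship is a perfect straight line, the prediction for 86° Fahrenheit can be checked directly with the standard conversion; a minimal sketch in Python:

    # Celsius is an exact linear function of Fahrenheit:
    # C = (F - 32) * 5/9, i.e., slope 5/9 and intercept -32 * 5/9.
    def fahrenheit_to_celsius(f):
        return (f - 32) * 5 / 9

    print(fahrenheit_to_celsius(86))  # 30.0 degrees Celsius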
Using a Regression Line for Prediction
(continued)
• Regression Line
– Best-fitting straight line for
predicting Y from X
Y′ = bX + a
The Linear Regression Equation
(continued)
• Understanding Slope
– Slope
• Tilt of the line; rise over run; how much up or down
change in Y is predicted for each 1-unit change in
X:
– If the slope is positive, the line is moving up and to the
right.
– If the slope is negative, the line is moving down and to
the right.
– If the slope is zero, the line is horizontal.
Example 1: Marital Satisfaction Study
Step 5
The Linear Regression Equation
(continued)
• Formula for Calculating Slope
b = r(SY / SX)
Example 1: The Linear Regression
Equation (continued)
• Dr. Paik’s Marital Satisfaction Study
– r = .76, SY = 0.86, SX = 11.49
– b = r(SY / SX) = .76(0.86 / 11.49) = 0.06
– Value of the slope, 0.06
• For every 1-point increase in a husband’s level of gender role flexibility, there is a predicted increase of 0.06 points in the wife’s level of marital satisfaction.
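A quick check of the slope arithmetic with the values given above; a minimal sketch in Python:

    # Slope of the regression line: b = r * (SY / SX)
    r, s_y, s_x = 0.76, 0.86, 11.49
    b = r * (s_y / s_x)
    print(round(b, 4))  # 0.0569 (about 0.06; carried on later slides as 0.0568)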
The Linear Regression Equation
(continued)
• Understanding the Y-Intercept
– Y-intercept
• Indicates where the regression line would pass
through the Y-axis
• If Y-intercept is positive, line passes through the Y-
axis above zero.
– If Y-intercept is negative, line passes through the Y-axis
below zero.
– The bigger the absolute value of the Y-intercept, the
further away from zero the regression line passes
through the Y-axis.
The Linear Regression Equation
(continued)
• Formula for the Y-Intercept
a = MY − bMX
Example 1: The Linear Regression
Equation (continued)
• Dr. Paik’s Marital Satisfaction Study
– b = 0.0568, MY = 2.00, MX = 25.00
– Y-intercept: a = MY − bMX
– Regression formula, Equation 14.1: Y′ = bX + a
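Carrying the intercept arithmetic through with the values above; a minimal sketch in Python:

    # Y-intercept: a = MY - b * MX
    b, m_y, m_x = 0.0568, 2.00, 25.00
    a = m_y - b * m_x
    print(round(a, 2))  # 0.58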
Example 1: The Linear Regression
Equation (continued)
• Predicting Y′
– Dr. Paik needs to select an X value for which to predict a Y′ score: Y′ = bX + a
• She selects a gender role flexibility score of 30 and substitutes it for X.
• A man with a gender role flexibility score of 30 is predicted to have a partner who rates her level of marital satisfaction as 2.38.
• Marital satisfaction is rated on a 4-point scale, like GPA.
• She would rate her marriage at the C+ level.
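The 2.38 figure follows from substituting X = 30 into the equation, using the rounded slope (0.06) and the intercept computed above (0.58); a quick check:

    # Prediction: Y' = bX + a for a gender role flexibility score of 30
    b, a = 0.06, 0.58
    y_prime = b * 30 + a
    print(round(y_prime, 2))  # 2.38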
Example 1: The Linear Regression
Equation (continued)
• Drawing the Regression Line
– Helps to highlight the relationship between the two variables
– Can be drawn once two points are known
Y′ = bX + a (computed once for each of the two X values)
Example 1: Regression Line
• Regression Line for Marital
Satisfaction
– Finding Y′ for the points at the two ends of the range of X scores allows a researcher to draw a regression line (see the sketch below).
– Any two points can be connected with a straight line.
– The regression line should only be used to predict Y′ for the range of X scores used to derive the regression equation.
[Figure: Regression Line for Predicting Marital Satisfaction]
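A minimal sketch of how the two endpoints for drawing the line can be found; the x_lo and x_hi values below are placeholders, since the actual range of gender role flexibility scores is not listed on this slide:

    # Compute Y' at the smallest and largest observed X so the drawn line
    # stays within the range of scores used to build the equation.
    def line_endpoints(b, a, x_lo, x_hi):
        return (x_lo, b * x_lo + a), (x_hi, b * x_hi + a)

    # Hypothetical endpoints, for illustration only
    print(line_endpoints(b=0.06, a=0.58, x_lo=10, x_hi=40))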
Example 2: Calculating Simple
Regression
• A therapist wants to know whether there is a relationship between perceived stress and the magnitude of the stress symptoms displayed. The therapist asks six clients to report their stress level on a stress questionnaire and makes note of their symptom levels.
Case    X (Stress)    Y (Symptom)
1       30            99
2       27            94
3       9             35
4       20            70
5       3             30
6       15            45
M       17.3333       62.1667
s       10.4051       29.9961
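The summary statistics in the table can be reproduced from the six cases; a minimal sketch using only the Python standard library (statistics.correlation requires Python 3.10+):

    import statistics as st

    x = [30, 27, 9, 20, 3, 15]    # stress scores
    y = [99, 94, 35, 70, 30, 45]  # symptom scores

    print(round(st.mean(x), 4), round(st.mean(y), 4))    # 17.3333 62.1667
    print(round(st.stdev(x), 4), round(st.stdev(y), 4))  # 10.4051 29.9961
    print(round(st.correlation(x, y), 4))                # 0.9757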
Example 2: Stress Study Step 5
Example 2: The Linear Regression
Equation (continued)
• The Therapist’s Stress Study
– r = .9757, SY = 29.9961, SX = 10.4051
– Value of the slope, b = r(SY / SX) = 2.81
• For every 1-point increase in perceived stress, there is a predicted increase of 2.81 points in stress symptoms.
Example 2: The Linear Regression
Equation (continued)
• The Therapist’s Stress Study
– b = 2.8128, MY = 62.1667, MX = 17.3333
– Regression formula, Equation 14.1: Y′ = bX + a
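Carrying the arithmetic through with the values above: a = MY − bMX = 62.1667 − (2.8128)(17.3333) ≈ 13.41, so the regression equation becomes Y′ = 2.81X + 13.41.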
Example 2: The Linear Regression
Equation (continued)
• Drawing the Regression Line
– Helps to highlight the relationship between the two variables
• The therapist’s stress study
– If a new client came in with a stress score of 45, what would their predicted symptom level be?
– What about a new client with a stress score of 38? (See the sketch below.)
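A minimal sketch of answering these questions in Python, using the slope and intercept worked out above (b ≈ 2.81, a ≈ 13.41); note that both 45 and 38 lie outside the 3–30 range of stress scores the equation was built on, so these predictions are extrapolations:

    B, A = 2.81, 13.41          # slope and intercept from the stress study
    X_MIN, X_MAX = 3, 30        # range of stress scores in the original data

    def predict_symptoms(stress):
        if not X_MIN <= stress <= X_MAX:
            print(f"warning: stress = {stress} is outside {X_MIN}-{X_MAX}; "
                  "this is an extrapolation")
        return B * stress + A

    print(round(predict_symptoms(45), 1))  # about 139.9
    print(round(predict_symptoms(38), 1))  # about 120.2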
Example 3: Simple Regression Line
• R2 in Multiple Regression
– r2, the percentage of variability in the dependent
variable that is accounted for by the independent
variable(s), is called R2 in multiple regression
– Better prediction means a higher percentage of
variability is accounted for with multiple
regression than with simple regression
– More powerful technique than simple regression
R2 is the proportion of variance that two measures have in common; this overlap (the explained variance) determines the strength of the relationship.
Multiple Regression (continued)
• Example: Admitted Class Evaluation Service (ACES)
– Predicting first-year GPA
• Equation is developed from a first-year class to predict first-year
GPA
• College Board examines four variables; the Pearson r correlation
coefficients for each of these variables predicting GPA by itself are:
– SAT reading test, r = .42
– SAT writing test, r = .42
– SAT math test, r = .39
– High school class rank, r = .52
• When the four College Board variables are combined to predict
GPA, correlation climbs to R = .57
• Percentage of variance explained changes from 27.04% to 32.49%
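A quick check of where those percentages come from: the strongest single predictor, high school class rank, explains r2 = .52 × .52 = .2704, or 27.04%, of the variance in first-year GPA; the combined equation explains R2 = .57 × .57 = .3249, or 32.49%.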
Multiple Regression (continued)
GPA′ = (SAT Reading Score × Weight Reading Score) + (SAT Writing Score × Weight Writing Score) + (SAT Math Score × Weight Math Score) + (High School Class Rank × Weight Class Rank)
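A minimal sketch of how a weighted-sum equation like this produces a predicted GPA; the weights and constant below are invented for illustration and are not the College Board’s actual ACES values:

    # Hypothetical regression weights (illustration only)
    WEIGHTS = {
        "sat_reading": 0.0010,
        "sat_writing": 0.0010,
        "sat_math": 0.0010,
        "class_rank_percentile": 0.0050,
    }
    CONSTANT = 0.50  # hypothetical intercept, analogous to a in simple regression

    def predict_gpa(student):
        return CONSTANT + sum(WEIGHTS[k] * student[k] for k in WEIGHTS)

    applicant = {"sat_reading": 600, "sat_writing": 580,
                 "sat_math": 620, "class_rank_percentile": 90}
    print(round(predict_gpa(applicant), 2))  # 2.75 with these made-up weights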