Correlation, T-Test, ANOVA

EXPLORING RELATIONSHIPS
Correlational Analysis:
• Refers to the process of establishing relationships between two variables.
• A relationship between two variables is called a bivariate correlation; a relationship among more than two variables is a multivariate correlation.
• Correlational analysis gives the following information:
  • the direction of the relationship (negative or positive);
  • the strength or magnitude of the relationship between the two variables (ranging from -1 through 0 to +1).
• Warning:
  • correlation is no proof of causality;
  • you cannot assume that x causes y.
Parametric: Pearson Product Moment Correlation, or Pearson's r
- the distribution is normal
- the relationship between the two variables is linear
- the data are at least interval
Assumptions in Correlation:

Assumption #1: Your two variables should be measured at the interval or ratio level (they are continuous). Ex. time, IQ scores, weight, height
Assumptions in Correlation:

Assumption #2: There is a linear relationship between your two variables. You can check this by creating a scatter plot in SPSS.
Assumptions in Correlation:

Assumption #3: There should be no significant outliers. Outliers are single data points within your data that do not follow the usual pattern.
Assumptions in Correlation:

Assumption #4: Your variables should be approximately normally distributed.
How to analyze correlation in SPSS?

Analyze -> Correlate -> Bivariate
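For reference, a minimal sketch of the same bivariate Pearson correlation outside SPSS, in Python with scipy; the variable names and scores below are hypothetical, not data from the lecture:

from scipy import stats

iq  = [95, 100, 110, 120, 105, 130, 98, 115]    # hypothetical IQ scores
gwa = [2.8, 3.0, 3.2, 3.6, 3.1, 3.8, 2.9, 3.4]  # hypothetical GWA values

r, p = stats.pearsonr(iq, gwa)  # r: direction and strength, p: significance
print(f"r = {r:.3f}, p = {p:.3f}")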


The strength of the relationship (Cohen, 2013):

+/- 0.00 – 0.09  Zero
+/- 0.10 – 0.29  Weak
+/- 0.30 – 0.69  Moderate
+/- 0.70 – 1.00  Strong

If the Sig. value is less than .05, the correlation is significant (there is a relationship).
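A small helper that applies the strength bands above to a coefficient; the function name is our own illustration, not SPSS output:

def correlation_strength(r):
    """Label |r| using the bands in the table above."""
    r = abs(r)
    if r < 0.10:
        return "zero"
    if r < 0.30:
        return "weak"
    if r < 0.70:
        return "moderate"
    return "strong"

print(correlation_strength(0.871))  # "strong", as in the IQ/GWA example below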
How to interpret the results?

The relationship between IQ scores and General Weighted Average (GWA) of the students was investigated using the Pearson product moment correlation. Preliminary analyses were performed to ensure no violation of the assumptions of normality and linearity. There was a significant, strong, positive correlation between the two variables [r=.871, p<.05], with high IQ scores associated with high GWA.
EXPLORING DIFFERENCES
Dates to Remember:

Dec. 3 – lecture on exploring differences / lab activity
Dec. 6 – quiz
Dec. 10 – finalization of requirements / completion of missing activities
Dec. 13 – final exam and submission of final requirement
Exploring Differences:

• Basically, the purpose of exploring differences is to test hypotheses.
T-Test:
• A statistical test used to compare the means of two groups. It is often used in hypothesis testing to determine whether two groups are different from one another.
T-Test:
• Independent-samples t-test: used when you want to compare the mean scores of two different groups of people or conditions.

Ex. Sex (male or female), Religion (Catholic or non-Catholic)
T-Test:
• Paired-samples t-test: used when you want to compare the mean scores for the same group of people on two different occasions, or when you have matched pairs.

Ex. Pre-test and post-test
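A minimal paired-samples t-test sketch in Python with scipy, assuming hypothetical pre-test and post-test scores for the same group:

from scipy import stats

pretest  = [12, 15, 11, 14, 13, 16, 10, 12]  # hypothetical scores, occasion 1
posttest = [14, 17, 13, 15, 16, 18, 12, 13]  # same people, occasion 2

t, p = stats.ttest_rel(pretest, posttest)
print(f"t = {t:.3f}, p = {p:.3f}")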


When to use a T-Test:

✓ Your data are normally distributed.
✓ There is homogeneity of variance.
When to use a T-Test:

• Homogeneity of variance (Levene's test)
- assumes that the samples are obtained from populations of equal variances; this means that the variability of scores for each of the groups is similar.

Note: Levene's test should not be significant.
Sig. value > .05, not significant
Sig. value < .05, significant
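For reference, Levene's test can also be run outside SPSS; a minimal Python/scipy sketch with hypothetical scores for two groups:

from scipy import stats

group_a = [3.1, 3.4, 3.0, 3.6, 3.2]  # hypothetical scores, group A
group_b = [3.3, 3.5, 3.1, 3.4, 3.7]  # hypothetical scores, group B

stat, p = stats.levene(group_a, group_b)
# As noted above: p > .05 means the equal-variance assumption is met.
print(f"Levene statistic = {stat:.3f}, p = {p:.3f}")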
Summary of independent-samples t-test:

Example of research question:
Is there a significant difference in the mean life satisfaction scores for males and females?
Summary of independent-samples t-test:

What do you need: Two variables
• One categorical, independent variable (e.g. males/females)
• One continuous, dependent variable (e.g. life satisfaction scores)
Summary of independent-samples t-test:

What does it do:
An independent-samples t-test will tell you whether there is a statistically significant difference in the mean scores for the two groups (that is, whether males and females differ significantly in their life satisfaction levels).
Summary of independent-samples t-test:

What does it do:
In statistical terms, you are testing the probability that the two sets of scores (for males and females) came from the same population.
Summary of independent-samples t-test:

Non-parametric alternative: Mann-Whitney U Test
How to run an independent-samples t-test in SPSS:

Analyze -> Compare means -> Independent Samples T-test

Sig. value > .05, not significant
Sig. value < .05, significant
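A minimal independent-samples t-test sketch in Python with scipy, using hypothetical male and female life satisfaction scores:

from scipy import stats

males   = [3.2, 3.5, 3.1, 3.6, 3.3, 3.0]  # hypothetical life satisfaction scores
females = [3.4, 3.6, 3.3, 3.5, 3.7, 3.2]

t, p = stats.ttest_ind(males, females)  # assumes equal variances
# If Levene's test were significant, use Welch's version instead:
# t, p = stats.ttest_ind(males, females, equal_var=False)
print(f"t = {t:.3f}, p = {p:.3f}")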
Calculate the effect size:

• Effect size statistics provide an indication of the magnitude of the differences between your groups.

*eta squared
Eta squared:

Eta squared = t² / (t² + (N1 + N2 – 2))

Eta squared = (-1.339)² / ((-1.339)² + (21 + 29 – 2)) = 0.036, a small effect

Guidelines (Cohen, 1988):
.01 = small effect
.06 = moderate effect
.14 = large effect

Expressed as a percentage (multiply eta squared by 100), only 3.6% of the variance in life satisfaction is explained by sex.
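The same eta squared computation, written out in Python with the values from the worked example above:

t, n1, n2 = -1.339, 21, 29                   # t statistic and group sizes from the example
eta_squared = t**2 / (t**2 + (n1 + n2 - 2))  # formula given above
print(round(eta_squared, 3))                 # 0.036, a small effect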
Presenting the results:

An independent-samples t-test was conducted to compare the life satisfaction scores for males and females. There was no significant difference in scores for males (M=3.343, SD=.266) and females [M=3.431, SD=.200; t(48)=-1.339, p>.05]. The magnitude of the differences in the means was small (eta squared=.036).

Reporting format: t(df)=t value, p=p value
How to run the Mann-Whitney U Test in SPSS:

Analyze -> Non-parametric -> 2 Independent Samples

Sig. value > .05, not significant
Sig. value < .05, significant
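A minimal Mann-Whitney U sketch in Python with scipy, using hypothetical scores for the two groups:

from scipy import stats

group_a = [3.2, 3.5, 3.1, 3.6, 3.3]  # hypothetical scores
group_b = [3.4, 3.6, 3.3, 3.5, 3.7]

u, p = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")
print(f"U = {u:.1f}, p = {p:.3f}")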
One-way analysis of variance:

• Involves one independent variable (referred to as a factor), which has a number of different levels. These levels correspond to the different groups or conditions.
Summary for one-way between-groups ANOVA with post-hoc tests:

Example of research question:
Is there a significant difference in life satisfaction scores when grouped by socioeconomic status?
Summary for one-way between-groups ANOVA with post-hoc tests:

What do you need:
• One categorical independent variable with three or more distinct categories. This can also be a continuous variable that has been recoded to give three equal groups.
• One continuous dependent variable
Summary for one-way between-groups ANOVA with post-hoc tests:

What does it do:
• One-way ANOVA will tell you whether there are significant differences in the mean scores on the dependent variable across three or more groups. Post-hoc tests can then be used to find out where these differences lie.
Summary for one-way between-groups ANOVA with post-hoc tests:

Non-parametric alternative: Kruskal-Wallis Test
How to run a one-way between-groups ANOVA in SPSS:

Analyze -> Compare means -> One-way ANOVA
IV -> Factor
Options -> Descriptive, Homogeneity of variance and Means plot
Post Hoc -> LSD
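A minimal one-way ANOVA sketch in Python with scipy, using hypothetical scores for three socioeconomic-status groups; note that post-hoc comparisons (e.g. Tukey's HSD) would need an additional library such as statsmodels:

from scipy import stats

low    = [3.1, 3.3, 3.0, 3.4, 3.2]  # hypothetical life satisfaction scores
middle = [3.4, 3.5, 3.2, 3.6, 3.3]  # for three socioeconomic-status groups
high   = [3.5, 3.6, 3.4, 3.7, 3.5]

f, p = stats.f_oneway(low, middle, high)
print(f"F = {f:.3f}, p = {p:.3f}")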
Calculate the effect size:

Eta squared = Sum of squares between-groups / Total sum of squares
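The same ratio computed directly from (hypothetical) group data in Python:

groups = [
    [3.1, 3.3, 3.0, 3.4, 3.2],  # hypothetical scores for three groups
    [3.4, 3.5, 3.2, 3.6, 3.3],
    [3.5, 3.6, 3.4, 3.7, 3.5],
]
scores     = [x for g in groups for x in g]
grand_mean = sum(scores) / len(scores)
ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
ss_total   = sum((x - grand_mean) ** 2 for x in scores)
print(round(ss_between / ss_total, 3))  # eta squared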
Calculate the effect size:

Eta squared = .154 / 2.628 = 0.056

Guidelines (Cohen, 1988):
.01 = small effect
.06 = moderate effect
.14 = large effect
Presenting the results:

A one-way between-groups analysis of variance was conducted to explore the impact of socioeconomic status on levels of life satisfaction. Results show that there is no significant difference at the p < .05 level in life satisfaction scores when grouped by socioeconomic status [F(2, 47)=1.458, p=.243]. The effect size, calculated using eta squared, was .056, which is considered moderate. Further analysis using post-hoc tests confirmed no significant differences between groups.
How to run the Kruskal-Wallis Test in SPSS:

Analyze -> Non-parametric -> K Independent Samples
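A minimal Kruskal-Wallis sketch in Python with scipy, using hypothetical scores for three groups:

from scipy import stats

low    = [3.1, 3.3, 3.0, 3.4, 3.2]  # hypothetical scores for three groups
middle = [3.4, 3.5, 3.2, 3.6, 3.3]
high   = [3.5, 3.6, 3.4, 3.7, 3.5]

h, p = stats.kruskal(low, middle, high)
print(f"H = {h:.3f}, p = {p:.3f}")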
