This document collects common statistical tests and formulas, including: 1) t-tests for differences between dependent (paired) and independent samples; 2) a summary table outline for one-way ANOVA, covering sources of variation, sums of squares, degrees of freedom, mean squares, and F values; and 3) descriptive statistics, including the mean, median, mode, range, standard deviation, variance, z-scores, t-scores, and coefficient of variation.

Chi-square (χ²) test of independence: χ² = Σ (fo − fe)² / fe, where fo = observed frequency and fe = expected frequency, fe = (row total × column total) / grand total

Pearson's r: r = [n Σxy − (Σx)(Σy)] / √{ [n Σx² − (Σx)²][n Σy² − (Σy)²] }

Testing significance of the relationship (r): t = r √(n − 2) / √(1 − r²), with df = n − 2

Spearman rank correlation: ρ = 1 − [6 ΣD²] / [n(n² − 1)], where D = difference between paired ranks
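As a quick check on the correlation and chi-square formulas above, the sketch below (not part of the original sheet) computes the same statistics with scipy.stats; the contingency counts and paired scores are made-up illustration values.

```python
# Minimal sketch (assumes numpy and scipy are installed); all data are made up.
import numpy as np
from scipy import stats

# Chi-square test of independence on observed counts (fo); scipy returns fe as "expected".
observed = np.array([[10, 20, 30],
                     [20, 25, 15]])
chi2, p_chi2, df, expected = stats.chi2_contingency(observed)
print(f"chi-square = {chi2:.3f}, df = {df}, p = {p_chi2:.4f}")

# Pearson's r and its significance test: t = r*sqrt(n - 2) / sqrt(1 - r^2), df = n - 2.
x = np.array([2, 4, 5, 7, 8, 10], dtype=float)
y = np.array([3, 5, 4, 8, 9, 12], dtype=float)
r, p_r = stats.pearsonr(x, y)
t_r = r * np.sqrt(len(x) - 2) / np.sqrt(1 - r**2)
print(f"r = {r:.3f}, t = {t_r:.3f}, p = {p_r:.4f}")

# Spearman rank correlation: rho = 1 - 6*sum(D^2) / (n*(n^2 - 1)).
rho, p_rho = stats.spearmanr(x, y)
print(f"rho = {rho:.3f}, p = {p_rho:.4f}")
```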

Hypothesis testing decision rules: if t-value > t-critical, reject Ho; if p-value > alpha, accept (fail to reject) Ho. NOTE: df when testing the significance of r is n − 2.

z-test

t-test

a. t-test for dependent samples

td = (d̄ √n) / Sd

where Sd = √[ Σ(d − d̄)² / (n − 1) ]; d̄ = mean of the differences; d = difference between each pair of scores; n = number of pairs
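A minimal sketch of the dependent-samples t-test above, computed once by the sheet's formula and once with scipy.stats.ttest_rel; the pre/post scores are made-up illustration data, not from the source.

```python
# Paired (dependent samples) t-test; assumes numpy and scipy; data are made up.
import numpy as np
from scipy import stats

pre  = np.array([12, 15, 14, 10, 13, 16, 11, 14], dtype=float)
post = np.array([14, 18, 15, 12, 15, 17, 13, 16], dtype=float)

d = post - pre                 # d  = difference per pair
d_bar = d.mean()               # d-bar = mean of the differences
sd = d.std(ddof=1)             # Sd = sqrt(sum((d - d_bar)^2) / (n - 1))
n = len(d)

td = d_bar * np.sqrt(n) / sd   # td = d-bar * sqrt(n) / Sd
print(f"manual td = {td:.3f}, df = {n - 1}")

t_scipy, p = stats.ttest_rel(post, pre)
print(f"scipy  t  = {t_scipy:.3f}, p = {p:.4f}")
```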

b. t-test for independent samples (tid)

tid = (M1 − M2) / √{ [ (n1 − 1)s1² + (n2 − 1)s2² ] / (n1 + n2 − 2) × (1/n1 + 1/n2) }

with degrees of freedom: dftid = n1 + n2 − 2

Standard deviation (ungrouped): SDud = √[ Σ(x − M)² / (n − 1) ]
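The pooled-variance tid formula above can be sanity-checked against scipy.stats.ttest_ind with equal_var=True, as in the sketch below; the two groups are made-up illustration data.

```python
# Independent samples t-test with pooled variance; assumes numpy and scipy; data are made up.
import numpy as np
from scipy import stats

g1 = np.array([23, 25, 28, 30, 26, 27], dtype=float)
g2 = np.array([20, 22, 25, 24, 21, 23, 22], dtype=float)

n1, n2 = len(g1), len(g2)
m1, m2 = g1.mean(), g2.mean()
s1, s2 = g1.std(ddof=1), g2.std(ddof=1)

pooled = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)   # pooled variance
tid = (m1 - m2) / np.sqrt(pooled * (1 / n1 + 1 / n2))
print(f"manual tid = {tid:.3f}, df = {n1 + n2 - 2}")

t_scipy, p = stats.ttest_ind(g1, g2, equal_var=True)
print(f"scipy  t   = {t_scipy:.3f}, p = {p:.4f}")
```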

F-test

Summary table for one-way ANOVA (SV = sources of variation, SS = sum of squares, DF = degrees of freedom, MS = mean squares; K = number of groups, nT = total number of observations):

SV        | SS   | DF     | MS                  | F computed   | F critical | p-value
Treatment | SSTR | K − 1  | MSTR = SSTR/(K − 1) | F = MSTR/MSE |            |
Error     | SSE  | nT − K | MSE = SSE/(nT − K)  |              |            |
Total     | SST  | nT − 1 |                     |              |            |
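A sketch of how the table's quantities are computed: SSTR, SSE, MSTR, MSE, and F by hand, then checked against scipy.stats.f_oneway. The three groups are made-up illustration data, not from the source.

```python
# One-way ANOVA summary-table quantities; assumes numpy and scipy; data are made up.
import numpy as np
from scipy import stats

groups = [np.array([84, 79, 81, 85, 80], dtype=float),
          np.array([75, 78, 74, 77, 76], dtype=float),
          np.array([90, 88, 92, 87, 91], dtype=float)]

k = len(groups)                                    # K  = number of treatments
n_t = sum(len(g) for g in groups)                  # nT = total observations
grand_mean = np.concatenate(groups).mean()

sstr = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)   # treatment SS
sse  = sum(((g - g.mean()) ** 2).sum() for g in groups)             # error SS

mstr = sstr / (k - 1)       # MS treatment
mse  = sse / (n_t - k)      # MS error
f = mstr / mse
print(f"manual F = {f:.3f}, df = ({k - 1}, {n_t - k})")

f_scipy, p = stats.f_oneway(*groups)
print(f"scipy  F = {f_scipy:.3f}, p = {p:.4f}")
```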

In SLR (simple linear regression), the general formula is Ŷ = a + bX, where a = intercept and b = slope, such that b = [n Σxy − (Σx)(Σy)] / [n Σx² − (Σx)²] and a = Ȳ − bX̄.
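A minimal sketch of the SLR slope and intercept formulas above, checked against scipy.stats.linregress; the x and y values are made-up illustration data.

```python
# Simple linear regression: slope b and intercept a; assumes numpy and scipy; data are made up.
import numpy as np
from scipy import stats

x = np.array([1, 2, 3, 4, 5, 6], dtype=float)
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1, 12.0])

n = len(x)
b = (n * (x * y).sum() - x.sum() * y.sum()) / (n * (x**2).sum() - x.sum() ** 2)
a = y.mean() - b * x.mean()
print(f"manual: Y-hat = {a:.3f} + {b:.3f}X")

res = stats.linregress(x, y)
print(f"scipy : Y-hat = {res.intercept:.3f} + {res.slope:.3f}X")
```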

Slovin's Formula: n = N / (1 + N e²)

where: n = sample size; N = population size; 1 = constant; e = margin of error
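A small helper for Slovin's formula; slovin_n is a hypothetical name (not from the source), and rounding up to the next whole respondent is an assumption.

```python
# Slovin's formula n = N / (1 + N*e^2) as a tiny helper; values are illustration only.
import math

def slovin_n(population: int, margin_of_error: float) -> int:
    """Sample size n for a population of size N at margin of error e."""
    return math.ceil(population / (1 + population * margin_of_error ** 2))

# Example: N = 1200 at e = 0.05 gives n = 300.
print(slovin_n(1200, 0.05))
```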

SAMPLING TECHNIQUES
1. Systematic Random Sampling: K = N / n (K = sampling interval, N = population size, n = sample size)

Grouped Data Distribution:
1. Range: R = Highest Score − Lowest Score
2. Number of class sizes (cs): 5 to 15
3. Class interval (i): i = R / cs

Mean
Conventional: X̄ = Σ xi / n (summed for i = 1 to n)
Contemporary: M = Σx / n

Median (Mdn): the middlemost score
Mode (Mo): the most frequent score

For Grouped Data
1) Mean: M = am + (Σfd / n) i
where: am = assumed mean; Σfd = summation of the products of frequency and deviation; n = total number of samples; i = interval

2) Median: Mdn = ll + [ (n/2 − F) / f ] i
where: ll = lower limit of the median class; n/2 = half-sum; F = estimated cumulative frequency below the median class; f = real frequency of the median class; i = interval

3) Mode: Mo = lmo + [ Δ1 / (Δ1 + Δ2) ] i
where: lmo = lower limit of the modal class; Δ1 = difference between the modal frequency (mf) and the frequency one step higher; Δ2 = difference between the mf and the frequency one step lower of it in ascending order

Range (R): R = HS − LS

Standard Deviation (SD)
Ungrouped: SDud = √[ Σ(x − M)² / (n − 1) ]
Grouped: SDgd = i √[ Σfd² / (n − 1) − (Σfd)² / (n(n − 1)) ]

Variance (SD²): the average of the squared deviations
z-score: z = (x − M) / SD
t-score: t = 10z + 50
Coefficient of Variation (CV): CV = (SD / M) × 100%

MEASURES OF RELATIVE POSITION
Quantile locator positions: Mdn = n/2; Qk = kn/4; Dk = kn/10; Pk = kn/100
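A sketch of the ungrouped descriptive measures and quantile locator positions above, using numpy; the score list is made-up illustration data, not from the source.

```python
# Ungrouped descriptive statistics and quantile locator positions; data are made up.
import numpy as np

scores = np.array([70, 75, 80, 82, 85, 88, 90, 95], dtype=float)
n = len(scores)

M = scores.mean()
sd = scores.std(ddof=1)                 # SDud = sqrt(sum((x - M)^2) / (n - 1))
variance = sd ** 2                      # SD^2
r_range = scores.max() - scores.min()   # R = HS - LS

z = (scores - M) / sd                   # z = (x - M) / SD
t = 10 * z + 50                         # t = 10z + 50
cv = sd / M * 100                       # CV = (SD / M) * 100%

print(f"M = {M:.2f}, SD = {sd:.2f}, Var = {variance:.2f}, R = {r_range:.1f}, CV = {cv:.1f}%")

# Locator positions (which ordered observation a quantile falls at), here for k = 3:
k = 3
print("Mdn position:", n / 2)        # n/2
print("Q3  position:", k * n / 4)    # Qk = kn/4
print("D3  position:", k * n / 10)   # Dk = kn/10
print("P3  position:", k * n / 100)  # Pk = kn/100
```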
