Module 2

Module 2 covers basic probability concepts, including sample space, events, and probability axioms. It introduces random variables, probability distributions, expected value, variance, and statistical inference methods like hypothesis testing and confidence intervals. The module also discusses regression analysis for modeling relationships between variables, emphasizing its applications in various fields.

Uploaded by

Racel Cagnayo
Copyright
© All Rights Reserved

Module 2: Basic Probability Concepts

1. Basic Probability Concepts


 Sample Space (S):
o The set of all possible outcomes of a random experiment.

o Example: Flipping a coin: S={Heads,Tails}

o Example: Rolling a six-sided die: S={1,2,3,4,5,6}

 Event (E):
o A subset of the sample space.

o Example: Flipping a coin and getting Heads: E={Heads}

o Example: Rolling an even number on a die: E={2,4,6}

 Probability Axioms:
o Axiom 1 (Non-negativity): For any event E, P(E)≥0. (Probability is never
negative.)
o Axiom 2 (Total Probability): P(S)=1. (The probability of the entire sample space
is 1.)
o Axiom 3 (Additivity for Mutually Exclusive Events): If E1 and E2 are mutually
exclusive (they cannot both occur), then P(E1∪E2)=P(E1)+P(E2).
 Example: The events of rolling a 1 and rolling a 2 on a die are mutually
exclusive, so the probability of rolling a 1 or a 2 is the sum of the individual
probabilities: P({1} ∪ {2}) = 1/6 + 1/6 = 1/3.
 Example:
o Experiment: Rolling a fair six-sided die.

o Sample space: S={1,2,3,4,5,6}

o Event A: Rolling an even number. A={2,4,6}

o Probability of event A: P(A) = (Number of outcomes in A) / (Total number of outcomes in S) = 3/6 = 1/2
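The die example above can be sketched in a few lines (a minimal illustration, assuming all outcomes are equally likely, i.e. a fair die):

```python
# Minimal sketch of the die example: P(A) by counting outcomes,
# assuming all outcomes in S are equally likely (a fair die).
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}             # sample space of a six-sided die
A = {x for x in S if x % 2 == 0}   # event A: rolling an even number

p_A = Fraction(len(A), len(S))     # P(A) = |A| / |S|
print(p_A)                         # 1/2
```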
2. Random Variables and Probability Distributions
 Random Variable (X):
o A variable whose value is a numerical outcome of a random phenomenon.

o Discrete Random Variable: Takes on a countable number of values.

 Example: Number of heads in 3 coin flips.


o Continuous Random Variable: Takes on any value within a range.

 Example: Height of a person.


 Probability Distribution:
o Describes the probabilities of all possible values of a random variable.
o Discrete Probability Distribution:

 Probability mass function (PMF): P(X=x)


 Example: Binomial distribution (number of successes in a fixed number of
trials).
 If you flip a fair coin 3 times, the number of heads (X) can be 0, 1, 2, or
3. The PMF gives the probability of each of these outcomes.
o Continuous Probability Distribution:

 Probability density function (PDF): f(x)


 Example: Normal distribution (bell curve).
 The heights of people in a large population often follow a normal
distribution.
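As an illustration of a discrete PMF, the three-coin-flip example above can be tabulated directly (a sketch assuming a fair coin, so n = 3 and p = 0.5):

```python
# Sketch: PMF of X = number of heads in 3 fair coin flips
# (a binomial distribution with n = 3, p = 0.5).
from math import comb

n, p = 3, 0.5
pmf = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

print(pmf)  # {0: 0.125, 1: 0.375, 2: 0.375, 3: 0.125}
```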
3. Expected Value, Variance, and Standard Deviation
 Expected Value (E[X] or μ):
o The average value of a random variable.

o Discrete: E[X]=∑x⋅P(X=x)

o Continuous: E[X]=∫x⋅f(x)dx

o Example: If you roll a fair six-sided die, the expected value is:
E[X] = (1/6)(1) + (1/6)(2) + (1/6)(3) + (1/6)(4) + (1/6)(5) + (1/6)(6) = 3.5
 Variance (Var(X) or σ²):
o Measures the spread or dispersion of a random variable's values.

o Var(X) = E[(X − E[X])²]

o Discrete: Var(X) = ∑ (x − E[X])² ⋅ P(X=x)

o Continuous: Var(X) = ∫ (x − E[X])² ⋅ f(x) dx

 Standard Deviation (σ):


o The square root of the variance.

o σ = √Var(X)

o It's in the same units as the random variable, making it easier to interpret.
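The formulas in this section can be checked numerically for one roll of a fair six-sided die (a minimal sketch):

```python
# Sketch: E[X], Var(X), and σ for one roll of a fair six-sided die,
# using the discrete formulas from this section.
from math import sqrt

outcomes = [1, 2, 3, 4, 5, 6]
p = 1 / 6                                         # each outcome equally likely

mean = sum(x * p for x in outcomes)               # E[X] = 3.5
var = sum((x - mean) ** 2 * p for x in outcomes)  # Var(X) = 35/12 ≈ 2.917
sd = sqrt(var)                                    # σ ≈ 1.708
```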

4. Statistical Inference: Hypothesis Testing and Confidence Intervals


 Hypothesis Testing:
o A method for making decisions based on data.

o Involves setting up a null hypothesis (H₀) and an alternative hypothesis (H₁).

o Example: Testing if a new drug is effective.

 H₀: The drug has no effect.

 H₁: The drug has an effect.
o We use sample data to calculate a test statistic and determine if there is enough
evidence to reject the null hypothesis.
 Confidence Intervals:
o A range of values that is likely to contain a population parameter with a certain level
of confidence.
o Example: A 95% confidence interval for the mean height of a population.

o If we calculate a 95% confidence interval from many samples, about 95% of
those intervals should contain the true population mean.
 Example:
o Hypothesis test: A factory claims its light bulbs last 1000 hours on average. A
sample of bulbs is tested, and the sample average is 950 hours. A hypothesis
test can determine whether the difference of 50 hours is statistically
significant.
o Confidence Interval: Using the sample data, a 95% confidence interval for the
average bulb life could be calculated. That interval would provide a range of values
where the true average bulb life likely resides.
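The light-bulb example can be sketched as a one-sample t test. Note that only the 1000-hour claim and the 950-hour sample mean come from the text; the sample size n and standard deviation s below are invented for illustration:

```python
# Sketch of the light-bulb example as a one-sample t test. Only the
# 1000-hour claim and the 950-hour sample mean come from the text;
# the sample size n and standard deviation s are assumptions.
from math import sqrt

mu0 = 1000.0     # claimed mean lifetime under H0 (hours)
xbar = 950.0     # observed sample mean (hours)
n = 25           # assumed sample size
s = 100.0        # assumed sample standard deviation (hours)

se = s / sqrt(n)            # standard error of the mean
t_stat = (xbar - mu0) / se  # t = -2.5

# The two-sided 5% critical value for 24 degrees of freedom is about 2.064,
# so |t| = 2.5 would lead to rejecting H0 at the 5% level.
t_crit = 2.064
reject_h0 = abs(t_stat) > t_crit

# 95% confidence interval for the mean: xbar ± t* · se
ci = (xbar - t_crit * se, xbar + t_crit * se)   # ≈ (908.7, 991.3)
```

With these assumed numbers the interval excludes the claimed 1000 hours, matching the rejection of H₀.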
5. Regression Analysis
 Regression Analysis:
o A statistical technique used to model the relationship between a dependent variable
and one or more independent variables.
o Linear Regression: Models a linear relationship.

 y=mx+b
 Example: Predicting house prices based on square footage.

o Multiple Regression: Models a relationship with multiple independent variables.

 Example: Predicting exam scores based on study time, attendance, and prior
grades.
 Example:
o A study wants to see if there is a relationship between the amount of fertilizer used
and crop yield.
o Regression analysis can be used to create a model that predicts crop yield based on
the amount of fertilizer used.
o The regression model will provide information about the strength and direction of
the relationship.
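The fertilizer/yield idea can be sketched with ordinary least squares; the data points below are entirely hypothetical:

```python
# Sketch: least-squares fit of crop yield vs. fertilizer amount.
# The data points are hypothetical, for illustration only.
xs = [10, 20, 30, 40, 50]        # fertilizer applied (assumed units)
ys = [2.1, 2.9, 4.2, 4.8, 6.0]   # crop yield (assumed units)

n = len(xs)
xbar = sum(xs) / n
ybar = sum(ys) / n

# slope m and intercept b of the fitted line y = m*x + b
m = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
     / sum((x - xbar) ** 2 for x in xs))
b = ybar - m * xbar

y_hat = m * 35 + b               # predicted yield at x = 35
```

A positive slope m would indicate that yield increases with fertilizer over the observed range; the magnitude of m gives the strength of that (linear) relationship.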
These are the fundamental concepts of probability theory and statistics. They form the basis for
many applications in fields such as science, engineering, finance, and data science.
