
Applied Econometrics. Statistics Review.


Part 1. Probability

1. Terminology

• Experiment: a process that results in uncertain outcomes.

• Outcome: a possible result of the experiment.

• Sample Space: a collection of all possible outcomes of an experiment.

• Event: a subset of the sample space. An event can consist of just one outcome.

• Random Variable: an experiment with numerical outcomes or outcomes that can be
mapped to numbers.

– e.g., rolling a die and counting the number of dots.


– e.g., flipping a coin and seeing if it comes up heads or tails (we can treat heads
as “0” and tails as “1”).

• Probability: a number from zero to one that reflects how likely an event is to occur.
We denote it as Pr(·).

– e.g., when rolling a six-sided die, each side coming up has a probability of 1/6.
– e.g., when flipping a fair coin, either side has a 50% chance of coming up, that
is, a probability of 1/2.

• If you sum the probabilities of all the potential outcomes of a random variable,
you get 1. Equivalently, the probability of the sample space is one: Pr(S) = 1.
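The terminology above can be illustrated with a short simulation. The sketch below (an illustrative example, not part of the original notes) treats a fair six-sided die roll as a random variable, estimates the probability of each outcome from repeated rolls, and checks that the estimated probabilities over the whole sample space sum to one:

```python
import random

# Simulate rolling a fair six-sided die many times and estimate the
# probability of each outcome. With a fair die, each should be near 1/6.
random.seed(0)
n = 100_000
rolls = [random.randint(1, 6) for _ in range(n)]

# Estimated Pr(outcome = k) for each k in the sample space {1, ..., 6}.
probs = {k: rolls.count(k) / n for k in range(1, 7)}
print(probs)

# The probabilities of all outcomes sum to 1 (the whole sample space).
print(sum(probs.values()))
```

The exact probabilities are each 1/6; the simulated estimates merely approximate them, improving as n grows.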


Prepared by George Orlov and Douglas McKee, Cornell University, 2018. Do not distribute without the
authors’ permission.

2. Laws of Probability

• Let A and B be two events in sample space S.

• ∪ denotes a union: A ∪ B denotes outcomes in A or B or both (everything in A
together with everything in B). This is a logical “or”.

• ∩ denotes an intersection: A ∩ B denotes outcomes that are in A and B at the same
time. This is a logical “and”.

• A and B are said to be mutually exclusive if Pr(A ∩ B) = 0.

• A and B are said to be collectively exhaustive if A ∪ B = S.

• Ā denotes the complement of A: A ∪ Ā = S. Note that this is the same notation
as used to denote the sample mean (i.e., X̄), so some texts use Aᶜ to denote the
complement.

• The following formulae are referred to as DeMorgan’s Laws:


– (A ∪ B)ᶜ = Ā ∩ B̄ — the complement of a union is the intersection of complements.
– (A ∩ B)ᶜ = Ā ∪ B̄ — the complement of an intersection is the union of complements.

• A and Ā are both mutually exclusive and collectively exhaustive.

• Pr(A ∪ B) = Pr(A) + Pr(B) − Pr(A ∩ B)

• Pr(A) = 1 − Pr(Ā).
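The laws above can be verified mechanically by representing events as Python sets. The sketch below (an illustrative setup, not from the original notes) uses one fair die roll with A = "even" and B = "more than 3", and checks DeMorgan's Laws, the addition rule, and the complement rule:

```python
from fractions import Fraction

# Sample space of one fair die roll, and two example events.
S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}   # "even number of dots"
B = {4, 5, 6}   # "more than 3 dots"

def comp(E):
    """Complement of event E within the sample space S."""
    return S - E

def Pr(E):
    """With equally likely outcomes, Pr(E) = |E| / |S| (exact via Fraction)."""
    return Fraction(len(E), len(S))

# DeMorgan's Laws:
assert comp(A | B) == comp(A) & comp(B)   # complement of union = intersection of complements
assert comp(A & B) == comp(A) | comp(B)   # complement of intersection = union of complements

# Addition rule: Pr(A ∪ B) = Pr(A) + Pr(B) - Pr(A ∩ B)
assert Pr(A | B) == Pr(A) + Pr(B) - Pr(A & B)

# Complement rule: Pr(A) = 1 - Pr(Ā)
assert Pr(comp(A)) == 1 - Pr(A)
```

Using `Fraction` keeps the probabilities exact, so the identities hold with equality rather than up to floating-point error.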

• Joint probability is the probability that both A and B happen together (i.e., Pr(A ∩
B)).

• Conditional probability is the probability of an event given that another event has
occurred. We denote it as Pr(A|B) (the probability of A given B):

Pr(A|B) = Pr(A ∩ B) / Pr(B)
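A small worked example may help here. With one fair die roll (an illustrative choice, not from the original notes), let A = "even" and B = "more than 3"; then conditioning on B shrinks the sample space to {4, 5, 6}, of which two outcomes are even:

```python
from fractions import Fraction

# One fair die roll: A = "even", B = "more than 3".
S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}
B = {4, 5, 6}

def Pr(E):
    """Equally likely outcomes: Pr(E) = |E| / |S|."""
    return Fraction(len(E), len(S))

# Pr(A|B) = Pr(A ∩ B) / Pr(B) = (2/6) / (3/6) = 2/3
pr_A_given_B = Pr(A & B) / Pr(B)
print(pr_A_given_B)   # 2/3
```

Note that Pr(A|B) = 2/3 differs from the unconditional Pr(A) = 1/2: learning that B occurred changes how likely A is.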

• The above equation can be reworked into what is referred to as Bayes’ Law¹:

Pr(A|B) = Pr(B|A) · Pr(A) / Pr(B)

• Events A and B are said to be independent if Pr(A ∩ B) = Pr(A) · Pr(B).
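Continuing the same die-roll example (again an illustrative choice, not from the original notes), the sketch below checks Bayes' Law numerically and then tests the independence condition, which fails here because A and B overlap more than independence would predict:

```python
from fractions import Fraction

# One fair die roll: A = "even", B = "more than 3".
S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}
B = {4, 5, 6}

def Pr(E):
    """Equally likely outcomes: Pr(E) = |E| / |S|."""
    return Fraction(len(E), len(S))

pr_A_given_B = Pr(A & B) / Pr(B)
pr_B_given_A = Pr(A & B) / Pr(A)

# Bayes' Law recovers Pr(A|B) from Pr(B|A):
assert pr_A_given_B == pr_B_given_A * Pr(A) / Pr(B)

# Independence check: Pr(A ∩ B) = 1/3 but Pr(A) · Pr(B) = 1/4,
# so these particular events A and B are NOT independent.
print(Pr(A & B), Pr(A) * Pr(B))
```

For a pair of events that are independent, take A = "even" and B' = {1, 2}: then Pr(A ∩ B') = 1/6 = Pr(A) · Pr(B').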

¹ Also referred to as Bayes’ Rule.
