04a Prob and Distributions

The document discusses probability and probability distributions. It covers types of probability definitions including classical, relative frequency, and subjective probability. It also discusses types of probability distributions for discrete and continuous random variables. Key concepts covered include probability, probability rules, joint and conditional probability, Bayes' rule for calculating posterior probabilities from prior probabilities and new evidence. An example on calculating probabilities from a contingency table is provided to illustrate these concepts.


Probability and probability distributions

Probability and its applications


Types and properties of distributions
Probability

• A variable can take multiple values (at least two different values).
• Some outcomes are more likely than others. An outcome (or set of outcomes) of interest is
called an event, and the process that produces the outcomes is called an experiment.

Types of definitions:
1. Classical probability – the number of outcomes in which the event occurs divided by the
total number of equally likely possible outcomes (see the sketch after this list).
2. Relative frequency of occurrence.
3. Subjective probability.
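To make the classical definition concrete, here is a minimal Python sketch; the example (rolling an even number on a fair six-sided die) is ours for illustration and not from the slides.

```python
from fractions import Fraction

# Classical probability: favourable outcomes / total equally likely outcomes.
# Illustrative example (not from the slides): rolling an even number on a fair die.
outcomes = range(1, 7)                              # six equally likely outcomes
favourable = [o for o in outcomes if o % 2 == 0]    # {2, 4, 6}
print(Fraction(len(favourable), len(outcomes)))     # 1/2
```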

2
Probability

• Using the idea of relative frequency, we can define probability:


The probability of a particular outcome is the proportion of times that outcome would
occur in a long run of repeated observations.
• The probability distribution of the variable lists the possible outcomes together with
their probabilities.

3
Probability

• The probability distribution for a discrete variable, X, assigns a probability to each
possible value of the variable.
• Each probability number is between 0 and 1, and the sum of all the probability
numbers equals 1. (Why?)
• Let x denote a possible outcome of the variable X. Then we denote the probability of
finding the value x as Pr(X = x).
• Example: Consider the experiment of tossing a coin. Variable of interest (X) is what
we get on the toss. Here, X can take two values, viz. “Head” and “Tail”. Assuming the coin
to be fair, we have equal likelihood of getting a “Head” or a “Tail.”
• So, Pr(X = Head) = Pr(X = Tail) = 0.5. Why?
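As a minimal sketch of the relative-frequency idea (plain Python with the standard random module; the simulation is ours, not part of the slides), the proportion of heads in repeated tosses of a fair coin settles near 0.5 as the run gets longer:

```python
import random

def head_proportion(n_tosses: int, seed: int = 0) -> float:
    """Toss a simulated fair coin n_tosses times and return the proportion of heads."""
    rng = random.Random(seed)
    heads = sum(1 for _ in range(n_tosses) if rng.random() < 0.5)
    return heads / n_tosses

# The long-run proportion approaches Pr(X = Head) = 0.5.
for n in (100, 10_000, 1_000_000):
    print(n, round(head_proportion(n), 3))
```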

4
Basics of probability

• Venn diagram

Basic probability rules


• Union of events
• Intersection of events
• Marginal probability
• Mutually exclusive events
• Statistically independent events

5
Quick Check

• You receive the day’s newspaper strictly after 5 A.M. and on or before 8 A.M. every day.
• Let event A be the event that the newspaper is received strictly after 6 A.M. Let
event B be the event that the newspaper is received before 7 A.M.

• Describe the event that is the complement of A.


• What is the intersection of events A and B?
• Are events A and B mutually exclusive? Collectively exhaustive?

6
Joint probability

• Consider A and B as two events in an experiment.


• The joint probability of A and B is the probability of observing event A AND event B.
It is denoted by Pr(A∩B); some textbooks write it as Pr(A, B).

7
Conditional probability

• Let A and B be two events.


• Pr(A|B) is the conditional probability of event A happening given that B has already
occurred.
• Bayes’ rule: Pr(A|B) = Pr(A∩B) / Pr(B), provided Pr(B) > 0.

• If events A and B are independent, then Pr(A|B) = Pr(A).


• Hence, from Bayes’ rule, Pr(A∩B) = Pr(A) · Pr(B) when A and B are independent.
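A small Python sketch of these two rules; the helper names and the numbers used are ours, purely for illustration.

```python
def conditional(p_a_and_b: float, p_b: float) -> float:
    """Pr(A|B) = Pr(A ∩ B) / Pr(B), defined only when Pr(B) > 0."""
    if p_b <= 0:
        raise ValueError("Pr(B) must be positive")
    return p_a_and_b / p_b

def independent(p_a: float, p_b: float, p_a_and_b: float, tol: float = 1e-9) -> bool:
    """A and B are independent exactly when Pr(A ∩ B) = Pr(A) * Pr(B)."""
    return abs(p_a_and_b - p_a * p_b) < tol

print(round(conditional(0.12, 0.4), 2))   # 0.3
print(independent(0.3, 0.4, 0.12))        # True
```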

8
Example

• Consider a B-school which shortlisted 1200 candidates (960 men and 240 women)
for its post-graduate management program. Out of these, 324 candidates were
given offer letters for admission. The data is included here:
Contingency table:
                Men   Women   Total
Offers made     288      36     324
Not offered     672     204     876
Total           960     240    1200
• After reviewing the record, a women’s forum raised the issue of gender
discrimination on the basis that 288 male candidates were offered admission against
only 36 female candidates.

9
Example

• To this, the B-school management replied that it was not a case of discrimination,
but simply reflected the fact that only 240 women candidates appeared for the
examination.
Let us review the case using probability.
• Let M be the event that the candidate is a male.
• Let W be the event that the candidate is a woman.
• Let A be the event that the candidate is offered admission.
• Let Ac be the event that the candidate is not offered admission.
(Event Ac is called the complement of event A.
One can see that Pr(A) + Pr(Ac) = 1.)

10
Example

• Probability that a randomly observed candidate is a male and is offered admission:
Pr(M∩A) = 288/1200 = 0.24
• Probability that a randomly observed candidate is a male and is not offered admission:
Pr(M∩Ac) = 672/1200 = 0.56
• Similarly,
Pr(W∩A) = 36/1200 = 0.03
Pr(W∩Ac) = 204/1200 = 0.17
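A short Python sketch that recovers these joint probabilities directly from the contingency-table counts; the dictionary layout is ours for illustration.

```python
# Counts from the contingency table of 1200 shortlisted candidates.
counts = {
    ("M", "A"):  288,   # male, offered admission
    ("M", "Ac"): 672,   # male, not offered
    ("W", "A"):   36,   # woman, offered admission
    ("W", "Ac"): 204,   # woman, not offered
}
total = sum(counts.values())   # 1200

# Joint probability of each (gender, admission-status) pair.
joint = {cell: n / total for cell, n in counts.items()}
print(joint)   # {('M', 'A'): 0.24, ('M', 'Ac'): 0.56, ('W', 'A'): 0.03, ('W', 'Ac'): 0.17}
```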

11
Example

• In terms of probabilities, the previous table can now be rewritten as:

                Man   Woman   Total
Offers made    0.24    0.03    0.27
Not offered    0.56    0.17    0.73
Total          0.80    0.20    1.00

• Joint probabilities (of what?) appear in the main body of the table (e.g. 0.24, 0.03).
• Marginal probabilities (of what?) appear in the margin of the table (e.g. 0.8, 0.2).
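Continuing the earlier sketch, the marginal probabilities in the margins of the table are simply row and column sums of the joint probabilities:

```python
# Joint probabilities from the table above.
joint = {("M", "A"): 0.24, ("M", "Ac"): 0.56, ("W", "A"): 0.03, ("W", "Ac"): 0.17}

# Marginals are obtained by summing the joints over the other variable.
pr_M = sum(p for (g, _), p in joint.items() if g == "M")   # ≈ 0.80
pr_W = sum(p for (g, _), p in joint.items() if g == "W")   # ≈ 0.20
pr_A = sum(p for (_, a), p in joint.items() if a == "A")   # ≈ 0.27
print(round(pr_M, 2), round(pr_W, 2), round(pr_A, 2))      # 0.8 0.2 0.27
```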

12
Example

• What will be Pr(A|M)?


• First of all, what does this mean?
• This conditional probability tells us that we are concerned with admission status of
only males!
• We know that out of the 960 male candidates, 288 were offered admission. So
probability that a male candidate is offered admission will be 288/960 = 0.3.
• Also observe that: Pr(A|M) = Pr(A∩M) / Pr(M) = 0.24 / 0.8 = 0.3.

13
Example

• Now, the numerator, 0.24, is the joint probability of events A and M; that is, Pr(A∩M)
= 0.24. And 0.8 is the marginal probability of the event M, i.e. Pr(M) = 0.8.

• This is, precisely, the definition of conditional probability.

• Back to the problem at hand: Pr(A|W) = Pr(A∩W) / Pr(W) = 0.03 / 0.2 = 0.15.
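In code, the same division gives both conditional probabilities; the variable names are ours.

```python
# Joint and marginal probabilities from the table on the earlier slide.
pr_A_and_M, pr_M = 0.24, 0.80
pr_A_and_W, pr_W = 0.03, 0.20

# Pr(A|M) = Pr(A ∩ M) / Pr(M),  Pr(A|W) = Pr(A ∩ W) / Pr(W)
pr_A_given_M = pr_A_and_M / pr_M   # ≈ 0.30
pr_A_given_W = pr_A_and_W / pr_W   # ≈ 0.15
print(round(pr_A_given_M, 2), round(pr_A_given_W, 2))   # 0.3 0.15
```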

14
Example: Conclusion

• The probability of an admission offer given that the candidate is a male is 0.3, twice the
probability of 0.15 for an admission offer given that the candidate is a woman.
• Although the use of conditional probability does not, in itself, prove discrimination,
there is support for the argument!

15
Bayes’ rule

• Often we have initial guesses about an event from which we can calculate prior
probabilities, using the usual probability theory.
• Then, from sources such as data collection, samples, or product field tests, we obtain
more information about these events.
• Given this new information, we can update our prior beliefs by calculating revised
probabilities – this is called the posterior probability.
• Bayes’ theorem is used to calculate the posterior probability if we have the initial
belief (probability) and the additional sample information.

16
Bayes’ rule

• Suppose that a manufacturer receives the same raw material from two different
suppliers S1 and S2.
• Currently 65% of the raw material comes from S1 and remaining, 35%, comes from
S2.
• Also, suppose that from the historical data available with the quality assurance
department, we know that S1 has 98% of the supplied raw material of good quality
and S2 has 95% of the raw material of good quality.
• That is, the probability of a “Good” quality raw material given that the supplier is S1
is, Pr(G|S1) = 0.98. And for the second supplier, this probability is: Pr(G|S2) = 0.95.

17
Bayes’ rule
• What is the probability of the raw material being supplied by S1 and it being good?
• Joint probability, of course!
• This can be calculated by rearranging Bayes’ formula (the multiplication rule).
Pr(S1, G) = Pr(S1 ∩ G) = Pr(S1)*Pr(G|S1) = 0.65*0.98 = 0.637
Pr(S2, G) = Pr(S2 ∩ G) = Pr(S2)*Pr(G|S2) = 0.35*0.95 = 0.3325
• Now, knowing all this information so far, suppose the manufacturer inspects the
incoming raw material on receipt and finds a bad quality material.
• He wants to know the supplier who needs to be contacted to complain!
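A quick Python check of the two joint probabilities above, using the multiplication rule Pr(Si ∩ G) = Pr(Si) · Pr(G|Si); the variable names are ours.

```python
# Priors and good-quality rates given on the slide.
pr_S1, pr_S2 = 0.65, 0.35
pr_G_given_S1, pr_G_given_S2 = 0.98, 0.95

# Multiplication rule: Pr(Si ∩ G) = Pr(Si) * Pr(G|Si).
print(round(pr_S1 * pr_G_given_S1, 4))   # 0.637
print(round(pr_S2 * pr_G_given_S2, 4))   # 0.3325
```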

18
Bayes’ rule

• We are interested in the posterior probability that a particular supplier is guilty of
supplying bad quality product given that we have bad quality raw material at our
doorstep – Pr(S1|B) or Pr(S2|B).
• This is an application of Bayes’ theorem – finding a posterior probability given some
initial facts and numbers.
• From Bayes’ formula we know that: Pr(S1|B) = Pr(S1)·Pr(B|S1) / Pr(B), and similarly
Pr(S2|B) = Pr(S2)·Pr(B|S2) / Pr(B).

19
Bayes’ rule

• What is Pr(B)?
• That is the probability of receiving a bad quality raw material.
• Now bad quality raw material can come from the supplies of S1 or S2.
• That is, the event B can occur with S1 or with S2.
Pr(B) = Pr(S1 ∩ B) + Pr(S2 ∩ B)
• But Pr(S1 ∩ B) = Pr(S1)*Pr(B|S1), and
• Pr(S2 ∩ B) = Pr(S2)*Pr(B|S2)
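Putting the pieces together in a short Python sketch: the bad-quality rates are the complements of the good-quality rates given earlier, and the posterior values are computed here rather than quoted from the slides.

```python
# Prior probabilities of each supplier.
pr_S1, pr_S2 = 0.65, 0.35

# Probability of bad material given the supplier (complements of 0.98 and 0.95).
pr_B_given_S1 = 1 - 0.98   # ≈ 0.02
pr_B_given_S2 = 1 - 0.95   # ≈ 0.05

# Total probability of receiving bad material.
pr_B = pr_S1 * pr_B_given_S1 + pr_S2 * pr_B_given_S2   # ≈ 0.0305

# Posterior probabilities via Bayes' rule: Pr(Si|B) = Pr(Si) * Pr(B|Si) / Pr(B).
pr_S1_given_B = pr_S1 * pr_B_given_S1 / pr_B           # ≈ 0.43
pr_S2_given_B = pr_S2 * pr_B_given_S2 / pr_B           # ≈ 0.57
print(round(pr_B, 4), round(pr_S1_given_B, 2), round(pr_S2_given_B, 2))
```

Although S1 supplies almost twice as much material, its lower defect rate makes S2 the more likely source of a bad lot, which answers the manufacturer's question.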

20
Bayes’ rule

• Significance: Find posterior probabilities using prior information.


• Notice that we use Pr(B|S1) to find Pr(S1|B).

21
