
B.A. (Hons.) Business Economics

STATISTICS FOR BUSINESS ECONOMICS

Probability Theory

Dr. Vijeta Pundir
Maharaja Agrasen College
Concepts of Probability Theory
• Statistics and probability theory constitute a branch of mathematics for dealing with uncertainty

• Probability theory provides a basis for the science of statistical inference from data

• The probability of an event is a numerical value that measures the likelihood that the event will occur
Why study probability?
• Nothing in life is certain. In everything we do, we gauge the chances of successful outcomes, from
business to medicine to the weather

• A probability provides a quantitative description of the chances or likelihoods associated with various outcomes

• It provides a bridge between descriptive and inferential statistics


• Suppose I know exactly the proportions of car makes in our country. Then I can find the
probability that the first car I see in the street is a Honda. This is probabilistic reasoning
as I know the population and predict the sample

• Now suppose that I do not know the proportions of car makes in India, but would like to
estimate them. I observe a random sample of cars in the street and then I have an
estimate of the proportions of the population. This is statistical reasoning
• In descriptive statistics, we use graphs and numerical measures such as frequencies, the mean, the median, standard deviations, quartiles, etc. to describe data sets, which are usually samples.

• We measure “how often” a variate occurs in the sample using its frequency or, more appropriately, its relative frequency (especially when comparisons have to be made between samples)

  Relative frequency = f/n

• As n gets larger,
  Sample → Population
  and “How often” = Relative frequency → Probability
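To make the "relative frequency → probability" idea concrete, here is a minimal simulation sketch (not part of the slides; the function name relative_frequency and the chosen sample sizes are illustrative):

```python
# A minimal sketch: simulate fair-die rolls and watch the relative frequency
# f/n of an event settle near its probability as n grows.
import random

random.seed(1)

def relative_frequency(event, n_rolls):
    """Relative frequency f/n of `event` (a set of faces) in n_rolls rolls of a fair die."""
    hits = sum(1 for _ in range(n_rolls) if random.randint(1, 6) in event)
    return hits / n_rolls

even = {2, 4, 6}                       # P(even) = 1/2
for n in (100, 10_000, 100_000):
    print(n, round(relative_frequency(even, n), 4))
# The printed values approach 0.5 as n increases.
```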
Important Definitions

• Experiment : any process or procedure for which more than one outcome is possible

• Sample Space
The sample space S of an experiment is a set consisting of all of the possible experimental
outcomes

• An event is an outcome (or a set of outcomes) of an experiment, usually denoted by a capital letter.

• The basic element to which probability is applied

• When an experiment is performed, a particular event either happens, or it doesn’t!


An Example of an Experiment: Record an age
A: person is 30 years old
B: person is older than 65

Another Experiment: Toss a die


A: observe an odd number
B: observe a number greater than 2

• An event that cannot be decomposed is called a simple event. It consists of exactly one outcome and is denoted by E with a subscript.

• Each simple event will be assigned a probability, measuring “how often” it occurs.

• The set of all simple events of an experiment is called the sample space, S.

• An Event is Complex if it consists of more than one outcome.
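
As an illustration of these definitions, the sketch below (not from the slides; equally likely outcomes are assumed and the helper name prob is hypothetical) represents the die-toss sample space and events as Python sets:

```python
# A minimal sketch: sample space and events for a single die toss as sets.
sample_space = {1, 2, 3, 4, 5, 6}        # S: all possible outcomes

A = {1, 3, 5}                             # event A: observe an odd number
B = {3, 4, 5, 6}                          # event B: observe a number greater than 2
E1 = {1}                                  # a simple event: exactly one outcome

def prob(event, space=sample_space):
    """Classical (equally likely) probability: favourable outcomes / total outcomes."""
    return len(event & space) / len(space)

print(prob(A), prob(B), prob(E1))         # 0.5, 0.666..., 0.1666...
```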


Empirical Approach to defining Probability
• Empirically, the probability of an event A is the limiting value of its relative frequency: P(A) ≈ f/n for a large number n of repetitions of the experiment
Axiomatic approach to defining probability
• The probability of an event is a non-negative real number, and its definition is based on three axioms:
  1. P(A) ≥ 0 for every event A
  2. P(S) = 1
  3. For mutually exclusive events A1, A2, …, An,
     P(A1 ∪ A2 ∪ … ∪ An) = P(A1) + P(A2) + … + P(An)
Concepts
Complement of an Event
The event A′, the complement of event A, is the event consisting of everything in the sample space S that is not contained within the event A. In all cases
  P(A) + P(A′) = 1

EXAMPLES
1. A fair die is thrown. The probability of getting an even score:
   even = { an even score is recorded on the roll of the die } = { 2, 4, 6 }
   P(even) = P(2) + P(4) + P(6) = 1/6 + 1/6 + 1/6 = 1/2

2. For a pair of fair dice, if
   A = { the sum of the scores of the two dice is equal to 6 } = { (1,5), (2,4), (3,3), (4,2), (5,1) }, then
   P(A) = 1/36 + 1/36 + 1/36 + 1/36 + 1/36 = 5/36

3. B = { at least one of the two dice records a 6 }
   P(B) = 11/36, so P(B′) = 1 − 11/36 = 25/36
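
The same probabilities can be checked by brute-force enumeration; a minimal sketch (illustrative, not from the slides):

```python
# Enumerate the 36 equally likely outcomes of two fair dice and verify the
# probabilities computed in examples 2 and 3 above.
from itertools import product

outcomes = list(product(range(1, 7), range(1, 7)))   # all (die1, die2) pairs, |S| = 36

A = [o for o in outcomes if sum(o) == 6]              # sum of scores equals 6
B = [o for o in outcomes if 6 in o]                   # at least one die shows a 6

print(len(A) / len(outcomes))                         # 5/36  ≈ 0.1389
print(len(B) / len(outcomes))                         # 11/36 ≈ 0.3056
print(1 - len(B) / len(outcomes))                     # P(B′) = 25/36 ≈ 0.6944
```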
Union of Events
• The event A ∪ B is the union of events A and B and consists of the outcomes that are contained within at least one of the events A and B. The probability of this event, P(A ∪ B), is the probability that at least one of the events A and B occurs.
• Notice that the outcomes in the event A ∪ B can be classified into three kinds:
  1. in event A but not in event B (only A)
  2. in event B but not in event A (only B)
  3. in both events A and B

  P(A ∪ B) = P(A ∩ B′) + P(A′ ∩ B) + P(A ∩ B)

Since
  P(A ∩ B′) = P(A) − P(A ∩ B)
  P(A′ ∩ B) = P(B) − P(A ∩ B)
it follows that
  P(A ∪ B) = P(A) + P(B) − P(A ∩ B)

If the events A and B are mutually exclusive, so that P(A ∩ B) = 0, then P(A ∪ B) = P(A) + P(B)
Simple results on unions of events

(A ∪ B)′ = A′ ∩ B′
(A ∩ B)′ = A′ ∪ B′
A ∪ B = B ∪ A
A ∪ A = A
A ∪ S = S
A ∪ ∅ = A
A ∪ A′ = S
A ∪ (B ∪ C) = (A ∪ B) ∪ C
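
These identities (including the two De Morgan laws at the top) can be checked directly with set operations; a minimal sketch (illustrative, not from the slides):

```python
# Verify the union identities on the die-toss sample space using Python sets.
S = {1, 2, 3, 4, 5, 6}
A, B, C = {2, 4, 6}, {4, 5, 6}, {1, 2}

complement = lambda E: S - E

assert complement(A | B) == complement(A) & complement(B)   # (A ∪ B)′ = A′ ∩ B′
assert complement(A & B) == complement(A) | complement(B)   # (A ∩ B)′ = A′ ∪ B′
assert A | B == B | A                                       # commutativity
assert A | S == S and A | set() == A
assert A | complement(A) == S
assert A | (B | C) == (A | B) | C                           # associativity
print("all identities hold on this example")
```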
Intersection of Events
The event A ∩ B is the intersection of the events A and B and consists of the outcomes that are contained within both events A and B. The probability of this event, P(A ∩ B), is the probability that both events A and B occur simultaneously.
  P(A ∩ B′) + P(A ∩ B) = P(A)
  P(A′ ∩ B) + P(A ∩ B) = P(B)
For mutually exclusive events, A ∩ B = ∅, so P(A ∩ B) = 0
Rules of Probability
There are special rules that will allow you to calculate probabilities for composite events.

The Additive Rule for Unions:

For any two events, A and B, the probability of their union is
  P(A ∪ B) = P(A) + P(B) − P(A ∩ B)

The above rule can be extended to three events:
  P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(A ∩ B) − P(A ∩ C) − P(B ∩ C) + P(A ∩ B ∩ C)
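
A minimal sketch (illustrative, not from the slides; the three events chosen here are arbitrary) checking the three-event additive rule by enumeration:

```python
# Check inclusion-exclusion for three events on the two-dice sample space.
from fractions import Fraction
from itertools import product

S = list(product(range(1, 7), range(1, 7)))
P = lambda E: Fraction(len(E), len(S))                 # equally likely outcomes

A = {o for o in S if o[0] == 6}                        # red die shows 6
B = {o for o in S if o[1] == 6}                        # blue die shows 6
C = {o for o in S if sum(o) >= 10}                     # total of at least 10

lhs = P(A | B | C)
rhs = P(A) + P(B) + P(C) - P(A & B) - P(A & C) - P(B & C) + P(A & B & C)
print(lhs, rhs, lhs == rhs)                            # both sides equal 1/3
```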


Ex 2.1
Solved Example 2.14
Ex 2.2 (answer: 0.75)
Conditional probability
Let A and B be any two events in a sample space S with P(B) ≠ 0.
The probability that A occurs, given that event B has occurred, is called the conditional probability of A given B and is defined as
  P(A | B) = P(A ∩ B) / P(B)

Similarly, if P(A) ≠ 0, then P(B | A) = P(A ∩ B) / P(A). Needless to mention, P(A ∩ B) = P(B ∩ A).
Conditional Probability (continued)
• If A ∩ B = ∅, then
  P(A | B) = P(A ∩ B) / P(B) = 0 / P(B) = 0
  This means that if two events are disjoint (mutually exclusive), the conditional probability that A occurs given that B has already occurred is 0.
• If B ⊂ A (B is contained in A), then A ∩ B = B, so
  P(A | B) = P(A ∩ B) / P(B) = P(B) / P(B) = 1
• However, if A ⊂ B, then P(A | B) = P(A) / P(B), provided P(B) ≠ 0.

Important: the concept of mutual exclusiveness is different from independence.

Mutual exclusiveness implies that if one event occurs, it prevents the occurrence of the other event in the same trial. Independence, however, means that the two events can occur together, but the occurrence of one does not influence the probability of the occurrence of the other.
Examples of Conditional Probability
• A fair die is rolled.
  P(6) = 1/6
  P(6 | even) = P(6 ∩ even) / P(even) = P(6) / P(even)
              = P(6) / [P(2) + P(4) + P(6)]
              = (1/6) / (1/6 + 1/6 + 1/6) = (1/6) / (1/2) = 1/3

• A red die and a blue die are thrown.
  A = { the red die scores a 6 },                      P(A) = 6/36 = 1/6
  B = { at least one 6 is obtained on the two dice },  P(B) = 11/36

  P(A | B) = P(A ∩ B) / P(B)
           = P(A) / P(B)        (here event A is a subset of B)
           = (1/6) / (11/36) = 6/11
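
Both conditional probabilities can be reproduced by counting outcomes; a minimal sketch (illustrative, not from the slides; the helper cond is hypothetical):

```python
# Conditional probability by enumeration: P(6 | even) = 1/3 and P(A | B) = 6/11.
from fractions import Fraction
from itertools import product

def cond(event, given):
    """P(event | given) = |event ∩ given| / |given| under equally likely outcomes."""
    return Fraction(len(event & given), len(given))

# Single die: P(6 | even)
print(cond({6}, {2, 4, 6}))                           # 1/3

# Two dice: P(red die scores 6 | at least one 6)
S = set(product(range(1, 7), range(1, 7)))
A = {o for o in S if o[0] == 6}                       # red die scores a 6
B = {o for o in S if 6 in o}                          # at least one 6
print(cond(A, B))                                     # 6/11
```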
Multiplication Law of Probability

Since
  P(A | B) = P(A ∩ B) / P(B)   and   P(B | A) = P(A ∩ B) / P(A),
for any two events A and B the probability that both A and B occur is
  P(A ∩ B) = P(B) P(A | B) = P(A) P(B | A)

Similarly,
  P(C | A ∩ B) = P(A ∩ B ∩ C) / P(A ∩ B)
  ⟹ P(A ∩ B ∩ C) = P(A ∩ B) P(C | A ∩ B) = P(A) P(B | A) P(C | A ∩ B)
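
A minimal sketch (illustrative, not from the slides) checking P(A ∩ B) = P(B) P(A | B) on the red-die / at-least-one-6 events used earlier:

```python
# Multiplication law on two dice: (11/36)(6/11) = 6/36 = P(A ∩ B).
from fractions import Fraction
from itertools import product

S = set(product(range(1, 7), range(1, 7)))
P = lambda E: Fraction(len(E), len(S))

A = {o for o in S if o[0] == 6}           # red die scores a 6
B = {o for o in S if 6 in o}              # at least one 6

P_A_given_B = P(A & B) / P(B)             # 6/11
print(P(B) * P_A_given_B == P(A & B))     # True
```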
Example of multiplication law
Example : Car Warranties

A company sells a certain type of car, which it assembles in one of four possible locations. Plant I supplies 20%; plant II, 24%; plant III, 25%; and plant IV, 31%. A customer buying a car does not know where the car has been assembled, and so the probabilities of a purchased car being from each of the four plants can be thought of as 0.20, 0.24, 0.25, and 0.31.
Each new car sold carries a 1-year bumper-to-bumper warranty.
P( claim | plant I ) = 0.05, P( claim | plant II ) = 0.11
P( claim | plant III ) = 0.03, P( claim | plant IV ) = 0.08
For example, a car assembled in plant I has a probability of 0.05 of
receiving a claim on its warranty.
Notice that claims are clearly not independent of assembly location
because these four conditional probabilities are unequal
Example of Car Warranties continued…
If A1, A2, A3, and A4 are, respectively, the events that a car is assembled in plants I, II, III, and IV, then they provide a partition of the sample space, and the probabilities P(Ai) are the supply proportions of the four plants.

B = { a claim is made }
P(B | Ai) = the claim rates for the four individual plants

P(B) = P(A1) P(B | A1) + P(A2) P(B | A2) + P(A3) P(B | A3) + P(A4) P(B | A4)
     = (0.20 × 0.05) + (0.24 × 0.11) + (0.25 × 0.03) + (0.31 × 0.08)
     = 0.0687

Given the P(Ai) and the P(B | Ai), what is P(Ai | B)?
  P(A1), …, P(An): the prior probabilities
  P(A1 | B), …, P(An | B): the posterior probabilities

  P(Ai | B) = P(Ai ∩ B) / P(B) = P(Ai) P(B | Ai) / P(B)
            = P(Ai) P(B | Ai) / [ P(A1) P(B | A1) + … + P(An) P(B | An) ]
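
The law-of-total-probability step can be reproduced in a few lines; a minimal sketch (illustrative, not from the slides; dictionary names are hypothetical):

```python
# Law of total probability for the car-warranty data:
# P(claim) = sum over plants of P(plant) * P(claim | plant).
priors = {"I": 0.20, "II": 0.24, "III": 0.25, "IV": 0.31}       # supply proportions P(Ai)
claim_rates = {"I": 0.05, "II": 0.11, "III": 0.03, "IV": 0.08}  # P(claim | Ai)

p_claim = sum(priors[p] * claim_rates[p] for p in priors)
print(round(p_claim, 4))                                         # 0.0687
```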
Bayes’ Theorem: law of posterior probabilities
Let a sample space comprise n mutually exclusive events A1, A2, …, An. For example, for the throw of a fair die the sample space is S = {1, 2, 3, 4, 5, 6}. Let B be any event defined in the sample space. Then
  S = A1 ∪ … ∪ An, with the Ai mutually exclusive
  B = (A1 ∩ B) ∪ … ∪ (An ∩ B), with the (Ai ∩ B) mutually exclusive
  P(B) = P(A1 ∩ B) + … + P(An ∩ B)
       = P(A1) P(B | A1) + … + P(An) P(B | An)
Bayes’ Theorem

If A1, A2, …, An is a partition of a sample space, then the posterior probabilities of the event Ai conditional on an event B can be obtained from the probabilities P(Ai) and P(B | Ai) using the formula

  P(Ai | B) = P(Ai) P(B | Ai) / [ P(A1) P(B | A1) + … + P(An) P(B | An) ]

The prior probabilities are
  P(plant I) = 0.20, P(plant II) = 0.24, P(plant III) = 0.25, P(plant IV) = 0.31

If a claim is made on the warranty of the car, how does this change these probabilities?

  P(plant I | claim)   = P(plant I) P(claim | plant I) / P(claim)     = (0.20 × 0.05) / 0.0687 ≈ 0.146
  P(plant II | claim)  = P(plant II) P(claim | plant II) / P(claim)   = (0.24 × 0.11) / 0.0687 ≈ 0.384
  P(plant III | claim) = P(plant III) P(claim | plant III) / P(claim) = (0.25 × 0.03) / 0.0687 ≈ 0.109
  P(plant IV | claim)  = P(plant IV) P(claim | plant IV) / P(claim)   = (0.31 × 0.08) / 0.0687 ≈ 0.361
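
A minimal sketch (illustrative, not from the slides) reproducing these posterior probabilities with Bayes' theorem:

```python
# Bayes' theorem on the car-warranty data: posterior P(plant | claim).
priors = {"I": 0.20, "II": 0.24, "III": 0.25, "IV": 0.31}       # P(plant)
claim_rates = {"I": 0.05, "II": 0.11, "III": 0.03, "IV": 0.08}  # P(claim | plant)

p_claim = sum(priors[p] * claim_rates[p] for p in priors)        # 0.0687

posteriors = {p: priors[p] * claim_rates[p] / p_claim for p in priors}
for plant, post in posteriors.items():
    print(f"P(plant {plant} | claim) = {post:.3f}")
# I: 0.146, II: 0.384, III: 0.109, IV: 0.361
```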

What if no claim is made on the warranty?


Ex 2.4
Q 58. Show that for any three events A, B and C with P(C) > 0,
  P(A ∪ B | C) = P(A | C) + P(B | C) − P(A ∩ B | C)

Note (on the components exercise): the batch contains 1 defective and 9 non-defective components. Zero defects in the sample means both sampled components come from the 9 non-defective components.
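
A minimal sketch for the note above (the sample size of 2 is inferred from "both sampled components"; adjust if the exercise specifies otherwise):

```python
# P(zero defects) when drawing 2 components without replacement from a batch
# of 10 containing 1 defective item.
from math import comb

total, defective, sample = 10, 1, 2
p_zero_defects = comb(total - defective, sample) / comb(total, sample)
print(p_zero_defects)        # C(9,2)/C(10,2) = 36/45 = 0.8
```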
Independence of Events
Independent Events
• Two events A and B are said to be independent if any one of the following holds:
  P(A | B) = P(A),  P(B | A) = P(B),  P(A ∩ B) = P(A) P(B)
  Any one of these conditions implies the other two.

• The interpretation of two events being independent is that knowledge about one event does not affect the probability of the other event.

• Intersections of Independent Events A1, A2, …, An
  The probability of the intersection of a series of independent events is simply
  P(A1 ∩ … ∩ An) = P(A1) P(A2) ··· P(An)
More Examples
• A fair die is thrown. Event A is the roll of an even number and B is the appearance of a high score, defined as a number greater than 3.
  A = even number = { 2, 4, 6 } and B = high score = { 4, 5, 6 }
  Intuitively, these two events are not independent:
  P(even) = 1/2 and P(even | high score) = 2/3
  P(A ∩ B) = P(B) P(A | B) = (3/6)(2/3) = 1/3
  P(A) P(B) = (1/2)(1/2) = 1/4
  Since P(A ∩ B) ≠ P(A) P(B), these events are not independent.

• A red die and a blue die are rolled.
  A = { the red die has an even score },  P(A) = 1/2
  B = { the blue die has an even score }, P(B) = 1/2
  The probability that both the red and the blue die have an even score is 9/36 = 1/4.
  P(A ∩ B) = P(A) P(B) = (1/2)(1/2) = 1/4, so the events are independent.
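
A minimal sketch (illustrative, not from the slides) checking independence in both examples by comparing P(A ∩ B) with P(A) P(B):

```python
# Independence check by enumeration for the two examples above.
from fractions import Fraction
from itertools import product

# One die: even vs. high score -> dependent
die = set(range(1, 7))
P1 = lambda E: Fraction(len(E), len(die))
A, B = {2, 4, 6}, {4, 5, 6}
print(P1(A & B), P1(A) * P1(B))          # 1/3 vs 1/4 -> not independent

# Two dice: red even vs. blue even -> independent
S = set(product(range(1, 7), range(1, 7)))
P2 = lambda E: Fraction(len(E), len(S))
R  = {o for o in S if o[0] % 2 == 0}     # red die even
Bl = {o for o in S if o[1] % 2 == 0}     # blue die even
print(P2(R & Bl), P2(R) * P2(Bl))        # 1/4 vs 1/4 -> independent
```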
Ex 2.5
