
Probability

Trial: Consider an experiment which, though repeated under essentially identical conditions, does not give unique results but may result in any one of several possible outcomes. Such an experiment is known as a random experiment, and each performance of it is a trial.

In other words, a trial is an attempt to produce an outcome of a random experiment. For example, throwing a die is a trial, and tossing a coin is a trial.

Event: The outcomes of an experiment are called events. E.g., if we toss an unbiased coin, we get the event head or tail; throwing an unbiased die, we get one of the events {1, 2, 3, 4, 5, 6}.

Equally likely events: A number of events are said to be equally likely if there is no reason to expect any one of the events in preference to the others. In tossing an unbiased coin, the two possible outcomes Head and Tail are equally likely. Similarly, when we throw a die, the occurrences of the numbers 1, 2, 3, 4, 5 or 6 are equally likely events.

Exhaustive events: The set of all possible outcomes in a trial constitutes the set of exhaustive
cases. In other words the totality of all possible outcomes of a random experiment will form
the exhaustive cases. For example, in the case of tossing a coin there are two exhaustive cases
head or tail. In throwing a die there are six exhaustive cases since any one of the six faces
{1, 2, 3, 4, 5, 6} may come upper most.

Mutually exclusive events: Events are said to be mutually exclusive when only one of the events can occur and two (or more) events cannot occur simultaneously. In the case of tossing a coin, Head and Tail are mutually exclusive events, because if head turns up, tail cannot happen and vice versa, i.e. Head and Tail cannot occur simultaneously.

Favourable cases: The cases which entail the occurrence of an event are said to be favourable to the event. For example, while throwing a die, the occurrence of 2, 4 or 6 are the favourable cases for the event "an even number".

Sample Space: A sample space is the set of all conceivable outcomes of a random experiment.
The sample space is usually denoted by S.

Probability: Probability may be defined as the chance with which a defined event is expected to occur out of the total possible occurrences. It is the relative frequency of the number of occurrences of a favourable event to the total number of occurrences of all possible events. If an event can occur in N mutually exclusive and equally likely ways and if n of them possess a specific characteristic E, then P(E) = n/N.

Or we can say that p = (Number of favourable outcomes)/(Number of possible outcomes).

If n = number of favourable outcomes and N = number of possible outcomes then, symbolically, P(E) = n/N. For example, if a surgeon performs renal transplants in 150 cases and succeeds in 63 cases, then the probability of survival after the operation is calculated as:

p = (Number of survivals after operation)/(Total number of cases operated) = 63/150 = 0.42,

and the chance of not surviving, i.e., the probability of dying, q = 1 − p = 1 − 0.42 = 0.58, which can be shown as:

q = (No. of cases died after the operation)/(Total number of cases operated) = 87/150 = 0.58.

For an unbiased die, the probability of getting each of 1, 2, 3, 4, 5 or 6 in one throw of the die is 1/6.

Q.: A die is rolled, find the probability that an even number is obtained.

S.: Let us first write the sample space S of the experiment S = {1, 2, 3, 4, 5, 6}

Let E be the event “an even number is obtained” and write it down E = {2, 4, 6}

We now use the formula of the probability: P(E) = n/N = 3/6 = 1/2.
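The counting behind this example can be reproduced with a short script. The snippet below is an illustrative sketch (the original notes contain no code); it simply applies P(E) = n/N by enumeration.

```python
from fractions import Fraction

# Classical probability for the die example: P(E) = n/N,
# where n counts favourable outcomes and N all possible outcomes.
S = {1, 2, 3, 4, 5, 6}                 # sample space of one roll
E = {x for x in S if x % 2 == 0}       # event "an even number is obtained"
p = Fraction(len(E), len(S))
print(p)  # 1/2
```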

Q.: Two dice are rolled, find the probability that the sum is (a) equal to 1, (b) equal to 4, (c) less than 13.

S.: The sample space S of two dice is shown below.

S = {(1,1),(1,2),(1,3),(1,4),(1,5),(1,6),(2,1),(2,2),(2,3),(2,4),(2,5),(2,6),(3,1), (3,2),(3,3),(3,4),(3,5),
(3,6),(4,1),(4,2),(4,3),(4,4),(4,5),(4,6),(5,1),(5,2),(5,3), (5,4),(5,5),(5,6),(6,1),(6,2),(6,3),(6,4),(6,5),(6,6)}
(a) Let E be the event "sum equal to 1". There are no outcomes which correspond to a sum equal to 1, hence P(E) = 0/36 = 0.

(b) Three possible outcomes give a sum equal to 4, E = {(1, 3), (2, 2), (3, 1)}, hence P(E) = 3/36 = 1/12.

(c) All possible outcomes, E = S, give a sum less than 13, hence P(E) = 36/36 = 1.
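All three answers can be checked by enumerating the 36 equally likely pairs; a small sketch (not part of the original notes) follows.

```python
from fractions import Fraction
from itertools import product

# Enumerate all 36 equally likely outcomes for two dice and count
# the favourable ones for each event in the example above.
S = list(product(range(1, 7), repeat=2))

def prob(cond):
    return Fraction(sum(1 for a, b in S if cond(a, b)), len(S))

print(prob(lambda a, b: a + b == 1))   # 0
print(prob(lambda a, b: a + b == 4))   # 1/12
print(prob(lambda a, b: a + b < 13))   # 1
```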

Q.: A die is rolled and a coin is tossed, find the probability that the die shows an odd number
and the coin shows a head.

S.: The sample space S is shown below.

S = {(1,H),(2,H),(3,H),(4,H),(5,H),(6,H),(1,T),(2,T),(3,T),(4,T),(5,T),(6,T)}

Let E be the event “the die shows an odd number and the coin shows a head”.

Event E may be described as follows E = {(1, H), (3, H), (5, H)}.

The probability P(E) is given by P(E) = 3/12 = 1/4.

Q.: The blood groups of 200 people are distributed as follows: 50 have type A blood, 65 have type B blood, 70 have type O blood and 15 have type AB blood. If a person from this group is selected at random, what is the probability that this person has type O blood?
S.: P(E) = (Frequency for O blood)/(Total frequencies) = 70/200 = 0.35.

Q.: A jar contains 3 red marbles, 7 green marbles and 10 white marbles. If a marble is drawn
from the jar at random, what is the probability that this marble is white?
S.: P(E) = (Frequency for white color)/(Total frequencies) = 10/20 = 0.5.

1. A die is rolled, find the probability that the number obtained is greater than 4.

2. Two coins are tossed, find the probability that one head only is obtained.

3. Two dice are rolled, find the probability that the sum is equal to 5.

4. A card is drawn at random from a deck of cards. Find the probability of getting the king of hearts.

5. A card is drawn at random from a deck of cards. Find the probability of getting a queen.

Some Elementary Definitions and Theorems

1. 0 ≤ P (A) ≤ 1
where P (A) is the probability of observing event A. The probability of any event or exper-
imental outcome, P (A), can not be less than 0 or greater than 1. An impossible event has a
probability of 0. A certain event has a probability of 1.

2. If events A and B are mutually exclusive, the probability of observing A or B is the sum of the probabilities of the individual events:

P (A ∪ B) = P (A) + P (B).

3. P (A) + P (B) = 1, where A and B are mutually exclusive and exhaustive. In general, the sum of the probabilities of all mutually exclusive and exhaustive events is equal to 1.

4. If two events are not mutually exclusive then P (A ∪ B) = P (A) + P (B) − P (A ∩ B).

5. If A is any event then Ā is its complement, so P (Ā) = 1 − P (A).

6. If events A and B are mutually exclusive, then P (A ∩ B) = 0.

Theorem: Probability of the impossible event is zero, i.e. P (φ) = 0.

Proof: The impossible event contains no sample point; hence the certain event S and the impossible event φ are mutually exclusive. Hence

S ∪ φ = S ⇒ P(S ∪ φ) = P(S) ⇒ P(S) + P(φ) = P(S) ⇒ P(φ) = 0.

Theorem: Probability of the complementary event Ā of A is given by P (Ā) = 1 − P (A).

Proof: A and Ā are disjoint events and A ∪ Ā = S, so

P(A ∪ Ā) = P(S) ⇒ P(A) + P(Ā) = 1 ⇒ P(Ā) = 1 − P(A).

Theorem: For any two events A and B, P(Ā ∩ B) = P(B) − P(A ∩ B).

Proof: (Ā ∩ B) and (A ∩ B) are disjoint events and (A ∩ B) ∪ (Ā ∩ B) = B, so

P(Ā ∩ B) + P(A ∩ B) = P(B) ⇒ P(Ā ∩ B) = P(B) − P(A ∩ B).

Ex. Let A be the event that a person has normotensive diastolic blood pressure(DBP) reading
(DBP < 90) and B be the event that a person has borderline DBP readings (90 ≤ DBP < 95).
Suppose that P (A) = 0.7 and P (B) = 0.1. Let Z be the event that a person has a (DBP < 95).

Sol. P(Z) = P(A) + P(B) = 0.8 because the events A and B are mutually exclusive. If we define another event C as (DBP ≥ 90), then C = Ā because C occurs exactly when A does not occur, so P(C) = P(Ā) = 1 − P(A) = 1 − 0.7 = 0.3.
Theorem: Probability of the union of any two events A and B is given by P (A ∪ B) =
P (A) + P (B) − P (A ∩ B).

Proof: P(A ∪ B) = P(A ∪ (B ∩ Ā)) = P(A) + P(B ∩ Ā), since A and (B ∩ Ā) are disjoint. By the previous theorem, P(B ∩ Ā) = P(B) − P(A ∩ B),

so P(A ∪ B) = P(A) + P(B) − P(A ∩ B).

If A, B, C are any three events, P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(A ∩ B) − P(A ∩ C) − P(B ∩ C) + P(A ∩ B ∩ C).

Ex.: Given P (A) = 0.30 and P (B) = 0.78 and P (A ∩ B) = 0.16. Find (i) P (Ā ∩ B̄) (ii)
P (Ā ∪ B̄) (iii) P (A ∩ B̄)

Sol.: (i) By De Morgan's law, Ā ∩ B̄ is the complement of A ∪ B, so P(Ā ∩ B̄) = 1 − P(A ∪ B) = 1 − {P(A) + P(B) − P(A ∩ B)} = 1 − (0.30 + 0.78 − 0.16) = 0.08.

(ii) By De Morgan's law, Ā ∪ B̄ is the complement of A ∩ B, so P(Ā ∪ B̄) = 1 − P(A ∩ B) = 1 − 0.16 = 0.84.

(iii) P(A ∩ B̄) = P(A) − P(A ∩ B) = 0.30 − 0.16 = 0.14.
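The three answers can be verified numerically; this is an illustrative sketch (the variable names are ours, not from the notes).

```python
# Verifying the answers from P(A) = 0.30, P(B) = 0.78, P(A ∩ B) = 0.16.
pA, pB, pAB = 0.30, 0.78, 0.16

p_union = pA + pB - pAB          # addition theorem: P(A ∪ B)
p_comp_union = 1 - p_union       # De Morgan: P(A-bar ∩ B-bar)
p_comp_inter = 1 - pAB           # De Morgan: P(A-bar ∪ B-bar)
p_A_not_B = pA - pAB             # P(A ∩ B-bar)

print(round(p_comp_union, 2))    # 0.08
print(round(p_comp_inter, 2))    # 0.84
print(round(p_A_not_B, 2))       # 0.14
```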

Ex.: The probability that a student passes statistics test is 2/3 and the probability that he
passes both statistics and Mathematics test is 14/45. The probability that he passes at least
one test is 4/5. What is the probability that he passes Mathematics test?

Sol.: Define, A the student passes statistics test. B the student passes Mathematics test. Given

P (A) = 2/3, P (A ∩ B) = 14/45, P (A ∪ B) = 4/5 so find P (B) =?.

By the addition theorem, P(A ∪ B) = P(A) + P(B) − P(A ∩ B), so 4/5 = 2/3 + P(B) − 14/45, giving P(B) = 4/5 − 2/3 + 14/45 = 20/45 = 4/9.
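Recomputing with exact fractions confirms the value of P(B) (a quick check, not part of the original notes):

```python
from fractions import Fraction

# Rearranged addition theorem:
# P(B) = P(A ∪ B) - P(A) + P(A ∩ B).
pA, pAB, pAuB = Fraction(2, 3), Fraction(14, 45), Fraction(4, 5)
pB = pAuB - pA + pAB
print(pB)  # 4/9
```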

CONDITIONAL PROBABILITY

Let A and B be any two events. The probability of the event A given that the event B has already occurred, or the conditional probability of A given B, denoted by P(A|B), is defined as

P(A|B) = P(A ∩ B)/P(B), P(B) ≠ 0.

Similarly, the conditional probability of B given A is defined as

P(B|A) = P(A ∩ B)/P(A), P(A) ≠ 0.
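A small counting example (an illustrative sketch, not from the original notes): for one fair die, the probability of a 2 given that the outcome is even.

```python
from fractions import Fraction

# Conditional probability by counting: P(A|B) = P(A ∩ B)/P(B).
S = {1, 2, 3, 4, 5, 6}
A = {2}                               # "a 2 is obtained"
B = {x for x in S if x % 2 == 0}      # "an even number is obtained"

p_A_given_B = Fraction(len(A & B), len(S)) / Fraction(len(B), len(S))
print(p_A_given_B)  # 1/3
```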
Multiplication law of probability For any two events A and B,

P(A ∩ B) = P(A)P(B|A), P(A) > 0,
         = P(B)P(A|B), P(B) > 0,

where P(A|B) and P(B|A) are the conditional probabilities of A given B and of B given A respectively.

Independence of two events A and B

An event A is said to be independent (statistically independent) of event B if the conditional probability of A given B, i.e., P(A|B), is equal to the unconditional probability of A. In symbols, P(A|B) = P(A).

Similarly if the event B is independent of A, we must have P (B|A) = P (B).

Since P(A ∩ B) = P(A)P(B|A), and since P(B|A) = P(B) when B is independent of A, we must have P(A ∩ B) = P(A)P(B).

Hence, the events A and B are independent if P (A ∩ B) = P (A)P (B).

A set of events A1, A2, ..., An is said to be mutually independent if P(Ai1 ∩ Ai2 ∩ ... ∩ Aik) = P(Ai1)P(Ai2) ... P(Aik) for every subset {Ai1, Ai2, ..., Aik} of {A1, A2, ..., An}.

For example, three events A, B and C are said to be mutually independent if P(A ∩ B) = P(A)P(B), P(B ∩ C) = P(B)P(C), P(A ∩ C) = P(A)P(C) and P(A ∩ B ∩ C) = P(A)P(B)P(C).

Theorem If A and B are two independent events then (i) A and B̄ are independent (ii) Ā
and B are independent (iii) Ā and B̄ are independent.

Proof Since A and B are two independent events, P(A ∩ B) = P(A)P(B).

(i) P (A ∩ B̄) = P (A)P (B̄|A) = P (A){1 − P (B|A)} = P (A){1 − P (B)} = P (A)P (B̄) i.e. A
and B̄ are independent.

(ii) P (Ā ∩ B) = P (B)P (Ā|B) = P (B){1 − P (A|B)} = P (B){1 − P (A)} = P (B)P (Ā) i.e.
B and Ā are independent.
(iii) By De Morgan's law, Ā ∩ B̄ is the complement of A ∪ B, so P(Ā ∩ B̄) = 1 − P(A ∪ B) = 1 − {P(A) + P(B) − P(A ∩ B)} = 1 − P(A) − P(B) + P(A)P(B), since P(A ∩ B) = P(A)P(B),

= (1 − P(A)) − P(B)(1 − P(A)) = (1 − P(A))(1 − P(B)) = P(Ā)P(B̄), i.e. Ā and B̄ are independent.
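Independence and the complement property can be checked by enumeration on a fair die (an illustrative sketch; the chosen events A and B are ours):

```python
from fractions import Fraction

# Checking P(A ∩ B) = P(A)P(B) on a fair die, together with the
# theorem's claim that complements inherit independence.
S = set(range(1, 7))
A = {2, 4, 6}    # "even"
B = {1, 2}       # "at most 2"

def P(E):
    return Fraction(len(E), len(S))

print(P(A & B) == P(A) * P(B))                      # True: A, B independent
print(P((S - A) & (S - B)) == P(S - A) * P(S - B))  # True: A-bar, B-bar too
```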

Bayes Theorem

Theorem If E1, E2, ..., En are mutually disjoint events with P(Ei) ≠ 0 (i = 1, 2, ..., n), then for any arbitrary event A which is a subset of ∪_{i=1}^n Ei such that P(A) > 0, we have

P(Ei|A) = P(Ei)P(A|Ei) / Σ_{k=1}^n P(Ek)P(A|Ek),  i = 1, 2, ..., n.

Proof: Since A ⊂ ∪_{i=1}^n Ei, we have A = A ∩ (∪_{i=1}^n Ei) = ∪_{i=1}^n (A ∩ Ei).

Since (A ∩ Ei) ⊂ Ei (i = 1, 2, ..., n) are mutually disjoint events, we have by the addition theorem of probability

P(A) = P(∪_{i=1}^n (A ∩ Ei)) = Σ_{i=1}^n P(A ∩ Ei) = Σ_{i=1}^n P(Ei)P(A|Ei),

by the compound theorem of probability.

Also we have P(A ∩ Ei) = P(A)P(Ei|A), so

P(Ei|A) = P(A ∩ Ei)/P(A) = P(Ei)P(A|Ei) / Σ_{k=1}^n P(Ek)P(A|Ek).
Ex. Let A and B be two events associated with an experiment and suppose P (A) = 0.5
while P (A ∪ B) = 0.8. For what value of P (B) are (i) A and B mutually exclusive (ii) A and
B independent.

Sol. Given P (A) = 0.5, P (A ∪ B) = 0.8.

(i) A and B mutually exclusive then P (A ∪ B) = P (A) + P (B)


0.8 = 0.5 + P (B) ⇒ P (B) = 0.3.

(ii) A and B independent, then

P(A ∪ B) = P(A) + P(B) − P(A ∩ B) = P(A) + P(B) − P(A)P(B)

0.8 = 0.5 + P(B) − 0.5 × P(B) ⇒ P(B) = 3/5 = 0.6.

Ex. If A and B are two events such that P(A) = 1/3, P(B) = 1/4 and P(A ∩ B) = 1/8, find P(A|B) and P(A|B̄).

Sol. Given P(A) = 1/3, P(B) = 1/4, P(A ∩ B) = 1/8.

P(A|B) = P(A ∩ B)/P(B) = (1/8)/(1/4) = 1/2 = 0.5.

P(A|B̄) = P(A ∩ B̄)/P(B̄) = (P(A) − P(A ∩ B))/(1 − P(B)) = (1/3 − 1/8)/(1 − 1/4) = (5/24)/(3/4) = 5/18.
Ex. A husband and wife appear in an interview for two vacancies in a firm. The probability of the husband's selection is 1/7 and that of the wife's selection is 1/5. What is the probability that

(i) both of them will be selected. (ii) only one of them will be selected. (iii) none of them
will be selected.

Sol. Let us define the events as, A: the husband gets selected and B: the wife gets selected, and assume the selections are independent. Given P(A) = 1/7, P(B) = 1/5, P(Ā) = 1 − 1/7 = 6/7, P(B̄) = 1 − 1/5 = 4/5.

(i) P(both of them will be selected) = P(A ∩ B) = P(A)P(B) = (1/7)(1/5) = 1/35.

(ii) P(only one of them will be selected) = P((A ∩ B̄) ∪ (Ā ∩ B)) = P(A ∩ B̄) + P(Ā ∩ B) = P(A)P(B̄) + P(Ā)P(B) = (1/7)(4/5) + (6/7)(1/5) = 10/35 = 2/7.

(iii) P(none of them will be selected) = P(Ā ∩ B̄) = P(Ā)P(B̄) = (6/7)(4/5) = 24/35.

Ex. A problem in statistics is given to 3 students A, B and C whose chances of solving it


are 1/2, 3/4 and 1/4 respectively. What is the probability that the problem will be solved?

Sol. Let us define the event as: A : the problem is solved by the student A; B : the problem
is solved by the student B; C : the problem is solved by the student C; P (A) = 1/2, P (B) =
3/4, P (C) = 1/4,

The problem will be solved if at least one of them solves the problem. That means we have
to find P (A ∪ B ∪ C)

P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(A ∩ B) − P(B ∩ C) − P(A ∩ C) + P(A ∩ B ∩ C)

= P(A) + P(B) + P(C) − P(A)P(B) − P(B)P(C) − P(A)P(C) + P(A)P(B)P(C)

= (1/2) + (3/4) + (1/4) − (1/2)(3/4) − (3/4)(1/4) − (1/2)(1/4) + (1/2)(3/4)(1/4) = 29/32.
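The same answer falls out of the complement rule: under independence the problem goes unsolved only if all three students fail. A quick sketch (not from the original notes):

```python
from fractions import Fraction

# P(at least one solves) = 1 - P(nobody solves), with independent students.
pA, pB, pC = Fraction(1, 2), Fraction(3, 4), Fraction(1, 4)
p_solved = 1 - (1 - pA) * (1 - pB) * (1 - pC)
print(p_solved)  # 29/32
```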

Ex. There are two bags I and II. Bag I contains 3 white and 4 black balls and Bag II contains 5 white and 6 black balls. One ball is selected at random from one of the bags and is found to be white. Find the probability that it was drawn from bag I.

Sol. Let E1 and E2 be the events of selecting bag I and bag II respectively. Then P(E1) = 1/2 and P(E2) = 1/2.

Let W be the event of drawing a white ball and B be the event of drawing a black ball.

Then P(W|E1) = probability of selecting a white ball from Bag I = 3/7,

and P(W|E2) = probability of selecting a white ball from Bag II = 5/11.

By Bayes' theorem,

P(E1|W) = P(E1)P(W|E1) / [P(E1)P(W|E1) + P(E2)P(W|E2)] = (1/2)(3/7) / [(1/2)(3/7) + (1/2)(5/11)] = 33/68.
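Bayes' theorem is easy to implement as a posterior function; a minimal sketch (the function name `posterior` is ours) checked against the bag example:

```python
from fractions import Fraction

def posterior(priors, likelihoods, i):
    """Bayes' theorem: P(E_i | A) from priors P(E_k) and likelihoods P(A | E_k)."""
    total = sum(p * l for p, l in zip(priors, likelihoods))  # P(A), total probability
    return priors[i] * likelihoods[i] / total

# Bag example: P(E1) = P(E2) = 1/2, P(W|E1) = 3/7, P(W|E2) = 5/11.
priors = [Fraction(1, 2), Fraction(1, 2)]
liks = [Fraction(3, 7), Fraction(5, 11)]
print(posterior(priors, liks, 0))  # 33/68
```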

Ex. Two urns I and II contain respectively 3 white and 2 black balls, and 2 white and 4 black balls. One ball is transferred from urn I to urn II and then one ball is drawn from the latter. It happens to be white. What is the probability that the transferred ball was white?

Sol. Define B1: transfer a white ball from Urn I to Urn II; B2: transfer a black ball from Urn I to Urn II; A: select a white ball from Urn II. Here P(B1) = 3/5, P(B2) = 2/5, P(A|B1) = 3/7, P(A|B2) = 2/7 and P(B1|A) = ?

By Bayes' theorem,

P(B1|A) = P(B1)P(A|B1) / [P(B1)P(A|B1) + P(B2)P(A|B2)] = (3/5)(3/7) / [(3/5)(3/7) + (2/5)(2/7)] = 9/13.
RANDOM VARIABLE AND PROBABILITY
DISTRIBUTIONS
Random Variable: A random variable (r.v.) is a real valued function defined over the sample
space. So its domain of definition is the sample space S and range is the real line extending
from −∞ to +∞.

In other words a r.v. is a mapping from sample space to real numbers.

Random variables are also called chance variables or stochastic variables. It is denoted by X
or X(ω).

In symbols, X : S → R(−∞, +∞)

Random Variables are of two types (i) Discrete (ii) Continuous.

A random variable X is said to be discrete if its range includes a finite number of values or a countably infinite number of values. The possible values of a discrete random variable can be labelled as x1, x2, x3, ...; e.g., the number of defective articles produced in a factory in a day, the number of deaths due to road accidents in a city in a day, the number of patients arriving at a doctor's clinic in a day, etc.

A random variable which is not discrete is said to be continuous. That means it can assume infinitely many values from a specified interval of the form [a, b].

A few examples of continuous random variables are given below:

(1) X represents the height of a student randomly chosen from a college; (2) the lifetime of a tube; etc.

Note that r.v.s are denoted by capital letters X, Y, Z, etc., and the corresponding small letters are used to denote the values of a r.v.

Probability Mass Function Suppose X is a one-dimensional discrete random variable taking at most a countably infinite number of values x1, x2, .... With each possible outcome xi we associate a number pi = P(X = xi) = p(xi), called the probability of xi. The numbers p(xi), i = 1, 2, ..., must satisfy the following conditions:

(i) p(xi) ≥ 0 for all i,  (ii) Σ_i p(xi) = 1.

This function p is called the probability mass function of the random variable X, and the set {(xi, p(xi))} is called the probability distribution (p.d.) of the r.v. X.
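The two pmf conditions can be expressed directly in code. A minimal sketch (the helper `is_pmf` is ours, for illustration only):

```python
from fractions import Fraction

def is_pmf(p):
    """Check the two pmf conditions: p(x) >= 0 for all x, and sum of p(x) = 1."""
    return all(v >= 0 for v in p.values()) and sum(p.values()) == 1

fair_die = {x: Fraction(1, 6) for x in range(1, 7)}
print(is_pmf(fair_die))                                # True
print(is_pmf({0: Fraction(1, 2), 1: Fraction(1, 3)}))  # False: sums to 5/6
```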

Probability Density Function Let X be a continuous random variable, and consider the small interval (x, x + dx) of length dx around the point x. Let f(x) be a continuous function of x such that f(x)dx represents the probability that X falls in the infinitesimal interval (x, x + dx). Symbolically, P(x ≤ X ≤ x + dx) = f(x)dx. Then f(x) is called the probability density function (pdf) of the continuous r.v. X, provided it satisfies the conditions (i) f(x) ≥ 0, (ii) ∫_{−∞}^{∞} f(x)dx = 1.

Distribution function For any random variable X, the function of the real variable x defined
as Fx (x) = P (X ≤ x) is called cumulative probability distribution function or simply cumula-
tive distribution function (cdf)of X. We can note that the probability distribution of a random
variable X is determined by its distribution function.

If X is a discrete r.v. with pmf p(x), then the cumulative distribution function is defined as F_X(x) = P(X ≤ x) = Σ_{t ≤ x} p(t).

If X is a continuous r.v. with pdf f(x), then the cumulative distribution function is defined as F_X(x) = P(X ≤ x) = ∫_{−∞}^{x} f(t)dt.

Properties of distribution function If F(x) is the distribution function of a r.v. X then it has the following properties.

• F (x) is defined for all real values of x.

• F (−∞) = 0 and F (∞) = 1.

• 0 ≤ F (x) ≤ 1.

• F (x) is a continuous function of x on the right.

Ex. Obtain the probability distribution of the number of heads when three coins are tossed
together?

Sol. When three coins are tossed, the sample space is given by S = {HHH, HHT, HTH, THH, HTT, THT, TTH, TTT}. Here the r.v. X, defined as the number of heads obtained, takes the values 0, 1, 2 and 3 from the real line w.r.t. each outcome in S. Since the eight outcomes are equally likely, we can assign probabilities to each value of the r.v. as follows:

P(X = 0) = 1/8, P(X = 1) = 3/8, P(X = 2) = 3/8, P(X = 3) = 1/8.
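The same distribution can be derived by enumerating the 8 equally likely outcomes (an illustrative sketch, not part of the original notes):

```python
from fractions import Fraction
from collections import Counter
from itertools import product

# Distribution of X = number of heads when three fair coins are tossed.
outcomes = list(product("HT", repeat=3))            # 8 equally likely outcomes
counts = Counter(o.count("H") for o in outcomes)    # how many outcomes give each X
pmf = {x: Fraction(c, len(outcomes)) for x, c in sorted(counts.items())}
for x, p in pmf.items():
    print(x, p)
```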
