
Elements of Probability
TEXT BOOKS:
1. "Probability, Random Variables and Random Signal Principles" by Peyton Peebles.
2. "A First Course in Probability" by Sheldon Ross.
3. "Probability, Random Variables and Stochastic Processes" by Athanasios Papoulis & S. Pillai.
4. "Probability, Random Variables and Random Processes" (Schaum's Outline Series) by Hwei Hsu.
MOTIVATION:

• The term PROBABILITY refers to the study of randomness and uncertainty.
• In any situation in which one of a number of possible outcomes may occur, the discipline of probability provides methods for quantifying the chances, or likelihoods, associated with the various outcomes.
• The language of probability is constantly used in an informal manner in both written and spoken contexts.
• Examples include such statements as:
"It is likely that the Paytm share value will increase by the end of the year,"
"There is a 50–50 chance that the incumbent will seek re-election,"
"There will probably be at least one section of that course offered next year,"
"The odds favor a quick settlement of the strike," and
"It is expected that at least 20,000 concert tickets will be sold."

Certain, Impossible and Random Events:

Proposition: In every realization of a set of conditions Ω, there occurs an event A.
Example: If water at atmospheric pressure (760 mm of Hg) is heated above 100°C (the set of conditions Ω), it is transformed into steam (event A).
Or: for any chemical reaction of substances, without exchange with the surrounding medium (the set of conditions Ω), the total quantity of substance (matter) remains unchanged (event A) — this is the law of conservation of matter.
An event A that unavoidably occurs for every realization of a set of conditions Ω is called certain (sure).
If the event A definitely cannot occur for any realization of a set of conditions Ω, it is called impossible.
If the event A may or may not occur for a realization of the set of conditions Ω, it is called random.
Example:
A bag contains 5 red balls and 3 green balls. If a ball is chosen at random from the bag, which of the following is an impossible event?
• {Picking a red ball or a green ball} is a certain event.
• {Picking a green ball} is a random event.
• {Picking a red ball} is a random event.
• {Picking a yellow ball} is an impossible event.

Conclusion:
Any event can be considered certain, impossible or random only with respect to some definite set of conditions.
Mere assertion of the randomness of any event A is of very restricted interest; it simply amounts to stating that the set of conditions Ω does not reflect the entire collection of reasons necessary and sufficient for the occurrence of A.

Modeling Random Experiments:

A random experiment is any activity or process whose outcome is subject to uncertainty.
Random experiment outcomes are non-deterministic: it is never possible to predict an outcome accurately, in spite of prior knowledge of the outcomes of a large number of performances.
Although the word "experiment" generally suggests a planned or carefully controlled laboratory testing situation, we use it here in a much wider sense.
Thus experiments that may be of interest include:

• tossing a coin once or several times,
• selecting a card or cards from a deck,
• ascertaining the commuting time from home to work on a particular morning,
• obtaining blood types from a group of individuals,
• measuring the compressive strengths of different steel beams made to the same specifications,
• observing the failure-free operation time of a system,
• observing the repair time of a system.
Communication Systems:
Communication systems play a central role in our lives. Every day, we use our cell phones, access the internet, use our TV remote controls, and so on. Each of these systems relies on transferring information from one place to another. For example, when you talk on the phone, what you say is converted to a sequence of 0's and 1's called information bits. These information bits are then transmitted by your cell phone antenna to a nearby cell tower, as shown in the figure.
[Figure: Transmission of data from a cell phone to a cell tower.]
The problem that communication engineers must consider is that the transmission is always affected by noise. That is, some of the bits received at the cell tower are incorrect. For example, your cell phone may transmit the sequence "010010…," while the sequence "010110…" might be received at the cell tower. In this case, the 4th bit is incorrect. Errors like this could affect the quality of the audio in your phone conversation.
The noise in the transmission is a random phenomenon. Before sending the transmission we do not know which bits will be affected. It is as if someone tosses a (biased) coin for each bit and decides whether or not that bit will be received in error. Probability theory is used extensively in the design of modern communication systems in order to understand the behavior of noise in these systems and to take measures to correct the errors.
PROBABILITY
1. Classical or A Priori Definition:
The classical definition of probability reduces the concept of probability to the notion of equal probability (likelihood) of events, which becomes a basic assumption. Further, the probability P(A) of an event A is determined a priori, without experimentation.
Thus, if a random experiment has N exhaustive and equally likely outcomes, and N_A of these are favourable to the event A, the probability of event A is computed a priori as

P(A) = N_A / N.     (1)

Example:
Throwing an unbiased die and observing the number on the face that shows up.
S, the set of all possible outcomes, is called the sample space:
S = { x1, x2, x3, x4, x5, x6 }
The probability of each number x showing up is P(x) = 1/6.
That is, P(x1) = P(x2) = P(x3) = P(x4) = P(x5) = P(x6) = 1/6.
Example:
Throwing a pair of unbiased dice and observing the sum of the numbers faced up.
S has 6² = 36 elements, shown below.
• Each possible outcome corresponds to the sum taking an integral value from 2 to 12.
• Consider the event of getting a "nine".
• The table shows that this event can be achieved in 4 ways. Hence Pr(9) = 4/36 = 1/9.
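The a priori computation above can be checked by brute-force enumeration. A minimal sketch in Python (variable names are mine):

```python
from itertools import product

# Enumerate all 36 equally likely outcomes of two dice and count
# those favourable to the event "sum equals 9".
outcomes = list(product(range(1, 7), repeat=2))        # N = 36 exhaustive outcomes
favourable = [o for o in outcomes if sum(o) == 9]      # (3,6), (4,5), (5,4), (6,3)

p_nine = len(favourable) / len(outcomes)               # P(A) = N_A / N, eq. (1)
print(len(favourable), p_nine)                         # 4 favourable outcomes, p = 1/9
```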
Defects of the Classical Definition:
(i) It is based on the feasibility of having exhaustive and equally likely outcomes, which is not feasible in many practical settings. It is restricted to coin tossing, die throwing and other games of chance. Here "equally likely" means "equally probable".
(ii) The definition tries to define probability in terms of equal probability. How do we know whether the probabilities (likelihoods) are equal unless they are measured? The definition is thus circular in nature.
(iii) The formula suggests that it is important to be able to count elements in sets. If the sets are small, this is an easy task; however, if the sets are large, this could be a difficult job.

2. Probability as a Measure of Frequency of Occurrence (Statistical Definition of Probability):

Statistical Regularity:
In order to be useful, a model must enable us to make predictions about the future behavior of a phenomenon, and in order to be predictable, a phenomenon must exhibit regularity in its behavior. Many probability models in engineering are based on the fact that averages obtained in long sequences of repetitions (trials) of random experiments consistently yield approximately the same value. This property is called statistical regularity.
That such regularity is reasonable is based purely on the fact that many investigators of numerous physical experiments have observed such regularity.
• Suppose a random experiment is performed n times.
• A certain event A occurs n_A times.
• The relative frequency (ν) of occurrence of A is n_A / n (the average number of successes).
• ν tends to stabilize and approach a number Pr(A) as n increases without limit.
• The probability of occurrence of A is

P(A) = lim (n → ∞) n_A / n.

• In practice, n is large but finite.
• Quite clearly, since n_A ≤ n, 0 ≤ P(A) ≤ 1.
Example:
Probability of the event {finding a defective PCB} ≈ 0.038.

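The stabilizing behaviour of the relative frequency can be illustrated with a short simulation. This is a sketch: the 0.038 defect rate is taken from the PCB example above, and the function name is mine.

```python
import random

random.seed(0)

p_true = 0.038  # assumed true probability of a defective PCB (from the example)

def relative_frequency(n, p=p_true):
    """Run n Bernoulli trials and return the relative frequency n_A / n."""
    n_defective = sum(1 for _ in range(n) if random.random() < p)
    return n_defective / n

# As n grows, the estimate stabilizes near 0.038 (statistical regularity).
for n in (100, 10_000, 100_000):
    print(n, relative_frequency(n))
```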

AXIOMS OF PROBABILITY:

Axiom # 1 : Pr(A) ≥ 0.

Axiom # 2 : Pr(S) = 1.
The all-encompassing event S should have the highest possible probability, which is 1.

Axiom # 3 : Let A1, A2, A3, … be mutually exclusive (disjoint) events defined on S. Then

Pr(A1 ∪ A2 ∪ A3 ∪ …) = Pr(A1) + Pr(A2) + Pr(A3) + …

The probability of a union of non-overlapping events is the sum of the probabilities of the individual events. The axiom can also be stated using the following alternative symbolic representation:

Pr(∪i Ai) = Σi Pr(Ai), where Ai ∩ Aj = ∅ for i ≠ j.

JOINT PROBABILITY
Notation:
1. The complement of an event A, denoted by Aᶜ or A′ or Ā, is the set of all outcomes in S that are not contained in A. Thus Aᶜ may be called an event contrary to event A.
2. The union of two events A and B, denoted by A ∪ B and read "A or B," is the event consisting of all outcomes that are either in A or in B or in both events. So the union includes outcomes for which both A and B occur as well as outcomes for which exactly one occurs, that is, all outcomes in at least one of the events.
3. The intersection of two events A and B, denoted by A ∩ B and read "A and B," is the event consisting of all outcomes that are in both A and B.
• Since events have the properties of sets, when working with events, intersection means "and", and union means "or". The probability of the intersection of A and B, P(A∩B), is sometimes written P(A,B) or P(AB).
We know from a study of the Venn diagram that

P(A∪B) = P(A) + P(B) − P(A∩B).     (4)

Equivalently,

P(A∩B) = P(A) + P(B) − P(A∪B).     (5)
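Eq. (4) can be verified numerically on two fair dice. A sketch, with the events A and B chosen by me for illustration:

```python
from itertools import product
from fractions import Fraction

# Sample space of two fair dice, and two overlapping events.
S = list(product(range(1, 7), repeat=2))
A = {o for o in S if o[0] % 2 == 0}     # first die shows an even number
B = {o for o in S if sum(o) >= 10}      # the sum is at least 10

P = lambda E: Fraction(len(E), len(S))  # classical probability |E| / |S|

# Inclusion-exclusion: P(A∪B) = P(A) + P(B) − P(A∩B).
assert P(A | B) == P(A) + P(B) - P(A & B)
print(P(A), P(B), P(A & B), P(A | B))   # 1/2 1/6 1/9 5/9
```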

Example:
• An inspector checks an assortment of 1000 resistors of different resistance values and wattages, which are thoroughly mixed.
• The numbers of resistors with the different resistance ratings and wattages are tabulated below.

Power Rating    1Ω     10Ω    100Ω    1000Ω    Total
1W              50     300     90       0       440
2W              50      50      0      100      200
5W               0     150     60      150      360
Total           100    500    150      250     1000

The total number of resistors is n = 1000.
• The inspector pulls out 1 resistor at random from the assortment.
Let A = the event of pulling out a 10Ω resistor, and
B = the event of pulling out a 5W resistor.
Obviously A and B are not mutually exclusive.
From the table it is clear that the probability of pulling out a 10Ω resistor is

P(A) = 500/1000 = 0.5.

The joint probability of pulling out a 10Ω, 5W resistor is

P(A∩B) = 150/1000 = 0.15.

Note: some of the joint probabilities are 0. For example, P(1Ω ∩ 5W) = 0, since this combination does not exist.
Obviously, for 2 mutually exclusive events, the joint probability is 0.
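The table lookup can be sketched in Python; the dictionary layout is mine, the counts are from the table above:

```python
from fractions import Fraction

# counts[(wattage, ohms)] = number of resistors in the assortment
counts = {
    ('1W', 1): 50,  ('1W', 10): 300, ('1W', 100): 90, ('1W', 1000): 0,
    ('2W', 1): 50,  ('2W', 10): 50,  ('2W', 100): 0,  ('2W', 1000): 100,
    ('5W', 1): 0,   ('5W', 10): 150, ('5W', 100): 60, ('5W', 1000): 150,
}
n = sum(counts.values())  # 1000 resistors in total

p_A = Fraction(sum(v for (w, r), v in counts.items() if r == 10), n)  # P(10Ω)
p_AB = Fraction(counts[('5W', 10)], n)     # joint probability P(10Ω ∩ 5W)
p_impossible = Fraction(counts[('5W', 1)], n)  # P(1Ω ∩ 5W): combination absent

print(p_A, p_AB, p_impossible)  # 1/2 3/20 0
```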

CONDITIONAL PROBABILITY:
• Consider the ratio n_AB / n_A: the relative frequency with which A and B occur together, given that A has occurred.
• Note:

n_AB / n_A = (n_AB / n) / (n_A / n).     (6)

• This empirical concept suggests the introduction of a conditional probability measure defined by

P(B|A) = P(A∩B) / P(A),     (7a)

the probability of occurrence of B, given that A has occurred.
In a similar manner,

P(A|B) = P(A∩B) / P(B).     (7b)

• In the example of drawing a resistor from the assortment, consider the conditional probability of selecting a 10Ω resistor when it is already known that the chosen resistor is 5W. There are 360 resistors of 5W, and 150 of them are 10Ω, so we have

P(A|B) = 150/360 = P(A∩B)/P(B) = 0.15/0.36 = 5/12.

The product of a conditional probability and the corresponding marginal probability is indeed the joint probability.
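The two routes to P(A|B) in eq. (7b) — the direct count 150/360 and the ratio of probabilities — can be checked exactly. A sketch using the counts from the resistor example:

```python
from fractions import Fraction

# Counts from the resistor example: n total, n_B five-watt, n_AB ten-ohm AND five-watt.
n, n_B, n_AB = 1000, 360, 150

p_B = Fraction(n_B, n)        # marginal P(B) = P(5W) = 360/1000
p_AB = Fraction(n_AB, n)      # joint P(A∩B) = P(10Ω ∩ 5W) = 150/1000
p_A_given_B = p_AB / p_B      # conditional probability, eq. (7b)

# Same result as counting directly within the 5W resistors.
assert p_A_given_B == Fraction(n_AB, n_B)
print(p_A_given_B)            # 5/12

# The product of the conditional and the marginal recovers the joint.
assert p_A_given_B * p_B == p_AB
```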

STATISTICAL INDEPENDENCE:
Let 2 events A and B have nonzero probabilities of occurrence, i.e.,

P(A) ≠ 0 and P(B) ≠ 0.

• A and B are statistically independent if the probability of occurrence of one event is not affected by that of the other. This is equivalent to requiring

P(B|A) = P(B)  and  P(A|B) = P(A).     (8)

By (7a), for the s-independence of A and B,

P(A∩B) = P(A) P(B).     (10)

Thus, s-independence also means that the joint probability of the events equals the product of the marginal probabilities.
• The condition expressed by eq. (8) or (10) is not only necessary but also sufficient for s-independence.
• Recall that the joint probability of 2 mutually exclusive events is 0, i.e.,

P(A∩B) = 0.

• Two events having nonzero probabilities cannot be both s-independent and mutually exclusive.
• Hence, for 2 events to be s-independent, they must have a nonempty intersection:

A ∩ B ≠ ∅.
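Both sides of the argument can be checked on two fair dice. A sketch with events of my choosing: A and B below are independent, while A and C are mutually exclusive, hence not independent.

```python
from itertools import product
from fractions import Fraction

S = list(product(range(1, 7), repeat=2))
P = lambda E: Fraction(len(E), len(S))

A = {o for o in S if o[0] == 6}   # first die shows 6
B = {o for o in S if o[1] == 6}   # second die shows 6

# Independence, eq. (10): P(A∩B) = P(A)·P(B) = 1/6 · 1/6 = 1/36.
assert P(A & B) == P(A) * P(B)

C = {o for o in S if o[0] == 1}   # first die shows 1: disjoint from A
# Mutually exclusive events with nonzero probabilities are NOT independent:
assert P(A & C) == 0
assert P(A & C) != P(A) * P(C)
```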

OTHER RULES OF PROBABILITY:

1. The probability of the complement, i.e. of A not occurring, is

P(Ā) = 1 − P(A).

2. If and only if events A and B are s-independent, then

P(A|B) = P(A)  and  P(B|A) = P(B),

i.e., P(A) is unrelated to whether or not B occurs, and vice versa.

3. The probability that at least one of 2 s-independent events A and B occurs is

P(A+B) = P(A) + P(B) − P(A)P(B).

For the derivation of the above equation, consider a system of two units A and B connected in parallel:
• Here either A, or B, or both A and B must work for the system to work.
• Let P(A) be the probability of A working successfully.
• Let P(B) be the probability of B working successfully.
• If the system success probability is P_S, then the failure probability is P_f = 1 − P_S.
• P_f is the joint probability of A and B both failing, i.e.

P_f = (1 − P(A))(1 − P(B)),

so that P_S = P(A+B) = 1 − (1 − P(A))(1 − P(B)) = P(A) + P(B) − P(A)P(B).

4. If A and B are mutually exclusive, i.e. A and B cannot occur simultaneously, then

P(AB) = 0     (15)

and

P(A+B) = P(A) + P(B).     (15a)

5. If multiple, mutually exclusive outcomes B1, B2, … jointly give a probability of occurrence of A, then

P(A) = Σi P(A|Bi) P(Bi).

6. From eqn (7) we have

P(A|B) = P(B|A) P(A) / P(B).

This is a simple form of Bayes' theorem. A more general form is

P(Bi|A) = P(A|Bi) P(Bi) / Σj P(A|Bj) P(Bj).
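Bayes' theorem can be illustrated on the noisy-channel idea from the communication-systems discussion. The numbers here are mine, chosen for illustration: a bit is 0 or 1 with equal probability, and the channel flips it with probability 0.1.

```python
p_sent_1 = 0.5   # assumed prior: P(sent 1)
p_flip = 0.1     # assumed channel bit-flip probability

# Total probability (rule 5):
# P(recv 1) = P(recv 1 | sent 1)·P(sent 1) + P(recv 1 | sent 0)·P(sent 0)
p_recv_1 = (1 - p_flip) * p_sent_1 + p_flip * (1 - p_sent_1)

# Bayes (rule 6): P(sent 1 | recv 1) = P(recv 1 | sent 1)·P(sent 1) / P(recv 1)
p_sent1_given_recv1 = (1 - p_flip) * p_sent_1 / p_recv_1

print(p_sent1_given_recv1)  # ≈ 0.9: a received 1 most likely really was a 1
```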

CONCEPT OF RANDOM VARIABLE:

We often summarize the outcome of a random experiment by a simple number. In many of the examples of random experiments that we have considered, S has been a description of possible outcomes. In some cases, descriptions of outcomes are sufficient, but in other cases, it is useful to associate a number with each outcome in S. Because the particular outcome of the experiment is not known in advance, the resulting value of our variable is not known in advance. For this reason, the variable that associates a number with the outcome of a random experiment is referred to as a random variable.

A random variable (RV) is a function that assigns a real number to each outcome in S of a random experiment. Thus a relation is established between random outcomes and real numbers.

An RV is usually represented by an uppercase letter such as X. After an experiment is conducted, the measured value of the random variable is denoted by a lowercase letter such as x = 70 mA. The values that X can take on are represented by x.

A discrete RV is a random variable that can assume a finite (or countably infinite) number of values over its range. Whenever the measurement is limited to discrete points on the real line, the random variable is said to be a discrete random variable.

Examples of discrete random variables: the number of scratches on a surface, the proportion of defective parts among 1000 tested, the number of transmitted bits received in error.

A continuous RV is an RV that can assume a continuum of values in an interval (either finite or infinite) of real numbers for its range.
Examples: electrical current, length, pressure, temperature, time, voltage, and weight.

Examples:
1. Tossing of a coin:
S = { H, T }
If H → 1 and T → 0, then
X(H) = 1 and X(T) = 0.
The RV X takes the value 1 or 0.
2. Rolling of a single die:
S = { 1, 2, 3, 4, 5, 6 }
Let X(s) = s.
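The definition of an RV as a function from outcomes to real numbers can be sketched directly, matching Example 1 above (the code layout is mine):

```python
import random

random.seed(0)

# The sample space of a coin toss, and the RV X: S -> R with X(H)=1, X(T)=0.
S = ['H', 'T']

def X(s):
    return 1 if s == 'H' else 0

# After the experiment is conducted, the measured value is a lowercase x.
x = X(random.choice(S))
print(x)  # either 1 or 0, not known before the experiment
```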
