Lecture13 RandProcess1

This document discusses key concepts in probability theory and random processes including: (1) the definition of probability as the number of times an event occurs over the total number of experiments, (2) mutually exclusive and joint probabilities of events, (3) conditional probability and Bayes' rule, and (4) definitions of random variables including cumulative distribution functions and probability density functions.

Basics of Probability Theory and Random Processes

Basics of probability theory
• The probability of an event E is represented by P(E) and is given by

      P(E) = N_E / N_S                                    (1)

  where N_S is the number of times the experiment is performed and
  N_E is the number of times the event E occurred.
  Equation 1 is only an approximation; for it to represent the exact
  probability, N_S → ∞ is required.
  The above finite-sample estimate is therefore referred to as the
  relative frequency of E.
  Clearly, 0 ≤ P(E) ≤ 1.
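As a quick illustration of Equation 1, the following Python sketch estimates P(E) as a relative frequency. The experiment (a fair die roll) and the event E ("the roll is even", true probability 1/2) are illustrative choices, not part of the lecture.

```python
import random

random.seed(0)  # fixed seed so the run is repeatable

# Perform the experiment N_S times and count how often the event
# E = "the roll is even" occurs; estimate P(E) ≈ N_E / N_S (Equation 1).
N_S = 100_000
N_E = sum(1 for _ in range(N_S) if random.randint(1, 6) % 2 == 0)
p_E = N_E / N_S
print(p_E)  # close to the true value 0.5 for large N_S
```

As N_S grows, the estimate converges to the exact probability, which is why the finite-sample ratio is only a relative-frequency approximation.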

(These basics of probability theory are required for understanding communication systems.)

• Mutually Exclusive Events

  Let S be the sample space and A1, A2, A3, ..., AN be events in S.
  The events are said to be mutually exclusive if Ai ∩ Aj = φ for all
  i ≠ j. If, in addition, A1 ∪ A2 ∪ · · · ∪ AN = S, the events are said
  to partition the sample space. (Note that mutual exclusivity is not
  the same as statistical independence.)
• Joint Probability
  The joint probability of two events A and B is represented by
  P(A ∩ B) and is defined as the probability of the occurrence of
  both events A and B:

      P(A ∩ B) = N_{A∩B} / N_S

• Conditional Probability
  The conditional probability of event A given event B is represented
  as P(A|B) and is defined as the probability of the occurrence of
  event A given that B has occurred:

      P(A|B) = N_{A∩B} / N_B

  Similarly,

      P(B|A) = N_{A∩B} / N_A

  This implies

      P(B|A) P(A) = P(A|B) P(B) = P(A ∩ B)
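The identity P(B|A) P(A) = P(A|B) P(B) = P(A ∩ B) can be verified directly from counts. A minimal sketch using a hypothetical single fair-die roll (the events A and B below are illustrative choices, not from the lecture):

```python
from fractions import Fraction

# Single fair-die roll: sample space S = {1, ..., 6}.
S = range(1, 7)
A = {s for s in S if s >= 4}       # A = "roll is at least 4" = {4, 5, 6}
B = {s for s in S if s % 2 == 0}   # B = "roll is even"       = {2, 4, 6}

def P(E):
    return Fraction(len(E), 6)     # counting measure on the sample space

P_A_given_B = Fraction(len(A & B), len(B))   # N_{A∩B} / N_B
P_B_given_A = Fraction(len(A & B), len(A))   # N_{A∩B} / N_A

# Both conditional chains recover the joint probability P(A ∩ B).
assert P_A_given_B * P(B) == P(A & B) == P_B_given_A * P(A)
print(P(A & B))  # 1/3
```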


• Chain Rule
  Consider a chain of events A1, A2, A3, ..., AN which may depend on
  each other. The probability of occurrence of the whole sequence
  factors as

      P(AN, AN−1, AN−2, ..., A2, A1)
        = P(AN | AN−1, AN−2, ..., A1)
          · P(AN−1 | AN−2, AN−3, ..., A1)
          · ... · P(A2 | A1) · P(A1)
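A standard worked instance of the chain rule (an illustrative example, not from the lecture): drawing three cards from a 52-card deck without replacement, with A_k the event that the k-th card is an ace.

```python
from fractions import Fraction

# Chain rule: P(A1, A2, A3) = P(A3 | A2, A1) · P(A2 | A1) · P(A1).
# Each conditional probability reflects the cards already removed.
p = Fraction(2, 50) * Fraction(3, 51) * Fraction(4, 52)
print(p)  # 1/5525
```

Note how each factor conditions on the whole history of earlier draws, exactly as in the general formula above.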


Bayes' Rule

[Figure 1: The partition space. Events A1, A2, A3, A4, A5 partition the
sample space S, with event B overlapping them.]

In Figure 1, if A1, A2, A3, A4, A5 partition the sample space S, then
(A1 ∩ B), (A2 ∩ B), (A3 ∩ B), (A4 ∩ B), and (A5 ∩ B) partition B.
Therefore,

      P(B) = Σ_{i=1}^{n} P(Ai ∩ B)
           = Σ_{i=1}^{n} P(B|Ai) P(Ai)

In the example figure here, n = 5.

      P(Ai|B) = P(Ai ∩ B) / P(B)
              = P(B|Ai) P(Ai) / Σ_{j=1}^{n} P(B|Aj) P(Aj)

In the above equation, P(Ai|B) is called the posterior probability,
P(B|Ai) the likelihood, P(Ai) the prior probability, and
Σ_{i=1}^{n} P(B|Ai) P(Ai) the evidence.
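A small numeric sketch of Bayes' rule with a two-event partition. The prior and likelihood values below are hypothetical, chosen only to make the arithmetic exact; they are not from the lecture.

```python
from fractions import Fraction

# Two-event partition: A1 = "condition present", A2 = "condition absent".
# Hypothetical numbers: prior P(A1) = 1/100, likelihoods
# P(B|A1) = 99/100 (true positive) and P(B|A2) = 5/100 (false positive).
prior = {"A1": Fraction(1, 100), "A2": Fraction(99, 100)}
likelihood = {"A1": Fraction(99, 100), "A2": Fraction(5, 100)}

# Evidence: P(B) = Σ P(B|Ai) P(Ai), total probability over the partition.
evidence = sum(likelihood[a] * prior[a] for a in prior)

# Posterior: P(A1|B) = P(B|A1) P(A1) / P(B).
posterior_A1 = likelihood["A1"] * prior["A1"] / evidence
print(posterior_A1)  # 1/6
```

Even with a 99% true-positive likelihood, the small prior keeps the posterior modest, which is exactly what the prior/likelihood/evidence decomposition makes visible.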


Random Variables
A random variable is a function whose domain is the sample space and
whose range is the set of real numbers.

Probabilistic description of a random variable:

• Cumulative Distribution Function:
  It is represented as FX(x) and defined as

      FX(x) = P(X ≤ x)

  If x1 < x2, then FX(x1) ≤ FX(x2) (FX is non-decreasing), and
  0 ≤ FX(x) ≤ 1.
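The defining property FX(x) = P(X ≤ x), together with monotonicity and the [0, 1] bounds, can be illustrated with an empirical CDF built from a handful of hypothetical sample values:

```python
# Empirical CDF: F_X(x) = P(X <= x), estimated as the fraction of
# observed samples at or below x. The sample values are illustrative.
samples = [2.0, 3.5, 1.0, 4.2, 3.5, 0.7]

def F(x):
    return sum(1 for s in samples if s <= x) / len(samples)

xs = [0.0, 1.0, 2.0, 3.5, 5.0]
vals = [F(x) for x in xs]

# F is non-decreasing in x and bounded in [0, 1].
assert all(a <= b for a, b in zip(vals, vals[1:]))
assert all(0.0 <= v <= 1.0 for v in vals)
print(vals)
```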

• Probability Density Function:

  It is represented as fX(x) and defined as

      fX(x) = dFX(x)/dx

  This implies

      P(x1 ≤ X ≤ x2) = ∫_{x1}^{x2} fX(x) dx
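The relation between fX and FX can be checked numerically. A sketch using the exponential distribution with rate 1 (an illustrative choice of distribution, not from the lecture), comparing a midpoint-rule integral of fX against FX(x2) − FX(x1):

```python
import math

# Exponential distribution, rate 1: F_X(x) = 1 − e^(−x), f_X(x) = e^(−x),
# so f_X really is the derivative of F_X.
def F(x):
    return 1.0 - math.exp(-x)

def f(x):
    return math.exp(-x)

# P(x1 <= X <= x2) = ∫_{x1}^{x2} f_X(x) dx, approximated by the midpoint rule.
x1, x2, n = 0.5, 2.0, 10_000
h = (x2 - x1) / n
integral = sum(f(x1 + (k + 0.5) * h) for k in range(n)) * h

print(abs(integral - (F(x2) - F(x1))) < 1e-6)  # True
```

With 10,000 midpoint panels the quadrature error is far below the tolerance, so the integral of the density matches the CDF difference to high precision.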

