Slide 2 - 20191

This document summarizes key concepts in probability theory covered in Chapter 2. It defines probability and introduces important concepts such as random variables, probability density functions, moments, and random processes. Random variables represent outcomes as numbers and can be discrete, continuous, or mixed. Probability density functions describe the probabilities of random variables. Moments characterize distributions. Random processes generalize random variables over time and may be stationary or non-stationary. Linear time-invariant systems preserve properties of random process inputs.

Uploaded by Minh Ngô

Chapter 2: Review of Probability Theory

2.1. Probability
2.2. Random variable
2.3. Function of random variable
2.4. Discrete random vector
2.5. Moments of a Discrete Random Variable
2.6. Examples of Random variables
2.7. Sum of random variables
2.8. Random process
2.1. Probability

2.1.1. Probability definition


2.1.2. Conditional probability
2.1.3. Bayes rule
2.1.4. Total Probability theorem
2.1.5. Independence between events
2.1.1. Probability definition

• Built on measure theory


• Based on a triplet
(Ω, F, P)
Where:
• Ω: sample space
• The set of all possible outcomes
• F: σ-algebra
• The set of all possible events, i.e. combinations of outcomes
• P: probability function
• A set function
• Domain is F (the events)
• Range is the closed unit interval [0, 1]
2.1.1. Probability definition (Cont.)

• P must obey the following rules:


• P(Ω) = 1
• Let A be any event in F; then P(A) ≥ 0
• Let A and B be two events in F such that A ∩ B = ∅; then P(A ∪ B) = P(A) + P(B)
• Probability of the complement: P(Aᶜ) = 1 − P(A)
• P(A) ≤ 1
• P(∅) = 0
• P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
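The additivity and inclusion-exclusion rules above can be checked numerically. Below is a minimal sketch using a fair die; the events A (even number) and B (more than 3) are assumed purely for illustration:

```python
import random

random.seed(0)

# Events on a fair die: A = {even}, B = {> 3}.
# Estimate P(A), P(B), P(A ∩ B), P(A ∪ B) by simulation and
# check the rule P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
N = 100_000
a = b = ab = aub = 0
for _ in range(N):
    x = random.randint(1, 6)
    in_a, in_b = (x % 2 == 0), (x > 3)
    a += in_a
    b += in_b
    ab += in_a and in_b
    aub += in_a or in_b

p_a, p_b, p_ab, p_aub = a / N, b / N, ab / N, aub / N
# Exact values: P(A) = 1/2, P(B) = 1/2, P(A ∩ B) = 1/3, P(A ∪ B) = 2/3
```

Note that inclusion-exclusion holds exactly here, since the per-sample indicator of A ∪ B equals 1{A} + 1{B} − 1{A ∩ B}.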
2.1.2. Conditional probability

• Let A and B be two events, with P(A) > 0. The conditional probability of B given A is defined as:

P(B|A) = P(A ∩ B) / P(A)

• Hence, P(A ∩ B) = P(B|A)P(A) = P(A|B)P(B)
• If A ∩ B = ∅, then P(B|A) = 0
• If A ⊂ B, then P(B|A) = 1
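The definition can be computed exactly for a small sample space. A minimal sketch with a fair die, where the events ("more than 3 dots", "even number of dots") are assumed for illustration:

```python
from fractions import Fraction

# Fair die: each outcome has probability 1/6 (assumed example).
omega = {1, 2, 3, 4, 5, 6}
A = {x for x in omega if x > 3}        # "more than 3 dots"
B = {x for x in omega if x % 2 == 0}   # "even number of dots"

def P(event):
    """Probability of an event under the uniform measure on omega."""
    return Fraction(len(event), len(omega))

# Definition: P(B|A) = P(A ∩ B) / P(A), valid since P(A) > 0.
p_b_given_a = P(A & B) / P(A)          # (1/3) / (1/2) = 2/3
```

The product rule P(A ∩ B) = P(B|A)P(A) can be verified directly on these values.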
2.1.3. Bayes rule

• If A and B are events with P(B) > 0:

P(A|B) = P(B|A)P(A) / P(B)
2.1.4.Total Probability Theorem
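Bayes' rule together with the total probability theorem can be illustrated with a classic diagnostic-test calculation. All numbers below are hypothetical, chosen only to exercise the formulas:

```python
# Hypothetical numbers (assumed for illustration only):
# prior P(D) = 0.01, sensitivity P(+|D) = 0.99,
# false-positive rate P(+|not D) = 0.05.
p_d = 0.01
p_pos_given_d = 0.99
p_pos_given_not_d = 0.05

# Total probability: P(+) = P(+|D)P(D) + P(+|not D)P(not D)
p_pos = p_pos_given_d * p_d + p_pos_given_not_d * (1 - p_d)

# Bayes' rule: P(D|+) = P(+|D)P(D) / P(+)
p_d_given_pos = p_pos_given_d * p_d / p_pos
```

Even with a very accurate test, the posterior P(D|+) is only about 1/6 here, because the prior P(D) is small.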
2.1.5. Independence between Events
2.1.5. Independence between Events (Cont.)
2.2. Random variable

2.2.1. What is a random variable?


2.2.2. Cumulative Distribution Function (CDF)
2.2.3. Types of random variable
2.2.4. Probability Density Function (PDF)
2.2.1. What is a random variable?

2.2.1. What is a random variable? (cont.)

• A random variable is a numeric representation of the sample space. Thus:
• Events can be processed with digital computing devices
• The values corresponding to the outcomes are ordered
2.2.1. What is a random variable? (cont.)

Ex: Roll a six-sided die


• Maps the sample space (1 dot, 2 dots, …, 6 dots) to the numbers (1, 2, …, 6)
• The probability of each outcome is 1/6, so the probability of each number is also 1/6
• The subset (1 dot, 2 dots, 3 dots) maps to the interval from 1 to 3
• The probability of the subset is 3/6, so the probability of the interval is also 3/6
2.2.2. Cumulative Distribution Function (CDF)

• The CDF of a random variable X at a given value x, written F_X(x), is the probability that X takes a value not exceeding x:

F_X(x) = P(X ≤ x)

• F_X(∞) = 1
• F_X(−∞) = 0
• If x₁ < x₂, then F_X(x₂) ≥ F_X(x₁)
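The three CDF properties can be verified exactly for a discrete example. A minimal sketch with the die-roll random variable (assumed example):

```python
from fractions import Fraction

# Die-roll random variable X ∈ {1, ..., 6}, P(X = k) = 1/6 (assumed example).
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

def F(x):
    """CDF: F_X(x) = P(X <= x)."""
    return sum(p for k, p in pmf.items() if k <= x)

# F is non-decreasing, goes from 0 (below the support) to 1 (above it).
values = [F(x) for x in range(0, 7)]
```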
2.2.3. Types of random variable

• Discrete: the cumulative function is a step function (a sum of unit step functions)

F_X(x) = Σᵢ P(X = xᵢ) u(x − xᵢ)

where u(x) is the unit step function
2.2.3. Types of random variable (Cont.)

Example: X is the random variable that describes the outcome of the roll of a die, X ∈ {1, 2, 3, 4, 5, 6}, with P(X = x) = 1/6 for all x.

• Continuous: the cumulative function is a continuous function.


• Mixed: neither discrete nor continuous.
2.2.4. Probability Density Function (PDF)

2.2.4. Probability Density Function (PDF) (Cont.)

• Discrete random variable:
• P_X(x) = P(X = x)
• The sum of the probability density function over every value of x must equal 1: Σₓ P_X(x) = 1
2.3. Function of random variable

• Random variable X = {x}, x_min ≤ x ≤ x_max, with CDF F_X(x)


• Y = G(X) is defined as a function of the random variable
• It generates a random variable Y = {y}, with y_min = G(x_min) ≤ y ≤ y_max = G(x_max)
• We need to determine F_Y(y) from G(X) and F_X(x)
• Example: Y = aX + b, where a and b are constants, a > 0
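For the linear example with a > 0, we have F_Y(y) = P(aX + b ≤ y) = F_X((y − b)/a), which can be checked exactly with a die-valued X. The values a = 2, b = 1 are assumed for illustration:

```python
from fractions import Fraction

# Die-roll X with P(X = k) = 1/6; transform Y = aX + b, with
# a = 2, b = 1 (assumed example values). For a > 0:
# F_Y(y) = F_X((y - b) / a).
pmf = {k: Fraction(1, 6) for k in range(1, 7)}
a, b = 2, 1

def F_X(x):
    return sum(p for k, p in pmf.items() if k <= x)

def F_Y(y):
    # Direct computation: P(aX + b <= y)
    return sum(p for k, p in pmf.items() if a * k + b <= y)

# Check F_Y(y) == F_X((y - b) / a) at a range of points.
checks = [(F_Y(y), F_X(Fraction(y - b, a))) for y in range(0, 15)]
```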
2.3. Function of random variable (Cont.)

• Generalization: f_Y(y) = Σᵢ f_X(xᵢ) / |g′(xᵢ)|, where the xᵢ are the solutions of the equation g(x) = y


• A coding operation maps a source to a new source suitable for the transmission requirements
• E.g. source coding maps an information source to a code source with a uniform probability distribution
• Using functions of random variables, a mapping can be found that transforms an arbitrary source into a source with the required probability distribution
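One standard way to realize such a mapping is inverse-transform sampling: if U ~ U(0, 1) and F is the target CDF, then X = F⁻¹(U) has CDF F. A minimal sketch; the exponential target and its mean are assumed for illustration:

```python
import math
import random

random.seed(1)

# Inverse-transform sketch (assumed illustration): the target is an
# exponential distribution with mean lam, whose CDF is
# F(x) = 1 - exp(-x / lam), so F^{-1}(u) = -lam * ln(1 - u).
lam = 2.0
samples = [-lam * math.log(1.0 - random.random()) for _ in range(200_000)]

# The sample mean should be close to the target mean lam.
mean_est = sum(samples) / len(samples)
```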
2.4. Discrete Random vector

• Let Z = [X, Y] be a random vector with sample space Z = X × Y


• X and Y are random variables with sample spaces X and Y, respectively

• The joint probability distribution function of Z is the mapping p_XY(x, y) = P(X = x, Y = y)
2.4. Discrete Random vector (Cont.)

• Marginal distributions:

p_X(x) = Σ_y p_XY(x, y),   p_Y(y) = Σ_x p_XY(x, y)
2.4. Discrete Random vector (Cont.)

• Conditional distributions:

p_{X|Y=y}(x) = p_XY(x, y) / p_Y(y)

p_{Y|X=x}(y) = p_XY(x, y) / p_X(x)
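The marginal and conditional formulas can be exercised exactly on a small joint PMF. The table values below are assumed purely for illustration:

```python
from fractions import Fraction

# A small joint PMF p_XY(x, y) over {0,1} x {0,1} (assumed example values).
p_xy = {
    (0, 0): Fraction(1, 4), (0, 1): Fraction(1, 4),
    (1, 0): Fraction(1, 8), (1, 1): Fraction(3, 8),
}

# Marginals: p_X(x) = sum over y of p_XY(x, y), and symmetrically for p_Y.
p_x = {x: sum(p for (i, _), p in p_xy.items() if i == x) for x in (0, 1)}
p_y = {y: sum(p for (_, j), p in p_xy.items() if j == y) for y in (0, 1)}

# Conditional: p_{X|Y=1}(x) = p_XY(x, 1) / p_Y(1)
p_x_given_y1 = {x: p_xy[(x, 1)] / p_y[1] for x in (0, 1)}
```

Each marginal and each conditional sums to 1, as required of any PMF.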
2.4. Discrete Random vector (Cont.)

• Random variables X and Y are independent if and only if p_XY(x, y) = p_X(x) p_Y(y) for all x, y

• Consequences: the conditionals reduce to the marginals, p_{X|Y=y}(x) = p_X(x) and p_{Y|X=x}(y) = p_Y(y)
2.5. Moments of a Discrete Random Variable
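The n-th moment of a discrete random variable is E[Xⁿ] = Σₓ xⁿ P(X = x), and the variance follows as Var(X) = E[X²] − (E[X])². A minimal sketch using the fair-die example from earlier slides:

```python
from fractions import Fraction

# Fair die X ∈ {1, ..., 6}, P(X = k) = 1/6 (assumed example).
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

def moment(n):
    """n-th moment: E[X^n] = sum over x of x^n * P(X = x)."""
    return sum((x ** n) * p for x, p in pmf.items())

mean = moment(1)                 # first moment: E[X] = 7/2
mean_sq = moment(2)              # second moment: E[X^2] = 91/6
variance = mean_sq - mean ** 2   # Var(X) = E[X^2] - (E[X])^2 = 35/12
```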
2.6. Examples of Random variables

◼ Normal (Gaussian) random variables

 Density function (bell-shaped curve):

f_X(x) = (1/√(2πσ²)) e^{−(x − μ)² / (2σ²)}

 Distribution function:

F_X(x) = ∫_{−∞}^{x} (1/√(2πσ²)) e^{−(y − μ)² / (2σ²)} dy = G((x − μ)/σ)

where G(x) = ∫_{−∞}^{x} (1/√(2π)) e^{−y²/2} dy
2.6. Examples of Random variables (Cont.)

◼ Uniform random variables: X ~ U(a, b), a < b

 Density function:

f_X(x) = 1/(b − a) for a ≤ x ≤ b, and 0 otherwise

◼ Exponential random variables: X ~ E(λ)

 Density function:

f_X(x) = (1/λ) e^{−x/λ} for x ≥ 0, and 0 otherwise
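These distributions can be sampled with the standard library and their means checked against theory: a uniform U(a, b) has mean (a + b)/2, and an exponential with the density above has mean λ. The parameter values are assumed for illustration:

```python
import random

random.seed(2)
N = 200_000

# Uniform X ~ U(a, b): mean (a + b) / 2 (values assumed for illustration).
a, b = 2.0, 6.0
u_mean = sum(random.uniform(a, b) for _ in range(N)) / N

# Exponential with mean lam: f(x) = (1/lam) exp(-x/lam), x >= 0.
# Note random.expovariate takes the *rate* 1/lam, not the mean.
lam = 3.0
e_mean = sum(random.expovariate(1.0 / lam) for _ in range(N)) / N
```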
2.7. Sum of Gaussian random variables

• Independent Gaussian random variables x₁, x₂, ..., x_n have means m₁, m₂, ..., m_n and variances σ₁², σ₂², ..., σ_n²


• Their sum y = Σᵢ xᵢ has
• Mean m = Σᵢ mᵢ: the constant (DC) component
• Variance σ² = Σᵢ σᵢ²: the average power of the alternating (AC) component
• E.g. a channel with input X, noise N, and output Y = X + N
• Noise and input are independent
• m_Y = m_X + m_N
• P_Y = P_X + P_N, where P(·) is the average power
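The mean-and-variance rule for sums of independent Gaussians can be checked by simulation. The parameter values below are assumed for illustration:

```python
import random
import statistics

random.seed(3)
N = 100_000

# Independent Gaussians x1 ~ N(m1, s1^2), x2 ~ N(m2, s2^2)
# (parameter values assumed for illustration).
m1, s1 = 1.0, 2.0
m2, s2 = -0.5, 1.5

# y = x1 + x2 should have mean m1 + m2 = 0.5
# and variance s1^2 + s2^2 = 6.25.
ys = [random.gauss(m1, s1) + random.gauss(m2, s2) for _ in range(N)]
y_mean = statistics.fmean(ys)
y_var = statistics.pvariance(ys)
```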
2.8. Random process

2.8.1 . Definition
2.8.2. Stationary random process
2.8.3. Statistical average or joint moments
2.8.4. Mean value or the first moment
2.8.5. Mean-squared value or the second moment
2.8.6. Correlation
2.8.7. Power spectral density of a random process
2.8.8. Time averaging and Ergodicity
2.8.9. Random process and LTI system
2.8.1. Definition

• A random process is a set of potential random variables (realizations, or members) that describe a physical object


• Example:
• To describe the movements of an animal, several cameras are needed
• The sample space of each camera is a random variable
2.8.1. Definition (Cont.)

• To describe a random process, we need a set of values


• It is necessary to know the probability of values appearing simultaneously at one time instant across the time functions of the realizations
2.8.2. Stationary random process

• A process is classified as non-stationary or stationary based on whether its statistics change with time


• There are different levels of stationarity:
2.8.3. Statistical average or joint moments
2.8.4. Mean value or the first moment
2.8.5. Mean-squared value or the second moment
2.8.6. Correlation
2.8.7. Power spectral density of a random process

• Taking the Fourier transform of the random process directly does not work: each realization has its own unique spectrum
• Instead, we need to determine how the average power of the process is distributed over frequency
2.8.8. Time averaging and Ergodicity

• Time averaging: averaging a single realization over time

• Ergodicity: the process must be stationary, and its time averages must equal its statistical (ensemble) averages
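A minimal sketch of the ergodicity idea, assuming an i.i.d. Gaussian process (which is stationary and ergodic): the time average of one long realization and the ensemble average across many realizations both converge to the same mean m.

```python
import random
import statistics

random.seed(4)

# i.i.d. process X[n] ~ N(m, 1) (mean value assumed for illustration).
m = 0.7

# Time average: average one long realization over time.
realization = [random.gauss(m, 1.0) for _ in range(200_000)]
time_avg = statistics.fmean(realization)

# Ensemble average: average X[n] at one fixed n over many realizations.
ensemble_avg = statistics.fmean(random.gauss(m, 1.0) for _ in range(200_000))
```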
2.8.9. Random process and LTI system
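A minimal sketch of a random process through an LTI system, with a two-tap moving-average filter assumed purely for illustration: for stationary white-noise input with variance σ², the output is stationary with variance σ² Σₖ hₖ², here 1 · (0.5² + 0.5²) = 0.5.

```python
import random
import statistics

random.seed(5)

# Unit-variance white Gaussian noise input (assumed illustration).
N = 200_000
x = [random.gauss(0.0, 1.0) for _ in range(N)]

# LTI moving-average filter h = [0.5, 0.5]: y[n] = 0.5 x[n] + 0.5 x[n-1].
y = [0.5 * x[n] + 0.5 * x[n - 1] for n in range(1, N)]

# Output variance should be close to sum(h_k^2) = 0.5.
y_var = statistics.pvariance(y)
y_mean = statistics.fmean(y)
```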