Chapter 2: Signal, Random Process, and Spectra
Meixia Tao
Dept. of Electronic Engineering, Shanghai Jiao Tong University
Textbook: Sections 2.1-2.6, 5.1-5.3
Topics to be Covered
2.1 Signals
2.2 Review of probability and random variables
2.3 Random processes: basic concepts
2.4 Gaussian and white processes
What is a Signal?
Any physical quantity that varies with time, space, or any other independent variable is called a signal. In communication systems, signals are used to transmit information over a communication channel. Such signals are called information-bearing signals.
Classification of Signals
Signals can be characterized in several ways:
- Continuous-time signal vs. discrete-time signal
- Continuous-valued signal vs. discrete-valued signal
  - Continuous-time and continuous-valued: analog signal (e.g. speech)
  - Discrete-time and discrete-valued: digital signal (e.g. CD)
  - Discrete-time and continuous-valued: sampled signal
  - Continuous-time and discrete-valued: quantized signal
Energy and Power
The energy of a signal $x(t)$ is
$$E_x = \int_{-\infty}^{\infty} |x(t)|^2\,dt = \lim_{T\to\infty} \int_{-T/2}^{T/2} |x(t)|^2\,dt$$
The power of a signal $x(t)$ is
$$P_x = \lim_{T\to\infty} \frac{1}{T} \int_{-T/2}^{T/2} |x(t)|^2\,dt$$
A signal is an energy signal if and only if $E_x$ is finite. A signal is a power signal if and only if $P_x$ is finite. Physically realizable waveforms are of energy type, while mathematical models are often of power type.
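As a quick numerical illustration (a sketch, not from the slides; the two test signals and the window length are my own choices), the following Python snippet approximates $E_x$ and $P_x$ on a finite window for an energy-type pulse and a power-type sinusoid:

```python
import numpy as np

# Sketch: approximate E_x and P_x on a finite window [-T/2, T/2].
# Test signals (my choices): a decaying pulse (energy-type) and a
# unit-amplitude sinusoid (power-type).
T = 200.0
t = np.arange(-T / 2, T / 2, 1e-3)
dt = t[1] - t[0]

pulse = np.exp(-np.abs(t))           # E_x = 1, P_x -> 0 as T grows
sine = np.cos(2 * np.pi * t)         # E_x -> infinity, P_x = 1/2

for name, x in [("pulse", pulse), ("sine", sine)]:
    E = np.sum(np.abs(x) ** 2) * dt  # Riemann sum for the energy integral
    P = E / T                        # finite-window average power
    print(f"{name}: E ~ {E:.3f}, P ~ {P:.3f}")
```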
Probability
Let A be an event in a sample space S. The probability P(A) is a real number that measures the likelihood of the event A.
Axioms of probability:
1) $P(A) \ge 0$
2) $P(S) = 1$
3) Let A and B be two mutually exclusive events, i.e. $A \cap B = \emptyset$. Then $P(A \cup B) = P(A) + P(B)$.
Conditional Probability
Consider two events A and B in a random experiment. The probability that event A will occur given that B has occurred, P(A|B), is called the conditional probability. The probability that both A and B occur, P(AB), is called the joint probability. Joint and conditional probabilities are related by
$$P(AB) = P(B)\,P(A|B) = P(A)\,P(B|A)$$
Alternatively,
$$P(A|B) = \frac{P(AB)}{P(B)}, \qquad P(B|A) = \frac{P(AB)}{P(A)}$$
Two events A and B are said to be statistically independent iff
$$P(AB) = P(A)\,P(B)$$
Then $P(A|B) = P(A)$ and $P(B|A) = P(B)$.
By the total probability theorem, if $\{B_i\}$ partitions the sample space, then $P(A) = \sum_i P(A \mid B_i)\,P(B_i)$. Combining this with the definition of conditional probability gives Bayes' rule:
$$P(B_i \mid A) = \frac{P(A \mid B_i)\,P(B_i)}{\sum_j P(A \mid B_j)\,P(B_j)}$$
This formula will be used to derive the structure of the optimal receiver.
Example
Consider a binary communication system
P(0) = 0.3, P(1) = 0.7
P01 = P(receive 1 | sent 0) = 0.01
P00 = P(receive 0 | sent 0) = 1 - P01 = 0.99
P10 = P(receive 0 | sent 1) = 0.1
P11 = P(receive 1 | sent 1) = 1 - P10 = 0.9
What is the probability that the output of this channel is 1? Assuming that we have observed a 1 at the output, what is the probability that the input to the channel was a 1?
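A minimal worked check of these two questions (plain Python, with the numbers from above; the variable names are just for readability):

```python
# Numbers from the example above.
p0, p1 = 0.3, 0.7          # priors: P(sent 0), P(sent 1)
p01, p11 = 0.01, 0.9       # P(receive 1 | sent 0), P(receive 1 | sent 1)

# Total probability: P(receive 1) = P(0)*P01 + P(1)*P11
p_out1 = p0 * p01 + p1 * p11
# Bayes' rule: P(sent 1 | receive 1)
p_in1_given_out1 = p1 * p11 / p_out1

print(p_out1)              # 0.633
print(p_in1_given_out1)    # ~0.9953
```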
Random Variables
A random variable (r.v.) may be:
- Discrete-valued: range is finite (e.g. {0,1}) or countably infinite (e.g. {1,2,3,...})
- Continuous-valued: range is uncountably infinite (e.g. the real line)
The cumulative distribution function (CDF), or simply the probability distribution, of a r.v. X is
$$F_X(x) = P(X \le x)$$
Probability Density Function
The probability density function (pdf) of a r.v. X is
$$f_X(x) = \frac{d}{dx} F_X(x) \quad\text{or}\quad F_X(x) = \int_{-\infty}^{x} f_X(y)\,dy$$
Properties:
1. $f_X(x) \ge 0$
2. $\int_{-\infty}^{\infty} f_X(x)\,dx = 1$
3. $P(x_1 < X \le x_2) = F_X(x_2) - F_X(x_1) = \int_{x_1}^{x_2} f_X(x)\,dx$
[Figure: the CDF $F_X(x)$ rises from 0 to 1 between $X_{\min}$ and $X_{\max}$; under the pdf $f_X(x)$, the shaded area between $x_1$ and $x_2$ equals $P(x_1 < X \le x_2)$, and $f_X(x)\,dx$ is the area of a strip of width $dx$]
Joint Distribution
In many situations, one must consider two or more r.v.s. For two r.v.s X and Y, the joint distribution function is defined as
$$F_{XY}(x, y) = P(X \le x,\; Y \le y)$$
The joint probability density function is
$$f_{XY}(x, y) = \frac{\partial^2 F_{XY}(x, y)}{\partial x\,\partial y}$$
Properties:
$$\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{XY}(x, y)\,dx\,dy = 1$$
$$P(x_1 < X \le x_2,\; y_1 < Y \le y_2) = \int_{y_1}^{y_2}\int_{x_1}^{x_2} f_{XY}(x, y)\,dx\,dy$$
Marginal distribution
$$F_X(x) = P(X \le x,\; -\infty < Y < \infty) = \int_{-\infty}^{x}\int_{-\infty}^{\infty} f_{XY}(u, v)\,dv\,du$$
$$F_Y(y) = \int_{-\infty}^{y}\int_{-\infty}^{\infty} f_{XY}(u, v)\,du\,dv$$
Marginal density
$$f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, y)\,dy$$
Statistical Averages
Let X be a r.v. X can be either continuous or discrete. For the moment, consider a discrete r.v. which takes on the possible values $x_1, x_2, \ldots, x_M$ with respective probabilities $P_1, P_2, \ldots, P_M$. Then the mean or expected value of X is
$$m_X = E[X] = \sum_{i=1}^{M} x_i P_i$$
This is the first moment of the random variable X. Let g(X) be a function of X; then
$$E[g(X)] = \int_{-\infty}^{\infty} g(x)\,f_X(x)\,dx$$
For the special case of $g(X) = X^n$, we obtain the n-th moment of X, that is,
$$E[X^n] = \int_{-\infty}^{\infty} x^n f_X(x)\,dx$$
For $n = 2$ we obtain the second moment (mean-square value):
$$E[X^2] = \int_{-\infty}^{\infty} x^2 f_X(x)\,dx$$
Variance
The n-th central moment is
$$E\left[(X - m_X)^n\right]$$
For $n = 2$, we obtain the variance of X:
$$\sigma_X^2 = E\left[(X - m_X)^2\right] = E\left[X^2 - 2 m_X X + m_X^2\right] = E[X^2] - m_X^2$$
$\sigma_X$, the square root of the variance, is called the standard deviation. It measures how far X typically deviates from its mean, i.e. the concentration of X around the mean.
Correlation
In considering multiple variables, joint moments such as the correlation and covariance between pairs of r.v.s are most useful. The correlation of two r.v.s X and Y is defined as
$$R_{XY} = E[XY] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x\,y\,f_{XY}(x, y)\,dx\,dy$$
The correlation of X and Y is the mean of the product XY. The correlation of the two centered r.v.s X - E[X] and Y - E[Y] is called the covariance of X and Y:
$$\mathrm{Cov}_{XY} = E\left[(X - E[X])(Y - E[Y])\right]$$
The covariance of X and Y normalized with respect to $\sigma_X \sigma_Y$ is referred to as the correlation coefficient of X and Y:
$$\rho_{XY} = \frac{\mathrm{Cov}_{XY}}{\sigma_X \sigma_Y}$$
X and Y are uncorrelated iff their correlation coefficient is 0 ($\rho_{XY} = 0$). X and Y are orthogonal iff their correlation is 0:
$$R_{XY} = E[XY] = 0$$
If X and Y are independent, then they are uncorrelated. However, the converse is not true (the Gaussian case is the only exception).
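A quick numerical illustration of this point (a sketch; the counterexample $Y = X^2$ is my own choice, a standard one, not from the slides):

```python
import numpy as np

# Sketch: X ~ N(0,1) and Y = X**2 are uncorrelated (E[X^3] = 0 = E[X]E[Y]),
# yet clearly dependent, since Y is a deterministic function of X.
rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
y = x ** 2

print(np.corrcoef(x, y)[0, 1])       # ~0: uncorrelated
print(np.corrcoef(x ** 2, y)[0, 1])  # 1.0: y is fully determined by x**2
```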
Continuous Distribution
- Uniform distribution
- Gaussian distribution (the most important one)
- Rayleigh distribution (very important in mobile and wireless communications)
Binary Distribution
Let X be a discrete random variable that has two possible values, say X = 1 or X = 0. The distribution of X can be described by its probability mass function (pmf):
$$P(X = 1) = p, \qquad P(X = 0) = 1 - p$$
Binomial Distribution
Let $Y = \sum_{i=1}^{n} X_i$, where the $X_i$ are independent binary r.v.s with $P(X_i = 1) = p$.
Then
$$P(Y = k) = \binom{n}{k} p^k (1-p)^{n-k}$$
where
$$\binom{n}{k} = \frac{n!}{k!\,(n-k)!}$$
That is, the probability that Y = k is the probability that k of the $X_i$ are equal to 1 and $n-k$ are equal to 0.
Mean: $E[Y] = np$. Variance: $\sigma_Y^2 = np(1-p)$.
Example
Suppose that we transmit a 31-bit-long sequence with error-correction capability of up to 3 bit errors. If the probability of a bit error is p = 0.001, what is the probability that this sequence is received in error?
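A sketch of the computation: the sequence is decoded incorrectly only if more than 3 of the 31 bits are in error, and with independent bit errors the error count is binomial, so scipy's binomial CDF does the sum:

```python
from scipy.stats import binom

# P(decoding failure) = P(number of bit errors > 3) for a Binomial(31, 0.001)
# error count (bit errors assumed independent).
n, p = 31, 0.001
p_error = 1 - binom.cdf(3, n, p)
print(p_error)   # ~3.1e-8
```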
Uniform Distribution
The pdf of the uniform distribution is given by
$$f_X(x) = \begin{cases} \dfrac{1}{b-a}, & a \le x \le b \\ 0, & \text{otherwise} \end{cases}$$
Any example?
Gaussian Distribution
The Gaussian distribution, also called the normal distribution, is by far the most important distribution in the statistical analysis of communication systems. The pdf of a Gaussian r.v. is
$$f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma_X} \exp\left(-\frac{(x - m_X)^2}{2\sigma_X^2}\right)$$
A Gaussian r.v. is completely determined by its mean and variance, and hence is usually denoted as
$$X \sim \mathcal{N}(m_X, \sigma_X^2)$$
[Figure: Gaussian pdf $f_X(x)$, a bell curve centered at $m_X$]
The Q-function is defined as
$$Q(x) = \int_x^{\infty} \frac{1}{\sqrt{2\pi}} \exp\left(-\frac{u^2}{2}\right) du$$
It is the area under the tail of a Gaussian pdf with mean zero and variance one.
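In code, the Q-function is commonly evaluated through the complementary error function via the standard identity $Q(x) = \tfrac{1}{2}\,\mathrm{erfc}(x/\sqrt{2})$ (the helper name `qfunc` below is my own):

```python
from math import sqrt
from scipy.special import erfc

def qfunc(x: float) -> float:
    """Gaussian tail probability Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * erfc(x / sqrt(2))

print(qfunc(0.0))  # 0.5: half the area under a zero-mean Gaussian pdf
print(qfunc(3.0))  # ~1.35e-3
```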
Rayleigh Distribution
The pdf of the Rayleigh distribution is
$$f_R(r) = \frac{r}{\sigma^2} \exp\left(-\frac{r^2}{2\sigma^2}\right), \quad r \ge 0$$
[Figure: Rayleigh pdf]
Rayleigh distributions are frequently used to model fading for non-line-of-sight (NLOS) signal transmission. They are very important for mobile and wireless communications.
Random Process
A random process is the natural extension of random variables when dealing with signals. It is also referred to as a stochastic process or random signal. Voice signals, TV signals, and the thermal noise generated by a radio receiver are all examples of random signals.
A random process can be described as follows:
- For each experiment (outcome) n, there exists a time function $x_n(t)$, called a sample function or realization of the random process.
- At any time instants $t_1, t_2, \ldots$, the values of the random process are random variables $X(t_1), X(t_2), \ldots$
[Figure: sample functions $x_1(t), x_2(t), \ldots$, one per experiment outcome; the values at fixed times $t_1$ and $t_2$ across realizations form the random variables $X(t_1)$ and $X(t_2)$]
A random process X(t) is described by its M-th order statistics if, for all $n \le M$ and all $(t_1, t_2, \ldots, t_n)$, the joint pdf of $\{X(t_1), X(t_2), \ldots, X(t_n)\}$ is given. This joint pdf is written as
$$f(x_1, x_2, \ldots, x_n;\; t_1, t_2, \ldots, t_n)$$
In order to completely specify a random process, one must give $f(x_1, x_2, \ldots, x_n; t_1, t_2, \ldots, t_n)$ for all possible values of $\{x_i\}$, $\{t_i\}$, and for all n. This is obviously quite difficult in general.
Variance
$$\sigma_X^2(t) = E\left[\left(X(t) - m_X(t)\right)^2\right]$$
Autocorrelation
$$R_X(t;\tau) = E\left[X(t)\,X(t+\tau)\right] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_1 x_2\, f(x_1, x_2;\; t, t+\tau)\,dx_1\,dx_2$$
The physical meaning of $R_X(t;\tau)$: it is a measure of the relationship between $X(t)$ and $X(t+\tau)$ (correlation within a process). In general, the autocorrelation function is a function of both $t$ and $\tau$.
Example
Consider the stochastic process $X(t) = A\cos(2\pi f_0 t + \Theta)$, where $\Theta$ is a random variable uniformly distributed from $0$ to $2\pi$. At each $t$, $X(t)$ can be viewed as a function of $\Theta$. The mean is
$$E[X(t)] = 0$$
The autocorrelation is
$$R_X(t, t+\tau) = \frac{A^2}{2}\cos(2\pi f_0 \tau)$$
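A Monte Carlo sketch of this example (assuming the random-phase sinusoid form reconstructed above; the values of A, f0, t, and tau are arbitrary choices):

```python
import numpy as np

# Average over many draws of the random phase Theta ~ Uniform[0, 2*pi).
rng = np.random.default_rng(1)
A, f0 = 1.0, 5.0
t, tau = 0.123, 0.04                  # arbitrary time instant and lag
theta = rng.uniform(0, 2 * np.pi, size=500_000)

x_t = A * np.cos(2 * np.pi * f0 * t + theta)
x_t_tau = A * np.cos(2 * np.pi * f0 * (t + tau) + theta)

print(x_t.mean())                                   # ~0, independent of t
print((x_t * x_t_tau).mean())                       # ~ (A^2/2) cos(2 pi f0 tau)
print(0.5 * A ** 2 * np.cos(2 * np.pi * f0 * tau))  # theoretical value
```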
Stationary Processes
A stochastic process is said to be stationary if, for any $n$ and any time shift $\Delta$, the following holds:
$$f_X(x_1, x_2, \ldots, x_n;\; t_1, t_2, \ldots, t_n) = f_X(x_1, x_2, \ldots, x_n;\; t_1+\Delta, t_2+\Delta, \ldots, t_n+\Delta) \tag{1}$$
Therefore:
The first-order statistics are independent of t:
$$E\{X(t)\} = \int_{-\infty}^{\infty} x\,f_X(x)\,dx = m_X \tag{2}$$
The second-order statistics depend only on the time difference $\tau = t_2 - t_1$:
$$R_X(t_1, t_2) = R_X(t_2 - t_1) = R_X(\tau)$$
A process whose mean is constant and whose autocorrelation depends only on $\tau$ is called wide-sense stationary (WSS); strict stationarity implies wide-sense stationarity.
Ensemble averaging:
$$m_X(t) = E[X(t)] = \int_{-\infty}^{\infty} x\,f(x; t)\,dx$$
$$R_X(t_1, t_2) = E[X(t_1)X(t_2)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_1 x_2\, f(x_1, x_2;\; t_1, t_2)\,dx_1\,dx_2$$
Time averaging:
$$\langle X(t) \rangle = \lim_{T\to\infty} \frac{1}{2T} \int_{-T}^{T} x(t)\,dt$$
$$\langle X(t)\,X(t+\tau) \rangle = \lim_{T\to\infty} \frac{1}{2T} \int_{-T}^{T} x(t)\,x(t+\tau)\,dt$$
In general, ensemble averages and time averages are not equal. A r.p. X(t) is said to be ergodic if all time averages and ensemble averages are equal.
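For the random-phase sinusoid above, a single realization's time average already matches the ensemble mean, illustrating ergodicity in the mean (a sketch: one realization, one long window, my own parameter choices):

```python
import numpy as np

# One fixed outcome of Theta gives one sample function; its time average
# over a long window matches the ensemble mean E[X(t)] = 0.
rng = np.random.default_rng(2)
theta = rng.uniform(0, 2 * np.pi)          # a single realization of Theta
t = np.arange(0.0, 1000.0, 0.001)
x = np.cos(2 * np.pi * 5.0 * t + theta)    # one sample function

print(x.mean())  # ~0: time average equals the ensemble average
```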
Truncate a sample function $x(t, n)$ to the interval $[-T/2, T/2]$ in order to get an energy signal $x_T(t, n)$. Performing a Fourier transform on $x_T(t, n)$, we get $X_T(f, n)$. According to Parseval's theorem,
$$\int_{-T/2}^{T/2} |x_T(t, n)|^2\,dt = \int_{-\infty}^{\infty} |X_T(f, n)|^2\,df$$
The power spectral density is the average energy spectral density per unit time. Letting $T \to \infty$, we define the power spectral density for the sample function:
$$S_X(f, n) = \lim_{T\to\infty} \frac{|X_T(f, n)|^2}{T}$$
If we take the ensemble average, the power spectral density (PSD) of the random process is
$$S_X(f) = \lim_{T\to\infty} \frac{E\left[|X_T(f, n)|^2\right]}{T} \quad \text{Watts/Hz} \tag{4}$$
For a WSS process, the PSD is the Fourier transform of the autocorrelation function (the Wiener-Khinchine theorem):
$$S_X(f) = \int_{-\infty}^{\infty} R_X(\tau) \exp(-j 2\pi f \tau)\,d\tau$$
In general, $S_X(f)$ is a measure of the relative power in the random signal at each frequency component, and
$$R_X(0) = \int_{-\infty}^{\infty} S_X(f)\,df = \text{total power}$$
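A numerical sketch of this Fourier-transform pair (the example autocorrelation $R_X(\tau) = e^{-|\tau|}$ is my own choice; its transform, $2/(1 + (2\pi f)^2)$, is a standard pair):

```python
import numpy as np

# Sample R_X(tau) = exp(-|tau|) and take a discrete Fourier transform;
# its magnitude should match S_X(f) = 2 / (1 + (2*pi*f)**2).
dtau = 0.001
tau = np.arange(-50.0, 50.0, dtau)
R = np.exp(-np.abs(tau))

f = np.fft.fftshift(np.fft.fftfreq(tau.size, d=dtau))
S_num = np.abs(np.fft.fftshift(np.fft.fft(R)) * dtau)  # approximates the integral
S_theory = 2.0 / (1.0 + (2 * np.pi * f) ** 2)

i = np.argmin(np.abs(f - 0.1))   # compare at f = 0.1 Hz
print(S_num[i], S_theory[i])     # nearly identical
```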
Gaussian Process
The importance of Gaussian processes in communication systems is due to the fact that thermal noise in electronic devices can be closely modeled by a Gaussian process.
Definition:
A random process X(t) is a Gaussian process if, for all n and all $(t_1, t_2, \ldots, t_n)$, the random variables $\{X(t_1), X(t_2), \ldots, X(t_n)\}$ have a joint Gaussian density function.
Random Process through an LTI System
Let X(t) be the input to a linear time-invariant system with impulse response h(t), and let Y(t) be the output:
$$Y(t) = X(t) * h(t) = \int_{-\infty}^{\infty} h(\tau)\,X(t - \tau)\,d\tau$$
If X(t) is WSS, the output mean is
$$m_Y = E\left[\int_{-\infty}^{\infty} h(\tau)\,X(t - \tau)\,d\tau\right] = m_X \int_{-\infty}^{\infty} h(\tau)\,d\tau = m_X H(0)$$
The output autocorrelation is
$$R_Y(t, u) = \int_{-\infty}^{\infty} h(\tau_1)\,d\tau_1 \int_{-\infty}^{\infty} h(\tau_2)\, E\left[X(t - \tau_1)\,X(u - \tau_2)\right]\,d\tau_2$$
If X(t) is WSS, with $\tau = t - u$,
$$R_Y(\tau) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} h(\tau_1)\,h(\tau_2)\,R_X(\tau - \tau_1 + \tau_2)\,d\tau_1\,d\tau_2$$
If the input is a WSS random process, the output is also a WSS random process.
Continuing,
$$R_Y(\tau) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} h(\tau_1)\,h(\tau_2)\,R_X(\tau - \tau_1 + \tau_2)\,d\tau_1\,d\tau_2 = \int_{-\infty}^{\infty} h(\tau_2)\,d\tau_2 \int_{-\infty}^{\infty} h(\tau_1)\,R_X(\tau + \tau_2 - \tau_1)\,d\tau_1$$
$$= \int_{-\infty}^{\infty} h(\tau_2)\left[h(\tau + \tau_2) * R_X(\tau + \tau_2)\right]d\tau_2 = h(-\tau) * h(\tau) * R_X(\tau)$$
PSD of Y(t):
$$S_Y(f) = |H(f)|^2\,S_X(f)$$
[Block diagram: $X(t)$ with PSD $S_X(f)$ passes through $h(t)$ to give $Y(t)$ with PSD $S_Y(f) = |H(f)|^2 S_X(f)$]
Key Results
$$S_Y(f) = |H(f)|^2\,S_X(f)$$
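A discrete-time sanity check of this key result (a sketch: white noise through an FIR lowpass; the filter design and Welch parameters are arbitrary choices of mine):

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(3)
x = rng.standard_normal(500_000)           # white input: S_X(f) is flat
h = signal.firwin(numtaps=65, cutoff=0.2)  # FIR lowpass (cutoff = 0.2 x Nyquist)
y = signal.lfilter(h, 1.0, x)

f, S_x = signal.welch(x, nperseg=4096)     # estimated input PSD
_, S_y = signal.welch(y, nperseg=4096)     # estimated output PSD
_, H = signal.freqz(h, worN=f, fs=1.0)     # H(f) at the same frequencies

in_band = np.abs(H) > 0.1                  # skip the deep stopband (0/0 noise)
err = S_y[in_band] / S_x[in_band] - np.abs(H[in_band]) ** 2
print(np.abs(err).mean())                  # small: S_Y/S_X tracks |H(f)|^2
```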
Noise
Noise is a critical component in the analysis of the performance of communication receivers. It is often assumed to be Gaussian and stationary. The mean is taken to be zero, while the autocorrelation is usually specified by the power spectral density. The noise is white noise when all frequency components appear with equal power ("white" is used as in white light, for a similar reason):
$$S_n(f) = \frac{N_0}{2}, \qquad R_n(\tau) = \frac{N_0}{2}\,\delta(\tau)$$
White noise is completely uncorrelated!
[Figure: flat PSD $S_n(f) = N_0/2$ and impulse autocorrelation $R_n(\tau) = (N_0/2)\,\delta(\tau)$]
Bandlimited Noise
[Block diagram: white noise passes through a filter of bandwidth $B$ Hz to give bandlimited white noise $n(t)$]
At what sampling rate can we sample the noise so that the resulting samples are uncorrelated?
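A sketch of the standard answer (assuming an ideal lowpass filter of bandwidth B, so that $R_n(\tau) = N_0 B\,\mathrm{sinc}(2B\tau)$, which vanishes at $\tau = k/(2B)$; hence samples taken at rate $2B$ are uncorrelated):

```python
import numpy as np

# Ideal bandlimited white noise of (lowpass) bandwidth B has
# R_n(tau) = N0 * B * sinc(2*B*tau), zero at tau = k/(2B): samples
# spaced 1/(2B) apart, i.e. taken at rate 2B, are uncorrelated.
N0, B = 2.0, 100.0
tau = np.arange(1, 5) / (2 * B)        # lags k/(2B), k = 1..4
R = N0 * B * np.sinc(2 * B * tau)      # np.sinc(x) = sin(pi x) / (pi x)
print(R)                               # ~0 at every such lag
```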
A bandlimited noise n(t) centered at $f_0$ can be expressed in terms of its in-phase component $n_c(t)$ and quadrature component $n_s(t)$ as $n(t) = n_c(t)\cos(2\pi f_0 t) - n_s(t)\sin(2\pi f_0 t)$.
Result 1: both components have zero mean:
$$E\{n_c(t)\} = E\{n_s(t)\} = 0$$
Result 2
$$S_{n_c}(f) = S_{n_s}(f) = \begin{cases} S_n(f - f_0) + S_n(f + f_0), & |f| \le B/2 \\ 0, & \text{otherwise} \end{cases}$$
Proof: [Block diagram: $n(t)$ is multiplied by $2\cos\omega_0 t$ (upper branch) or $2\sin\omega_0 t$ (lower branch), giving $Z_1(t)$ and $Z_2(t)$; each is passed through an ideal lowpass filter $H_L(f)$ with unit gain over $[-B/2, B/2]$ to produce $n_c(t)$ and $n_s(t)$]
Result 3: for the same $t$, $n_c(t)$ and $n_s(t)$ are uncorrelated (and, being jointly Gaussian, independent):
$$R_{n_c n_s}(0) = 0$$
Result 4
$$E\{n^2(t)\} = E\{n_c^2(t)\} = E\{n_s^2(t)\} = \sigma^2$$
The bandlimited noise can also be written in terms of its envelope and phase:
$$n(t) = R(t)\cos\left(2\pi f_0 t + \Phi(t)\right)$$
where
$$R(t) = \sqrt{n_c^2(t) + n_s^2(t)}, \qquad \Phi(t) = \tan^{-1}\frac{n_s(t)}{n_c(t)}, \quad \Phi(t) \in [0, 2\pi]$$
[Figure: a sample waveform of n(t); the envelope R(t) varies slowly, on a time scale of order $1/B$, while the carrier oscillates on a time scale of order $1/f_0$]
Let n(t) be a zero-mean, stationary Gaussian process; find the statistics of the envelope and phase.
Result: the envelope follows a Rayleigh distribution while the phase follows a uniform distribution:
$$f(R) = \int_0^{2\pi} f(R, \Phi)\,d\Phi = \frac{R}{\sigma^2}\exp\left(-\frac{R^2}{2\sigma^2}\right), \quad R \ge 0$$
$$f(\Phi) = \int_0^{\infty} f(R, \Phi)\,dR = \frac{1}{2\pi}, \quad 0 \le \Phi \le 2\pi$$
For the same $t$, the envelope variable R and the phase variable $\Phi$ are independent (but the corresponding processes are not independent).
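A Monte Carlo sketch of this result (assuming i.i.d. zero-mean Gaussian quadrature samples with common variance $\sigma^2$; the Rayleigh mean $\sigma\sqrt{\pi/2}$ is a standard fact used as the reference value):

```python
import numpy as np

rng = np.random.default_rng(4)
sigma = 1.5
nc = sigma * rng.standard_normal(1_000_000)  # in-phase samples ~ N(0, sigma^2)
ns = sigma * rng.standard_normal(1_000_000)  # quadrature samples ~ N(0, sigma^2)

R = np.hypot(nc, ns)        # envelope sqrt(nc^2 + ns^2)
phi = np.arctan2(ns, nc)    # phase, uniform on (-pi, pi]

print(R.mean(), sigma * np.sqrt(np.pi / 2))  # matches the Rayleigh mean
print(phi.var(), np.pi ** 2 / 3)             # matches uniform variance (2pi)^2/12
```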