Complex Random Variables
In this lecture, we explore the interesting domain of complex random variables. You may wonder: can random variables be complex too? The answer to this natural question is a very big "yes!"
The best example a communication engineer can come up with is QAM: when we transmit symbols, we send the in-phase and quadrature components, I + jQ, and the signal observed at the receiver end is very much a complex random variable.
So, having understood that complex random variables exist and are quite an important part of communication engineering, two natural questions arise: how can we define a complex random variable, and how can we come up with the distribution function of this so-called complex random variable?
Z = X + jY, where X, Y are real random variables, i.e. X, Y ∈ R, and j = √(−1).
At first sight, defining a complex random variable Z may seem impossible, since the fundamental event {Z ≤ z} makes no sense. Why? Because two complex numbers cannot be ordered. However, in the vector random variable framework, this issue can be circumvented by jointly describing the real and imaginary parts (or the magnitude and phase) of Z.
The single complex random variable Z = X + jY, X, Y ∈ R, is completely represented by f_{X,Y}(x, y) (proved in the section below), which allows one to compute the expected value of a scalar function g(Z) as

E[g(Z)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x + jy) f_{X,Y}(x, y) dx dy.
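The double integral above can be approximated by Monte Carlo averaging over samples of (X, Y). A minimal sketch, in which the choice g(Z) = |Z|² and the standard-normal components are assumptions made purely for illustration:

```python
import numpy as np

# Estimate E[g(Z)] by averaging g over samples of Z = X + jY.
rng = np.random.default_rng(0)
n = 500_000
x = rng.normal(0.0, 1.0, n)   # X ~ N(0, 1)  (assumed example)
y = rng.normal(0.0, 1.0, n)   # Y ~ N(0, 1), independent of X (assumed)
z = x + 1j * y

g = lambda w: np.abs(w) ** 2  # g(Z) = |Z|^2  (assumed example)
estimate = np.mean(g(z))      # should approach E|Z|^2 = Var(X) + Var(Y) = 2
print(estimate)
```

For this g, the exact value of the integral is Var(X) + Var(Y) = 2, so the printed estimate should be close to 2.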
Moreover, we can devise general definitions for the mean and variance of Z as, respectively:
• Z̄ = X̄ + j Ȳ ,
• σ_Z² = E[|Z − Z̄|²] = σ_X² + σ_Y²,
which measures the spread of Z about its mean in the complex plane.
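These two definitions are easy to check by simulation. A minimal sketch, where the component distributions (X ~ N(1, 2²), Y ~ N(−1, 3²), independent) are assumptions chosen for illustration:

```python
import numpy as np

# Check Z̄ = X̄ + jȲ and σ_Z² = σ_X² + σ_Y² on simulated data.
rng = np.random.default_rng(1)
n = 200_000
x = rng.normal(1.0, 2.0, n)    # assumed: X ~ N(1, 2^2)
y = rng.normal(-1.0, 3.0, n)   # assumed: Y ~ N(-1, 3^2), independent of X
z = x + 1j * y

z_bar = np.mean(z)                       # should approach 1 - 1j
var_z = np.mean(np.abs(z - z_bar) ** 2)  # should approach 2^2 + 3^2 = 13
print(z_bar, var_z)
```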
Lecture 3: COMPLEX RANDOM VARIABLES
Writing Z = X1 + jX2, the moment generating function of Z follows from the joint description of (X1, X2):

M_Z(s) = E_Z[e^{sZ}]
       = E_Z[e^{s(X1 + jX2)}]
       = E_{X1,X2}[e^{s(X1 + jX2)}]   (by the law of the unconscious statistician)
       = E_{X1,X2}[e^{sX1} · e^{jsX2}]
       = E_{X1,X2}[e^{s1 X1 + s2 X2}], with s1 = s and s2 = js,
       = M_{X1,X2}(s1, s2).
Hence, from the above, we can conclude the statement made in the first section:

f_Z(z) = f_{X,Y}(x, y) = f_{Re Z, Im Z}(Re z, Im z).
Say Z = X1 + jX2 is a complex random variable with pdf f_Z(z) (not necessarily Gaussian). Then Z is a circularly symmetric random variable if the rotated variable Y = e^{jθ} Z has the same distribution as Z:

f_Y(y) = f_Z(z), i.e. f_{e^{jθ}Z}(e^{jθ} z) = f_Z(z) for all θ.

Two properties follow.

(i) Zero mean: equality of the distributions forces E[Z] = E[e^{jθ} Z] = e^{jθ} E[Z] for all θ, which is possible only if E[Z] = 0, and hence E[Re(Z)] = E[Im(Z)] = 0.

(ii) Variance: E[|e^{jθ} Z|²] = E[e^{jθ} Z · e^{−jθ} Z*] = E[|Z|²], so the variance of Z is unchanged by rotation.
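Properties (i) and (ii) can be seen numerically. A minimal sketch, using a circularly symmetric complex Gaussian as an assumed test case:

```python
import numpy as np

# Rotating a circularly symmetric Z by e^{j theta} leaves the (zero) mean
# and E[|Z|^2] unchanged.
rng = np.random.default_rng(2)
n = 100_000
sigma = 1.0  # assumed per-component standard deviation

z = rng.normal(0, sigma, n) + 1j * rng.normal(0, sigma, n)
theta = 0.7  # assumed rotation angle
z_rot = np.exp(1j * theta) * z

print(abs(np.mean(z)), abs(np.mean(z_rot)))  # both near 0
# The second moments agree exactly sample-by-sample, since |e^{j theta} z| = |z|.
print(np.mean(np.abs(z) ** 2), np.mean(np.abs(z_rot) ** 2))
```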
For a circularly symmetric Z = X1 + jX2, the components therefore satisfy

E[X1²] = E[X2²] = σ² and E[X1 X2] = 0.

In particular, for a complex Gaussian z = x + jy ∼ N(μ_z, K_z), circular symmetry corresponds to

μ_x = μ_y = 0, σ_x² = σ_y² = σ², E[XY] = 0 (circularly symmetric Gaussian).
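One compact way to test these conditions together is through the pseudo-variance E[Z²] = E[X1²] − E[X2²] + 2j E[X1 X2], which vanishes exactly when the components have equal variance and are uncorrelated. A minimal sketch, with σ an assumed value, contrasting a proper example against a perfectly correlated (improper) one:

```python
import numpy as np

# Pseudo-variance E[Z^2] is ~0 for equal-variance uncorrelated components,
# but nonzero when the components are fully correlated.
rng = np.random.default_rng(3)
n = 200_000
sigma = 1.5  # assumed component standard deviation

x1 = rng.normal(0, sigma, n)
x2 = rng.normal(0, sigma, n)

z_proper = x1 + 1j * x2     # E[X1^2] = E[X2^2] = sigma^2, E[X1 X2] = 0
z_improper = x1 + 1j * x1   # components fully correlated: not circularly symmetric

print(np.mean(z_proper ** 2))    # near 0
print(np.mean(z_improper ** 2))  # near 2j * sigma^2 = 4.5j, clearly nonzero
```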
Now let Z = X + jY be a complex random vector, where [X Y]^T is a random vector. For a circularly symmetric (C.S.) random vector,

f(e^{jθ} Z) = f(Z) for all θ,

which implies

E[Z] = 0 and E[Z Z^T] = 0_{N×N},

where E[Z Z^T] is called the pseudo-covariance matrix.
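The vanishing pseudo-covariance is easy to verify by simulation. A minimal sketch, in which the dimension and the unit component variances are assumptions:

```python
import numpy as np

# For a proper complex Gaussian vector, E[Z Z^T] is ~0 while the ordinary
# (Hermitian) covariance is not.
rng = np.random.default_rng(4)
n, N = 200_000, 3  # assumed: sample count and vector dimension

# n samples of an N-dimensional proper complex Gaussian vector
# (i.i.d. components, real and imaginary parts N(0, 1)).
Z = rng.normal(0, 1, (n, N)) + 1j * rng.normal(0, 1, (n, N))

pseudo_cov = (Z.T @ Z) / n       # estimates E[Z Z^T]
cov = (Z.conj().T @ Z) / n       # estimates the Hermitian covariance

print(np.max(np.abs(pseudo_cov)))  # near 0
print(np.round(cov.real, 1))       # near 2 * I, since E|Z_k|^2 = 1 + 1 = 2
```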
Any circularly symmetric random variable (scalar or vector) with finite variance for each element is called a "proper" random variable.
3.6.1 Property
Let us take a random vector X = [X1, X2, ..., XN]^T. The correlation of any two components Xi, Xj is E[Xi Xj] = σ² δ[i − j].
Collecting the correlations of all pairs of components into a matrix gives

E[X X^T] = diag(σ², σ², ..., σ²) = σ² I_N,

i.e. an N × N diagonal matrix with σ² on the diagonal and zeros everywhere else.
From this matrix, we can clearly see that any two different components of a white random vector are uncorrelated.
Why "white"? We know that the Fourier transform of the autocorrelation function gives the power spectral density. Taking the Fourier transform of the impulse-shaped autocorrelation σ² δ[i − j] above yields a flat function, because the Fourier transform of an impulse is a constant. Since the resulting power spectral density is flat, it contains all frequency components equally; hence the name white random vector.
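The diagonal structure E[X X^T] = σ² I can be confirmed empirically. A minimal sketch, where σ and N are assumed values for illustration:

```python
import numpy as np

# Estimate E[X X^T] for a white random vector and compare with sigma^2 * I.
rng = np.random.default_rng(5)
n, N = 100_000, 4   # assumed: sample count and vector dimension
sigma = 2.0         # assumed component standard deviation

X = rng.normal(0, sigma, (n, N))   # n samples of an N-dimensional white vector
R = (X.T @ X) / n                  # estimates the autocorrelation E[X X^T]

print(np.round(R, 1))              # close to sigma^2 * I = 4 * I
```

The off-diagonal entries shrink toward zero as n grows, which is exactly the uncorrelatedness described above.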
If a jointly Gaussian vector is white and zero-mean, the elements of the vector are independent and identically distributed with variance σ² and zero mean, because the joint density function depends only on the covariance matrix and the mean vector, and here the covariance matrix is diagonal.