
Complex Random Variable 1

This document summarizes key topics from a lecture on complex random variables: (1) a complex random variable is the sum of a real and an imaginary random variable, Z = X + jY, and its distribution is represented by the joint distribution of the real and imaginary parts; (2) a complex Gaussian random variable has jointly normal real and imaginary parts; (3) a circularly symmetric complex random variable has the same distribution under every rotation in the complex plane, so it has zero mean and uncorrelated real and imaginary parts; (4) a circularly symmetric complex Gaussian has zero mean and equal-variance, uncorrelated normal real and imaginary parts.


Wireless Communication (EC60298), Spring 2023

Lecture 3: COMPLEX RANDOM VARIABLES


Instructor: Pechetti Sasi Vinay                         Scribe: Group 3 (16 Jan. 2024)

In this lecture, we explore the interesting domain of COMPLEX RANDOM variables. You may wonder: can random variables be complex too? The answer to this natural question is a very big "YES!".

The best example a communication engineer can come up with is QAM transmission: we transmit the "in-phase" and "quadrature" components, I + jQ, and the value observed at the receiver end is very much a COMPLEX RANDOM variable.
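To make this concrete, here is a minimal Python sketch (the 16-QAM constellation, noise level, and symbol count are illustrative assumptions, not part of the lecture) showing that the received value I + jQ is a complex random variable: a random constellation point plus complex noise.

import numpy as np

rng = np.random.default_rng(0)

# 16-QAM constellation (illustrative): I and Q each drawn from {-3, -1, +1, +3}
levels = np.array([-3, -1, 1, 3])
n_symbols = 10_000
I = rng.choice(levels, size=n_symbols)
Q = rng.choice(levels, size=n_symbols)
tx = I + 1j * Q                       # transmitted symbol: a complex random variable

# additive complex Gaussian noise at the receiver (assumed noise level)
noise = rng.normal(0, 0.1, n_symbols) + 1j * rng.normal(0, 0.1, n_symbols)
rx = tx + noise                       # received value: also a complex random variable
print(rx[:3])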

So, having understood that complex R.V.s do exist and are quite an important part of communication engineering, two natural questions arise: how can we define a complex R.V., and how can we come up with the distribution function of such a complex random variable?

3.1 COMPLEX RANDOM VARIABLE

A complex random variable, Z ∈ C, is a sum

Z = X + jY,

where X, Y are real random variables, i.e., X, Y ∈ R, and j = √−1.

At first sight, defining a complex random variable Z may seem impossible, since the defining event {Z ≤ z} makes no sense. Why? Because two complex numbers cannot be compared (there is no natural ordering on C). However, in the vector random variable framework, this issue can be circumvented if one jointly describes the real and imaginary parts (or magnitude and phase) of Z.

The single complex random variable Z = X + jY, X, Y ∈ R, is completely represented by f_{X,Y}(x, y) (as argued in the next section), which allows one to compute the expected value of a scalar function g(Z) as

E[g(Z)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x + jy) f_{X,Y}(x, y) dx dy.

Moreover, we can devise general definitions for the mean and variance of Z as, respectively:

• Z̄ = X̄ + jȲ,
• σ_Z² = E[|Z − Z̄|²] = σ_X² + σ_Y²,

where the variance measures the spread of Z about its mean in the complex plane.
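A minimal numerical sanity check of these definitions (the particular distributions of X and Y below are arbitrary illustrative choices): the sample mean of Z = X + jY equals X̄ + jȲ, and the sample variance E[|Z − Z̄|²] equals σ_X² + σ_Y² up to sampling error.

import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(2.0, 1.0, 100_000)        # real part (illustrative: mean 2, std 1)
Y = rng.uniform(-1.0, 3.0, 100_000)      # imaginary part (illustrative: mean 1)
Z = X + 1j * Y

z_bar = Z.mean()                          # Z-bar = X-bar + j*Y-bar
var_z = np.mean(np.abs(Z - z_bar) ** 2)   # E[|Z - Z-bar|^2]

print(z_bar)                              # close to 2 + 1j
print(var_z, X.var() + Y.var())           # the two values agree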


3.2 COMPLEX GAUSSIAN RANDOM VARIABLE

Let Z = X1 + jX2, where Z ∈ C and X1, X2 ∈ R, such that [X1, X2]^T ∼ N(µ_X, K_X). Then Z is a complex Gaussian random variable.
Now a natural question arises: what is the pdf of Z? We can find it via the moment generating function (MGF) of Z.

M_Z(s) = E_Z[e^{sZ}]
       = E_{X1,X2}[e^{s(X1 + jX2)}]        (by the law of the unconscious statistician)
       = E_{X1,X2}[e^{sX1} · e^{jsX2}]
       = E_{X1,X2}[e^{s1 X1 + s2 X2}],      with s1 = s and s2 = js
       = M_{X1,X2}(s1, s2).

Hence, from the above we can conclude the statement mentioned in the first section, that

f_Z(z) = f_{X,Y}(x, y) = f_{Re Z, Im Z}(Re z, Im z).
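As a small illustration of this representation (the mean vector and covariance matrix below are arbitrary assumptions), a complex Gaussian Z = X1 + jX2 can be generated entirely from the joint bivariate normal distribution of its real and imaginary parts.

import numpy as np

rng = np.random.default_rng(2)
mu = np.array([1.0, -0.5])                 # [E[X1], E[X2]] (illustrative)
K = np.array([[2.0, 0.6],
              [0.6, 1.0]])                 # covariance of [X1, X2]^T (illustrative)

samples = rng.multivariate_normal(mu, K, size=50_000)
Z = samples[:, 0] + 1j * samples[:, 1]     # the complex Gaussian random variable

print(Z.mean())                            # close to 1 - 0.5j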

3.3 Circularly Symmetric Complex (CSC) random variable

Say Z = X1 + jX2 is a complex random variable with pdf f_Z(z) (not necessarily Gaussian).

Then Z is a circularly symmetric R.V. if every rotated version Y = e^{jθ} Z has the same distribution as Z:

f_Y(y) = f_Z(z), i.e., f_{e^{jθ}Z}(e^{jθ} z) = f_Z(z), ∀θ.

Consequences of the above equation:

(i) Zero mean:
E[Z] = E[Z e^{jθ}] = e^{jθ} E[Z] for every θ  ⇒  E[Z] = 0,
and hence E[Re(Z)] = E[Im(Z)] = 0.

The variance, on the other hand, is unchanged by rotation:
E[|Z e^{jθ}|²] = E[Z e^{jθ} e^{−jθ} Z*] = E[|Z|²].

(ii) Since Z and Z e^{jθ} have the same distribution,
E[Z²] = E[(Z e^{jθ})²] = E[Z²] e^{2jθ} for every θ.

(iii) Zero pseudo-variance: the only way (ii) can hold for every θ is

E[Z²] = 0.

Expanding Z² = X1² − X2² + 2jX1X2 gives

E[X1² − X2² + 2jX1X2] = 0
⇒ E[X1²] = E[X2²] = σ² and E[X1X2] = 0.

The real and imaginary parts have equal variance and are uncorrelated.
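A minimal numerical check of consequences (i)-(iii), using a zero-mean complex Gaussian with i.i.d. real and imaginary parts (a standard example of a circularly symmetric random variable; the variance below is an arbitrary choice).

import numpy as np

rng = np.random.default_rng(3)
sigma = 1.0
X1 = rng.normal(0, sigma, 200_000)
X2 = rng.normal(0, sigma, 200_000)
Z = X1 + 1j * X2

print(Z.mean())                    # close to 0           (zero mean)
print(np.mean(Z ** 2))             # close to 0           (zero pseudo-variance)
print(np.mean(np.abs(Z) ** 2))     # close to 2*sigma^2   (variance = sigma_X1^2 + sigma_X2^2)
print(np.mean(X1 * X2))            # close to 0           (uncorrelated real/imaginary parts)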

3.4 C.S. Complex Gaussian

Let Z = X + jY ∼ N(µ_Z, K_Z) be a complex Gaussian. It is circularly symmetric when

σ_X² = σ_Y² = σ², µ_X = µ_Y = 0, and E[XY] = 0.

If we cut the joint pdf with planes parallel to the xy-plane, the resulting contours are circles when

σ_X² = σ_Y², µ_X = µ_Y = 0, E[XY] = 0 (circularly symmetric Gaussian),

and ellipses when

σ_X² ≠ σ_Y² (not circularly symmetric).
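A minimal sketch of the "cut parallel to the xy-plane" idea (the variances below are illustrative assumptions): with equal variances the joint pdf takes the same value at every point of a circle around the mean, i.e., circular contours; with unequal variances it does not.

import numpy as np

def joint_pdf(x, y, sx, sy):
    # zero-mean bivariate Gaussian pdf with independent components
    return np.exp(-0.5 * (x**2 / sx**2 + y**2 / sy**2)) / (2 * np.pi * sx * sy)

# sample points on a circle of radius 1 around the mean
angles = np.linspace(0, 2 * np.pi, 8, endpoint=False)
x, y = np.cos(angles), np.sin(angles)

print(joint_pdf(x, y, sx=1.0, sy=1.0))   # identical values -> circular contours
print(joint_pdf(x, y, sx=2.0, sy=0.5))   # differing values -> elliptical contours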



Figure 3.1: Bell Shaped Pdf

Figure 3.2: Circularly symmetric Gaussian

Figure 3.3: If Not Circular Symmetric



3.5 CIRCULARLY SYMMETRIC RANDOM VECTOR (C.S.)

Z = X + jY,

such that [X, Y]^T is a real random vector (so Z ∈ C^N).

Then, for a C.S. random vector,

f(e^{jθ} Z) = f(Z), ∀θ,

which, as in the scalar case, implies

E[Z] = 0 and E[Z Z^T] = 0_{N×N} (zero pseudo-covariance matrix).
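A minimal numerical check (assuming, as an example, a circularly symmetric complex Gaussian vector with i.i.d. zero-mean real and imaginary parts): its pseudo-covariance matrix E[Z Z^T] is approximately the zero matrix, while the ordinary covariance E[Z Z^H] is not.

import numpy as np

rng = np.random.default_rng(4)
N, trials = 3, 200_000
X = rng.normal(0, 1, (trials, N))
Y = rng.normal(0, 1, (trials, N))
Z = X + 1j * Y                                               # each row is one realization of Z

pseudo_cov = (Z[:, :, None] * Z[:, None, :]).mean(axis=0)    # sample E[Z Z^T]
cov = (Z[:, :, None] * np.conj(Z)[:, None, :]).mean(axis=0)  # sample E[Z Z^H]

print(np.round(pseudo_cov, 2))   # approximately the zero matrix
print(np.round(cov, 2))          # approximately 2 * identity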
 

3.6 PROPER RANDOM VARIABLE

Any circularly symmetric random variable (scalar or vector) with finite variance for each element is called a "proper random variable".

3.6.1 Property

If X is a proper random variable, then

AX + b is also a proper random variable,

i.e., properness is preserved under affine transformations.
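A small numerical illustration of this closure property (the matrix A and vector b below are arbitrary assumptions, and X is taken to be a circularly symmetric complex Gaussian vector): the pseudo-covariance of W = AX + b about its mean stays approximately zero, so W is proper as well.

import numpy as np

rng = np.random.default_rng(6)
N, trials = 2, 200_000
X = rng.normal(0, 1, (trials, N)) + 1j * rng.normal(0, 1, (trials, N))  # proper (C.S.) Gaussian vector

A = np.array([[1.0 + 2.0j, 0.5],
              [-1.0j,      2.0 - 1.0j]])   # illustrative complex matrix
b = np.array([3.0 - 1.0j, 0.5j])           # illustrative complex offset

W = X @ A.T + b                            # each row is one realization of A X + b
Wc = W - W.mean(axis=0)                    # centre about the (nonzero) mean

pseudo_cov = (Wc[:, :, None] * Wc[:, None, :]).mean(axis=0)   # sample pseudo-covariance
print(np.round(pseudo_cov, 2))             # approximately the zero matrix -> W is proper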

3.7 ISOTROPIC RANDOM VECTOR

If Z is a complex random vector, then Z is isotropic iff

f(UZ) = f(Z) for every unitary matrix U,

i.e., a rotation over the unit sphere does not change the distribution; the distribution depends on Z only through ∥Z∥².
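A minimal sketch of the isotropy property (assuming Z is a zero-mean complex Gaussian vector with i.i.d. unit-variance real and imaginary parts, which is isotropic, and U is a random unitary matrix built here via a QR decomposition): applying U preserves ∥Z∥ and the second-order statistics, consistent with the distribution depending only on ∥Z∥².

import numpy as np

rng = np.random.default_rng(7)
N, trials = 3, 100_000
Z = rng.normal(0, 1, (trials, N)) + 1j * rng.normal(0, 1, (trials, N))

# build a random unitary matrix U from the QR decomposition of a complex matrix
M = rng.normal(0, 1, (N, N)) + 1j * rng.normal(0, 1, (N, N))
U, _ = np.linalg.qr(M)

ZU = Z @ U.T                                         # each row is one realization of U Z
print(np.allclose(np.linalg.norm(Z, axis=1),
                  np.linalg.norm(ZU, axis=1)))       # True: rotation preserves the norm
cov_ZU = (ZU[:, :, None] * np.conj(ZU)[:, None, :]).mean(axis=0)
print(np.round(cov_ZU, 2))                           # approximately 2 * identity, same as for Z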

3.8 WHITE RANDOM VECTOR

Let us take a random vector X = [X1, X2, X3, ..., XN]^T. If we take the correlation of any two of its components, say Xi and Xj, then E[Xi Xj] = σ² δ[i − j].
If we collect the correlations of all pairs of components into a matrix,

E[X X^T] = σ² I_N =

  [ σ²  0   0  ...  0
    0   σ²  0  ...  0
    0   0   σ² ...  0
    ...
    0   0   0  ...  σ² ]

From the above matrix, we can clearly see that any two different components of a white random vector are uncorrelated.

Why WHITE?
We know that the Fourier transform of the auto-correlation function gives the power spectral density. If we take the Fourier transform of the auto-correlation above, which is an impulse in i − j, we get a flat spectrum, because the Fourier transform of the impulse function is a constant. Since the obtained power spectral density is flat, it contains all frequency components equally (like white light), so the vector is called white.

If a jointly Gaussian vector is white and zero mean, the elements of the vector are independent and identically distributed with variance σ² and zero mean (because the joint density function depends only on the covariance matrix and mean vector, and for Gaussians uncorrelated components are independent).
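A minimal numerical check of the white-vector definition (component count, trial count, and σ² = 2 are illustrative assumptions): the sample auto-correlation matrix E[X X^T] of i.i.d. zero-mean Gaussian components comes out approximately as σ² · I.

import numpy as np

rng = np.random.default_rng(5)
N, trials, sigma2 = 4, 200_000, 2.0
X = rng.normal(0, np.sqrt(sigma2), (trials, N))   # each row is one realization of X

R = X.T @ X / trials                              # sample auto-correlation E[X X^T]
print(np.round(R, 2))                             # approximately 2 * identity (sigma^2 * I)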
