Random Processes: Version 2, ECE IIT, Kharagpur
Random Processes
• Uniform Distribution
• Binomial Distribution
• Poisson Distribution
• Gaussian Distribution
• Central Limit Theorem
• Generation of Gaussian distributed random numbers using computer
• Error function
However, let f(x) = 1/b for a < x < (a + b). It is easy to see that,

\[
\int_{-\infty}^{+\infty} f(x)\,dx = \int_{a}^{a+b} \frac{1}{b}\,dx = 1
\tag{2.7.1}
\]

The cdf is

\[
F(x) = \int_{-\infty}^{x} f(u)\,du =
\begin{cases}
0 & \text{for } x < a;\\[2pt]
\dfrac{x-a}{b} & \text{for } a < x < (a+b);\\[2pt]
1 & \text{for } x > (a+b).
\end{cases}
\tag{2.7.2}
\]
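The pdf and cdf above can be checked numerically. The sketch below is purely illustrative; the values a = 2 and b = 3 are arbitrary choices, not from the text:

```python
# Illustrative check of the uniform pdf/cdf; a and b are arbitrary choices.
a, b = 2.0, 3.0          # f(x) = 1/b on the interval (a, a + b)

def pdf(x):
    return 1.0 / b if a < x < a + b else 0.0

def cdf(x):
    if x <= a:
        return 0.0
    if x >= a + b:
        return 1.0
    return (x - a) / b

# Riemann-sum approximation of the total probability (Eq. 2.7.1),
# taken over a window slightly wider than (a, a + b).
n = 100000
dx = (b + 1.0) / n
total = sum(pdf(a - 0.5 + i * dx) * dx for i in range(n))
print(round(total, 3))   # close to 1.0
print(cdf(a + b / 2))    # midpoint of the interval gives F = 0.5
```

The Riemann sum confirms Eq. 2.7.1, and the cdf at the interval midpoint returns 0.5, as Eq. 2.7.2 predicts.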
Binomial Distribution
This distribution is associated with discrete random variables. Let ‘p’ be the
probability of an event (say ‘S’, denoting success) in a statistical experiment. Then the
probability that this event does not occur (i.e. failure, or ‘F’, occurs) is (1 – p) and, for
convenience, let q = 1 – p, i.e. p + q = 1. Now, let this experiment be conducted
repeatedly (without any fatigue, bias or partiality) ‘n’ times. The binomial distribution
tells us the probability of exactly ‘x’ successes in ‘n’ trials of the experiment (x ≤ n). The
corresponding binomial probability distribution is:
\[
f(x) = \binom{n}{x} p^{x} q^{\,n-x} = \left[\frac{n!}{x!\,(n-x)!}\right] p^{x} (1-p)^{n-x}
\tag{2.7.3}
\]

Note that

\[
\sum_{x=0}^{n} f(x) = \sum_{x=0}^{n} \binom{n}{x} p^{x} q^{\,n-x} = 1
\]
A simple form of the last term can be obtained by putting τ = 1 in the previous
expression, resulting in a fairly easy-to-remember mean of the random variable:
the mean number of successes is E(S) = np.
Following a similar approach, it can be shown that the variance of the number of
successes is
σ² = npq.
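These two results can be verified directly from the pmf of Eq. 2.7.3. A minimal sketch, in which n = 20 and p = 0.3 are arbitrary illustrative choices:

```python
from math import comb

# Check that the binomial pmf (Eq. 2.7.3) sums to 1 and that its
# mean and variance come out as np and npq.
n, p = 20, 0.3
q = 1.0 - p

pmf = [comb(n, x) * p**x * q**(n - x) for x in range(n + 1)]

total = sum(pmf)                                        # should be 1
mean = sum(x * f for x, f in enumerate(pmf))            # should be n*p = 6.0
var = sum((x - mean) ** 2 * f for x, f in enumerate(pmf))  # n*p*q = 4.2

print(round(total, 6), round(mean, 6), round(var, 6))
```

Computing the mean and variance exactly from the pmf, rather than by simulation, makes the agreement with np and npq exact up to floating-point error.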
Poisson Distribution
A formal discussion on Poisson distribution usually follows a description of
binomial distribution because the Poisson distribution can be approximately viewed as a
limiting case of binomial distribution.
‘λ’ in the above expression is the mean of the distribution. A special feature of this
distribution is that its variance is exactly equal to its mean (λ).
Poisson distribution plays an important role in traffic modeling and analysis in data
networks.
Fig. 2.7.1 shows two Gaussian pdf curves with means 0.0 and 1.5 but the same
variance 1.0. The particular curve with m = 0.0 and σ2 = 1.0 is known as the normalized
Gaussian pdf. It may be noted that a Gaussian pdf curve is symmetric about its mean value.
The mean may be positive or negative. Further, a change in the mean of a Gaussian pdf curve
only shifts the curve horizontally without any change in shape of the curve. A smaller
variance, however, increases the sharpness of the peak value of the pdf which always
occurs at the average value of the random variable. This explains the significance of
‘variance’ of a distribution. A smaller variance means that a random variable mostly
assumes values close to its expected or mean value.
Fig.2.7.1. Gaussian pdf with mean = 0.0 and variance = 1.0 and Gaussian pdf with
mean = 1.5 and variance = 1.0
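The symmetry and peak-sharpness properties described above can be checked directly from the Gaussian pdf formula. A minimal sketch, using the (m, σ²) pairs of Fig. 2.7.1:

```python
from math import exp, pi, sqrt

# Gaussian pdf with mean m and variance var.
def gauss_pdf(x, m=0.0, var=1.0):
    return exp(-(x - m) ** 2 / (2 * var)) / sqrt(2 * pi * var)

# Symmetry about the mean: f(m + d) equals f(m - d); here m = 1.5, d = 0.5.
print(abs(gauss_pdf(2.0, m=1.5) - gauss_pdf(1.0, m=1.5)) < 1e-12)

# A smaller variance sharpens the peak, which always occurs at the mean.
print(gauss_pdf(0.0, var=0.25) > gauss_pdf(0.0, var=1.0))
```

Both checks print True: the curve is mirror-symmetric about its mean, and shrinking the variance raises the peak value at the mean.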
The Central Limit Theorem is very useful in modeling and analyzing several
situations in the study of electrical communications. However, one necessary condition to
look for before invoking the Central Limit Theorem is that no single random variable should
make a significant contribution to the sum random variable.
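The theorem is easy to see empirically: a sum of many independent, identically distributed uniform variables is approximately Gaussian. A minimal sketch, in which k = 12 terms per sum, the trial count, and the seed are arbitrary illustrative choices:

```python
import random
from math import sqrt

# Sum k independent Uniform(0,1) variables; by the Central Limit Theorem
# the standardized sum is approximately N(0, 1) for moderate k.
random.seed(1)
k, trials = 12, 50000
samples = []
for _ in range(trials):
    s = sum(random.random() for _ in range(k))
    # Uniform(0,1) has mean 1/2 and variance 1/12, so standardize the sum.
    samples.append((s - k * 0.5) / sqrt(k / 12.0))

mean = sum(samples) / trials
var = sum((z - mean) ** 2 for z in samples) / trials
print(round(mean, 2), round(var, 2))   # close to 0.0 and 1.0
```

No single uniform term dominates the sum, so the necessary condition mentioned above is satisfied and the standardized sum behaves like a normalized Gaussian variable.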
A faster and more precise way of generating Gaussian random variables is
known as the Box-Muller method. Only two uniformly distributed random variables, say
X1 and X2, are needed to generate two (almost) independent and identically distributed
Gaussian random variables (N1 and N2). If x1 and x2 are two uncorrelated values assigned
to X1 and X2 respectively, the two independent Gaussian distributed values n1 and n2 are
obtained from the following relations:
n1 = √[ −2·ln(x1) ]·cos(2πx2)
n2 = √[ −2·ln(x1) ]·sin(2πx2)
2.7.7
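A minimal sketch of the standard Box-Muller transform, in which both outputs share the common radius term √(−2·ln x1); the sample count and seed are arbitrary illustrative choices:

```python
import random
from math import log, cos, sin, pi, sqrt

# Box-Muller transform: two independent Uniform(0,1) samples x1, x2
# yield two independent N(0,1) samples n1, n2.
def box_muller(x1, x2):
    r = sqrt(-2.0 * log(x1))       # common radius term, from x1 only
    return r * cos(2 * pi * x2), r * sin(2 * pi * x2)

random.seed(7)
samples = []
for _ in range(20000):
    x1 = 1.0 - random.random()     # in (0, 1], avoids log(0)
    x2 = random.random()
    n1, n2 = box_muller(x1, x2)
    samples.extend((n1, n2))

m = sum(samples) / len(samples)
v = sum((z - m) ** 2 for z in samples) / len(samples)
print(round(m, 2), round(v, 2))    # close to 0.0 and 1.0
```

The sample mean and variance come out near 0 and 1, consistent with generating standard Gaussian variables from pairs of uniform ones.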
It may be noted that the above definite integral is not easy to calculate
analytically, and tables of approximate values of the error function for various
arguments ‘x’ are readily available. A few properties of the error function erf(x)
are noted below:
\[
\operatorname{erf}\!\left(\frac{x}{\sqrt{2}\,\sigma_x}\right) = \frac{2}{\sqrt{\pi}} \int_{0}^{x/(\sqrt{2}\,\sigma_x)} e^{-z^{2}}\,dz
\tag{2.7.9}
\]
The complementary error function is directly related to the error function erf(x):
\[
\operatorname{erfc}(u) = \frac{2}{\sqrt{\pi}} \int_{u}^{\infty} \exp(-z^{2})\,dz = 1 - \operatorname{erf}(u)
\tag{2.7.10}
\]
\[
\frac{\exp(-v^{2})}{\sqrt{\pi}\,v}\left(1 - \frac{1}{2v^{2}}\right) < \operatorname{erfc}(v) < \frac{\exp(-v^{2})}{\sqrt{\pi}\,v}
\tag{2.7.11}
\]
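These bounds on erfc(v) can be verified with the standard-library implementation of the complementary error function; the test values of v below are arbitrary illustrative choices:

```python
from math import erfc, exp, pi, sqrt

# Check the upper and lower bounds of Eq. 2.7.11 for a few values of v > 0.
for v in (0.5, 1.0, 2.0, 3.0):
    base = exp(-v * v) / (sqrt(pi) * v)
    lower = base * (1.0 - 1.0 / (2.0 * v * v))
    upper = base
    print(v, lower < erfc(v) < upper)   # True for each v
```

The upper bound is already tight for moderate v (within about 10% at v = 2), which is why it is commonly used to approximate tail probabilities.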
Problems
Q2.7.1) If a fair coin is tossed ten times, determine the probability that exactly two
heads occur.