Probability Theory and Stochastic Process (EC305ES)
Program Specific Outcomes | Level | Proficiency assessed by

PSO 1 - Professional Skills: An ability to understand the basic concepts in Electronics & Communication Engineering and to apply them to various areas, like Electronics, Communications, Signal processing, VLSI, Embedded systems etc., in the design and implementation of complex systems. | Level: 3 | Assessed by: Lectures and Assignments

PSO 2 - Problem-Solving Skills: An ability to solve complex Electronics and Communication Engineering problems, using latest hardware and software tools, along with analytical skills to arrive at cost-effective and appropriate solutions. | Level: 3 | Assessed by: Tutorials

PSO 3 - Successful Career and Entrepreneurship: An understanding of social awareness & environmental wisdom along with ethical responsibility to have a successful career and to sustain passion and zeal for real-world applications using optimal resources as an Entrepreneur. | Level: 2 | Assessed by: Seminars and Projects

1: Slight (Low)   2: Moderate (Medium)   3: Substantial (High)   -: None
VII. COURSE SYLLABUS:
UNIT – I
Probability & Random Variable: Probability introduced through Sets and Relative Frequency:
Experiments and Sample Spaces, Discrete and Continuous Sample Spaces, Events, Probability
Definitions and Axioms, Joint Probability, Conditional Probability, Total Probability, Bayes'
Theorem, Independent Events, Random Variable- Definition, Conditions for a Function to be a
Random Variable, Discrete, Continuous and Mixed Random Variable, Distribution and Density
functions, Properties, Binomial, Poisson, Uniform, Gaussian, Exponential, Rayleigh, Methods of
defining Conditioning Event, Conditional Distribution, Conditional Density and their Properties.
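The Unit I ideas of total probability and Bayes' theorem can be illustrated with a short sketch; the priors and likelihoods below are made-up numbers for demonstration, not values taken from the text.

```python
# Hedged illustration of total probability and Bayes' theorem (Unit I).
# The priors and likelihoods below are made-up numbers for demonstration.

def bayes_posterior(priors, likelihoods):
    """Return the posterior P(H_i | E) for each hypothesis.

    priors[i]      = P(H_i)        (should sum to 1)
    likelihoods[i] = P(E | H_i)
    """
    # Total probability: P(E) = sum_i P(E | H_i) P(H_i)
    p_e = sum(p * l for p, l in zip(priors, likelihoods))
    # Bayes' theorem: P(H_i | E) = P(E | H_i) P(H_i) / P(E)
    return [p * l / p_e for p, l in zip(priors, likelihoods)]

# Example: a binary source where symbol 0 is sent with probability 0.6
post = bayes_posterior([0.6, 0.4], [0.9, 0.2])
```

The posterior list always sums to 1, since the total-probability term normalizes it.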
UNIT – II
Operations On Single & Multiple Random Variables – Expectations: Expected Value of a
Random Variable, Function of a Random Variable, Moments about the Origin, Central Moments,
Variance and Skew, Chebychev’s Inequality, Characteristic Function, Moment Generating
Function, Transformations of a Random Variable: Monotonic and Non-monotonic
Transformations of Continuous Random Variable, Transformation of a Discrete Random
Variable. Vector Random Variables, Joint Distribution Function and its Properties, Marginal
Distribution Functions, Conditional Distribution and Density – Point Conditioning, Conditional
Distribution and Density – Interval conditioning, Statistical Independence. Sum of Two Random
Variables, Sum of Several Random Variables, Central Limit Theorem (proof not expected),
Unequal Distribution, Equal Distributions. Expected Value of a Function of Random Variables:
Joint Moments about the Origin, Joint Central Moments, Joint Characteristic Functions, Jointly
Gaussian Random Variables: Two Random Variables case, N Random Variable case, Properties,
Transformations of Multiple Random Variables, Linear Transformations of Gaussian Random
Variables.
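The Central Limit Theorem listed above can be observed numerically: the standardized sum of many independent uniform variables has moments close to those of a standard Gaussian. The term and sample counts below are arbitrary illustrative choices.

```python
# Minimal numerical illustration of the Central Limit Theorem (Unit II)
# using only the standard library.
import math
import random

def standardized_sum_samples(n_terms=30, n_samples=20000, seed=1):
    rng = random.Random(seed)
    # Uniform(0,1): mean 1/2, variance 1/12
    mu, var = 0.5, 1.0 / 12.0
    out = []
    for _ in range(n_samples):
        s = sum(rng.random() for _ in range(n_terms))
        # Standardize: (S - n*mu) / sqrt(n*var)
        out.append((s - n_terms * mu) / math.sqrt(n_terms * var))
    return out

z = standardized_sum_samples()
mean = sum(z) / len(z)
var = sum(v * v for v in z) / len(z) - mean ** 2
# The CLT predicts mean close to 0 and variance close to 1.
```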
UNIT – III
Random Processes – Temporal Characteristics: The Random Process Concept, Classification of
Processes, Deterministic and Nondeterministic Processes, Distribution and Density Functions,
concept of Stationarity and Statistical Independence. First-Order Stationary Processes,
Second-Order and Wide-Sense Stationarity, (N-Order) and Strict-Sense Stationarity, Time
Averages and Ergodicity, Mean-Ergodic Processes, Correlation-Ergodic Processes,
Autocorrelation Function and Its Properties, Cross-Correlation Function and Its Properties,
Covariance Functions, Gaussian Random Processes, Poisson Random Process. Random Signal
Response of Linear Systems: System Response – Convolution, Mean and Mean-squared Value of
System Response, autocorrelation Function of Response, Cross-Correlation Functions of Input and
Output.
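Time averages and correlation-ergodicity from Unit III can be checked numerically for the classic random-phase sinusoid X(t) = A cos(w0 t + θ): the time-average autocorrelation equals (A²/2) cos(w0 τ) regardless of the realized phase. The amplitude, frequency, phase, and lag below are arbitrary.

```python
# Sketch (Unit III): time-average autocorrelation of one realization of
# X(t) = A cos(w0 t + theta), approximated by a Riemann sum over [0, T].
# For a correlation-ergodic process this matches the ensemble value
# (A^2 / 2) cos(w0 * tau) for any fixed theta.
import math

def time_avg_autocorr(amp, w0, theta, tau, T=2000.0, n=200000):
    dt = T / n
    acc = 0.0
    for k in range(n):
        t = k * dt
        acc += (amp * math.cos(w0 * t + theta)) * \
               (amp * math.cos(w0 * (t + tau) + theta))
    return acc * dt / T

r = time_avg_autocorr(amp=2.0, w0=1.0, theta=0.7, tau=0.5)
# theory: (A^2 / 2) cos(w0 * tau) = 2 cos(0.5) ≈ 1.755
```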
UNIT – IV
Random Processes – Spectral Characteristics: The Power Spectrum: Properties, Relationship
between Power Spectrum and Autocorrelation Function, The Cross-Power Density Spectrum,
Properties, Relationship between Cross-Power Spectrum and Cross-Correlation Function. Spectral
Characteristics of System Response: Power Density Spectrum of Response, Cross-Power Density
Spectrums of Input and Output.
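The Unit IV relationship between the power spectrum and the autocorrelation function (the Wiener-Khinchin relation) can be verified for the standard pair R(τ) = e^(-a|τ|) ↔ S(w) = 2a/(a² + w²); the values of a and w below are arbitrary.

```python
# Sketch (Unit IV): the power spectrum is the Fourier transform of the
# autocorrelation function. For the even function R(tau) = exp(-a|tau|),
# S(w) = 2 * integral_0^inf exp(-a*tau) cos(w*tau) d tau = 2a/(a^2 + w^2).
import math

def psd_from_autocorr(a, w, L=60.0, n=120000):
    # Midpoint-rule approximation of the (truncated) transform integral.
    dt = L / n
    acc = 0.0
    for k in range(n):
        tau = (k + 0.5) * dt
        acc += math.exp(-a * tau) * math.cos(w * tau)
    return 2.0 * acc * dt

s = psd_from_autocorr(a=1.0, w=2.0)
# closed form: 2a / (a^2 + w^2) = 2/5 = 0.4
```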
UNIT – V
Noise Sources & Information Theory: Resistive/Thermal Noise Source, Arbitrary Noise
Sources, Effective Noise Temperature, Noise equivalent bandwidth, Average Noise Figures,
Average Noise Figure of cascaded networks, Narrow Band noise, Quadrature representation of
narrow band noise & its properties. Entropy, Information rate, Source coding: Huffman coding,
Shannon Fano coding, Mutual information, Channel capacity of discrete channel, Shannon-Hartley
law; Trade -off between bandwidth and SNR.
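Entropy and Huffman source coding from Unit V can be sketched briefly; the symbol probabilities below are made-up, and the code only tracks codeword lengths (enough to check the source-coding bound H ≤ L_avg < H + 1).

```python
# Sketch (Unit V): source entropy and Huffman codeword lengths built by
# repeatedly merging the two least-probable nodes with a heap.
import heapq
import math

def entropy(probs):
    # H = -sum p log2 p  (bits/symbol)
    return -sum(p * math.log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Codeword length of each symbol; every merge adds one bit to members."""
    heap = [(p, [i]) for i, p in enumerate(probs)]
    lengths = [0] * len(probs)
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, s1 = heapq.heappop(heap)
        p2, s2 = heapq.heappop(heap)
        for i in s1 + s2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, s1 + s2))
    return lengths

p = [0.4, 0.3, 0.2, 0.1]          # illustrative source probabilities
H = entropy(p)
L_avg = sum(pi * li for pi, li in zip(p, huffman_lengths(p)))
# Source coding theorem: H <= L_avg < H + 1
```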
TEXT BOOKS:
1. Probability, Random Variables & Random Signal Principles - Peyton Z. Peebles, TMH, 4th
Edition, 2001.
2. Principles of Communication systems by Taub and Schilling (TMH), 2008
REFERENCE BOOKS:
1. Random Processes for Engineers - Bruce Hajek, Cambridge University Press, 2015.
2. Probability, Random Variables and Stochastic Processes - Athanasios Papoulis and S. Unnikrishna Pillai, PHI, 4th Edition, 2002.
3. Probability, Statistics & Random Processes - K. Murugesan, P. Guruswamy, Anuradha Agencies, 3rd Edition, 2003.
4. Signals, Systems & Communications - B.P. Lathi, B.S. Publications, 2003.
5. Statistical Theory of Communication - S.P. Eugene Xavier, New Age Publications, 2003.
IES SYLLABUS & GATE SYLLABUS: Not Applicable
CASE STUDIES:
1. Study Of STOCHASTIC PROCESS – TEMPORAL CHARACTERISTICS
2. Study Of STOCHASTIC PROCESS – SPECTRAL CHARACTERISTICS
3. Study of Noise.
VIII. COURSE PLAN (WEEK-WISE)
The course will proceed as follows for all sections. Please note that the week and the classes
in each week are relative to each section.
Lecture Class | Week | Topic | Reference

UNIT-I: Probability (Reference: Peebles)
Week 1:
1. Probability introduced through sets and relative frequency
2. Experiments and sample spaces, discrete and continuous sample spaces
3. Events, probability definitions and axioms
4. Events, probability definitions and axioms
5. Mathematical model of experiments
Week 2:
6. Probability as relative frequency
7. Joint probability, conditional probability
8. Total probability, Bayes' theorem, independent events
9. Random variable: definition
Week 3:
10. Conditions for a function to be a random variable (Reference: Peebles)
11. Discrete random variable
12. Continuous and mixed random variable
UNIT-II: Operations on one random variable
Week 4:
13. Distribution and density functions
14. Properties
ANALYTICAL QUESTIONS
1. The joint PDF of X and Y is f_X,Y(x, y) = 5y/4 for -1 ≤ x ≤ 1, x² ≤ y ≤ 1, and 0 otherwise. Find the marginal PDFs f_X(x) and f_Y(y). (Analysis, 2)
2. The joint probability density function of random variables X and Y is f_X,Y(x, y) = 6(x + y²)/5 for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and 0 otherwise. (Analysis, 2)
3. X and Y have joint PDF f_X,Y(x, y) = 1/15 for 0 ≤ x ≤ 5, 0 ≤ y ≤ 3, and 0 otherwise. Find the PDF of W = max(X, Y). (Analysis, 2)
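As a check on analytical question 1 above: integrating f(x, y) = 5y/4 over y from x² to 1 gives the marginal f_X(x) = (5/8)(1 - x⁴), which must integrate to 1 over [-1, 1]. A quick numerical sketch:

```python
# Verify that the marginal derived for question 1 is a valid PDF.
# f_X(x) = integral_{x^2}^{1} (5y/4) dy = (5/8)(1 - x^4) on [-1, 1].

def fX(x):
    return (5.0 / 8.0) * (1.0 - x ** 4)

def integrate(f, a, b, n=100000):
    # simple midpoint rule
    h = (b - a) / n
    return sum(f(a + (k + 0.5) * h) for k in range(n)) * h

total = integrate(fX, -1.0, 1.0)   # should be very close to 1
```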
UNIT IV
Stochastic Processes – Temporal Characteristics:
SHORT ANSWER TYPE QUESTIONS
1. Define random process. (Remember, 2)
2. Define ergodicity. (Remember, 2)
3. Define mean ergodic process. (Remember, 2)
4. Define correlation ergodic process. (Remember, 2)
ANALYTICAL QUESTIONS
[Question stem and option (a) not reproduced in the source]
(b) e^(-b) Σ_{k=0}^{∞} b^k u(x - k)
(c) e^(-b) Σ_{k=0}^{∞} b^k / k!
(d) e^(-b) Σ_{k=0}^{∞} (b^k / k!) u(x - k)
[Question stem and option (a) not reproduced in the source]
(b) Σ_{i=1}^{∞} P(x_i) u(x - x_i)
(c) Σ_{i=1}^{∞} P(x_i) u(x_i)
(d) Σ_{i=1}^{∞} P(x_i) u(x_i)
6. The distribution function of a Gaussian RV is [ ]
(a) F(x/σ_X) (b) F((x + a_X)/σ_X) (c) F((x - a_X)/σ_X) (d) F(x - a_X)
7. The uniform probability density function is defined by [ ]
(a) ab (b) 1/(b - a) for a ≤ x ≤ b (c) 1/(b + a) (d) b/a
8. A continuous RV is one having [ ]
(a) Continuous range of value (b) -∞ to
(c) Containing (d) some con
9. A mixed random RV is one having [ ]
(a) Discrete values only (b) - ∞ to 0 only
(c) Both continuous and discrete (d) continuous values only
10. A real RV is defined as [ ]
(a) A real function of the elements of Sample space.
(b) A discrete function of the elements
(c) A complex function of the elements
(d) A complex function of the elements of Sample space
Fill in the blanks
11. A discrete random variable can take a …………… number of values within its range
12. A continuous random variable can take any value within its ……………
13. The probability mass function is also called …………….
14. The value of P(X=-∞) = P(X=∞) is ……………
15. The value of CDF FX (-∞) is …….. And FX(∞) is ……..
16. The PDF satisfy the relation …………..
17. A random variable with a uniform distribution is an example of …………
18. The Gaussian distribution function is defined as …………….
19. The PDF of sum of a large number of RV’s approaches a …………..
20. Probability distribution function FX(x) is ………..
Key:
1. B    11. Finite
2. D    12. Domain or range
3. D    13. PDF
4. D    14. Zero
5. B    15. 0 and 1
6. A    16. ∫_{-∞}^{∞} f(x) dx = 1
7. B    17. Continuous type
8. A    18. Continuous random variable
9. C    19. Gaussian distribution
10. A   20. P(X ≤ x)
(c) Σ_{i=1}^{∞} P(x_i) u(x_i) (d) Σ_{i=1}^{∞} P(x_i) u(x_i)
6. MGF is given by M_X(v) = [ ]
(a) E[e^v] (b) E[e^(vX)] (c) e^(vX) (d) E(e^(2x))
Answer: 1. B 2. B 3. D 4. D 5. B 6. B
7. C 8. C 9. D 10. D
11. 1
12. ∫_{-∞}^{∞} (x - x̄)^n f_X(x) dx
13. Variance
14. F((x - a_X)/σ_X)
15. f_X(x) dx/dy
16. Σ_{i=1}^{N} x_i P(x_i)
17. 1/(b - a) for a ≤ x ≤ b
18. Binomial coefficient
19. Σ_{i=1}^{N} g(x_i) P(x_i)
20. a_X²
(a)-(d): [equation options not reproduced in the source]
3. The (n + K)th order joint moment of two R.V's X and Y is defined as
(a)-(d): [equation options not reproduced in the source]
4. The (n + K)th order joint central moment of the R.V's X and Y is defined as
(a)-(d): [equation options not reproduced in the source]
5. Which of the following Relation is correct
(a)-(d): [equation options not reproduced in the source]
6. The joint moments can be found from the joint characteristic function as
(a)-(d): [equation options not reproduced in the source]
[Questions 7-8 and their equation options not reproduced in the source]
9. The expression for the joint characteristic function of two RVs X and Y in integral form is Φ_X,Y(w1, w2) =
(a)-(d): [equation options not reproduced in the source]
10. The expression for joint characteristic function Ф X,Y (w1, w2) is recognized as the two
dimensional
(a) Fourier transform of joint density function
(b) Fourier transform of joint distribution function
(c) Inverse Fourier transform of joint distribution function
(d) Inverse Fourier transform of joint density function
11. The marginal density function of X is given by fX(x) =
(a)-(d): [equation options not reproduced in the source]
12. X and Y are Gaussian RVs with variances [values missing in source]. Then the RVs V = X + KY and W = X - KY are statistically independent for K equal to
(a)-(d): [equation options not reproduced in the source]
15. Gaussian R.V's are completely defined through only their
(a) first order moments (b) Second order moments
(c) First order moments & Second order moments (d) Covariance
16. X and Y are two independent normal RVs N(m, σ²) = N(0, 4). Consider V = 2X + 3Y; then V is a ___ RV.
(a) Rayleigh (b) Gaussian (c) Poisson (d) Binomial
17. If FX(x1, x2 : t1, t2) is referred to as second order joint distribution function then the
corresponding joint density function is
(a)-(d): [equation options not reproduced in the source]
18. The mean of random process X(t) is the expected value of the R.V X at time t i.e., the
mean m(t) =
24. X and Y are RVs transformed as [equation missing in source] and y = V. The Jacobian of this transformation is
(c)-(d): [equation options not reproduced in the source]
10. A stationary continuous process X(t) with auto-correlation function RXX(τ) is called
autocorrelation-ergodic or ergodic in the autocorrelation if, and only if, for all τ
(a)-(d): [equation options not reproduced in the source]
11. A random process is defined as X(t) = A cos(wt + θ), where w and θ are constants and A is a random variable. Then X(t) is stationary if
(a) F(A) = 2 (b) F(A) = 0
(c) A is Gaussian with non-zero mean (d) A is Rayleigh with non-zero mean
12. For an ergodic process
(a) mean is necessarily zero (b) mean square value is infinity
(c) all time averages are zero (d) mean square value is independent of time
13. Two processes X(t) and Y(t) are statistically independent if
(a)-(d): [equation options not reproduced in the source]
14. Let X(t) be a random process which is wide-sense stationary; then
(a) E[X(t)] = constant (b) E[X(t) X(t + τ)] = RXX(τ)
(c) E[X(t)] = constant and E[X(t) X(t + τ)] = RXX(τ) (d) E[X²(t)] = 0
15. A process stationary to all orders N = 1, 2, …, for Xi = X(ti), i = 1, 2, …, N, is called
(a) Strict-sense stationary (b) wide-sense stationary
(c) Strictly stationary (d) independent
16. Time average of a quantity x(t) is defined as A[x (t)] =
(a)-(b): [equation options not reproduced in the source]
(c) mean is 0 (d) constant mean
Answers: 1.C 2.C 3.A 4.B 5.A 6.C
7. C 8.B 9.C 10.B 11.B 12.D
13.D 14.C 15.A 16.C 17.A 18.A
19.B 20.C 21.D 22.C 23.D 24.A
25.B
Unit-VII: STOCHASTIC PROCESSES-SPECTRAL CHARACTERISTICS
1. The output of a filter is given by Y(t) = X(t + T) - X(t - T), where X(t) is a WSS process
with power spectrum SXX(w) and T is a constant. The power spectrum of Y(t) is
(a) (b)
(c) (d)
2. The PSD of a random process whose autocorrelation function is [equation missing in source] is
(a)-(d): [equation options not reproduced in the source]
6. The average power of the above-described random process is
(a)-(d): [equation options not reproduced in the source]
9. PSD of a WSS is always
(a) Negative (b) non-negative
(c) Positive (d) can be negative or positive
10. The average power of the periodic random process signal whose auto correlation function
is
(a) 0 (b) 1 (c) 2 (d) 3
11. If [equations missing in source] and X(t) and Y(t) are of zero mean, then let U(t) = X(t) + Y(t). Then SXU(w) is
(a)-(d): [equation options not reproduced in the source]
13. A random process is given by Z(t) = A X(t) + B Y(t), where A and B are real constants
and X(t) and Y(t) are jointly WSS processes. The power spectrum SZZ(w) is
(a) AB SXY(w) + AB SYX(w) (b) A² + B² + AB SXY(w) + AB SYX(w)
(c) A² SXX(w) + AB SXY(w) + AB SYX(w) + B² SYY(w)
(d) 0
14. A random process is given by Z(t) = A X(t) + B Y(t), where A and B are real constants
and X(t) and Y(t) are jointly WSS processes. If X(t) and Y(t) are uncorrelated then SZZ(w) is
(a) A² + B² (b) A² SXX(w) + B² SYY(w)
(c) AB SXY(w) + AB SYX(w) (d) 0
15. PSD is _________ function of frequency
(a) even (b) odd (c) periodic (d) asymmetric
16. For a WSS process, PSD at zero frequency gives
(a) The area under the graph of power spectral density (b) The area under the graph of autocorrelation of the process
(c) Mean of the process (d) Variance of the process
17. The mean square value of a WSS process equals
(a) The area under the graph of PSD (b) The area under the graph of autocorrelation of the process
(c) Mean of the process (d) Zero
18. X(t) = A cos(w0 t + θ), where A and w0 are constants and θ is a RV uniformly
distributed over (0, π). The average power of X(t) is
(a)-(d): [equation options not reproduced in the source]
22. If X(t) and Y(t) are uncorrelated and of constant means E(X) and E(Y), respectively then
SXY (w) is
(a) E(X) E(Y) (b) 2E(X) E(Y) δ (w)
(a)-(d): [equation options not reproduced in the source]
25. Time average of the cross-correlation function and the cross spectral density function form a
________ pair
(a) Laplace Transform (b)Z-Transform (c) Fourier Transform (d) Convolution
Answer:
1.D 2.B 3.C 4.D 5.C 6.B
7. A 8.A 9.B 10.B 11.D 12.D 13.C
14.B 15.A 16.B 17.A 18.B 19.A
20.C 21.D 22.C 23.B 24.A
25.C
UNIT – VIII: NOISE
1. A TV receiver has a 4 kΩ input resistance and operates in a frequency range of 54-56
MHz. At an ambient temperature of 27°C, the RMS thermal noise voltage at the input of the
receiver is
(a) 25 μV (b) 19.9 μV (c) 14.8 μV (d) 22 μV
2. The RMS noise voltage across a 2 μF capacitor over the entire frequency band, when the
capacitor is shunted by a 2 kΩ resistor maintained at 300 K, is
(a) 0.454 μV (b) 4.54 μV (c) 45.4 μV (d) 0.0454 μV
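Question 2 is the classic kT/C-noise result: integrated over all frequencies, the RMS noise voltage across the capacitor is √(kT/C), independent of the shunt resistance. A quick check, consistent with option (d):

```python
# kT/C noise: over all frequencies the RMS noise voltage across a capacitor
# shunted by any resistor is sqrt(k*T/C); the resistance cancels out.
import math

k_B = 1.38e-23          # Boltzmann constant, J/K
T = 300.0               # temperature, K (from the question)
C = 2e-6                # capacitance, F (from the question)
v_rms = math.sqrt(k_B * T / C)   # ≈ 4.55e-8 V, i.e. about 0.0455 uV
```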
3. When noise is mixed with a sinusoid the amplitude and PSD of the resulting noise
component becomes & of the original respectively
(a) Same as that of original (b) Half, half
(c) Half, one-third (d) half, one-fourth
4. The available noise power per unit bandwidth at the input of an antenna with a noise
temperature of 150 K, feeding into a microwave amplifier with Te = 200 K, is
(a) 483 × 10^-23 W (b) 4.83 × 10^-23 W (c) 48.3 × 10^-23 W (d) 483 W
5. Two resistor with resistances R1 and R2 are connected in parallel and have physical
temperatures T1 and T2 respectively. The effective noise temperature TS of an equivalent
resistor with resistance equal to the parallel combination R1 and R2 is
600 K (output stage). The available power gain of the first stage is 10 and the overall input
effective noise temperature is 190 K. The available power gain of the second stage and the
cascade's noise figure are, respectively,
(a) 12, 1.655 (b) 1.655, 12 (c) 14,3.65 (d) 3.65, 14
12. In an amplifier, the first stage in a cascade of 5 stages has Te1 = 75 K and G1 = 0.5. Each
succeeding stage has an effective input noise temperature and an available power gain
that are each 1.75 times those of the stage preceding it. The cascade's effective input noise
temperature is
(a) 50.26 K (b) 500.26 K (c) 400.26 K (d) 550.26 K
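Cascade problems like 11 and 12 use the Friis formula, Te = Te1 + Te2/G1 + Te3/(G1·G2) + …; the stage values in this sketch are illustrative only and are not intended to reproduce the keyed answer.

```python
# Hedged sketch of the Friis formula for cascaded stages:
# Te_cascade = Te1 + Te2/G1 + Te3/(G1*G2) + ...
def cascade_noise_temperature(te, g):
    """te[i], g[i]: effective input noise temperature and available gain of stage i."""
    total, gain_product = 0.0, 1.0
    for te_i, g_i in zip(te, g):
        total += te_i / gain_product   # each stage referred to the input
        gain_product *= g_i
    return total

# Illustrative three-stage example (Te values in kelvin, gains dimensionless):
Te = cascade_noise_temperature([75.0, 131.25, 229.6875], [0.5, 0.875, 1.53125])
# = 75 + 131.25/0.5 + 229.6875/(0.5*0.875) = 75 + 262.5 + 525 = 862.5 K
```

Note how a lossy first stage (G1 < 1) amplifies the noise contribution of every later stage.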
13. The total available output noise power spectral density for a noisy two port network is
Ga0=
(a) ga(f)(T0 + Te)
(b)-(d): [equation options not reproduced in the source]
14. The noise present at the input of a two-port network is 1 μW. The noise figure of the
network is 0.5 dB and its gain is 10^10. The available noise power contributed by the two-port
is
(a) 1.22 kW (b) 12.2 kW (c) 122 kW (d) 1220 kW
15. For the above problem, the available output noise power is
(a) 12.2 kW (b) 11.2 kW (c) 122 kW (d) 112 kW
16. If ga(f) is the available gain of the network, then the noise figure in terms of Te (effective
noise temperature) and T0 is
Answer:
1.B 2.D 3.D 4.C 5.B 6.C
7. C 8.D 9.A 10.A 11.D 12.C
13.C 14.A 15.B 16.B 17.A 18.D
19.A 20.B
WEBSITES
1. http://higheredbcs.wiley.com/legacy/college/yates/0471272140/quiz_sol/quiz_sol.pdf
2. http://www.egwald.ca/statistics/
3. http://math.niu.edu/~rusin/known-math/index/60-XX.html
4. http://www.math.harvard.edu/~knill/teaching/math144_1994/probability.pdf
JOURNALS
1. An International Journal of Probability and Stochastic Processes (ISSN: 1744-2508)
2. Journal of Theoretical Probability (ISSN: 0894-9840)
3. Probability Theory and Related Fields (ISSN: 0178-8051)
4. Theory of Probability and Its Applications (ISSN: 0040-585X)
STUDENTS SEMINAR TOPICS
1. Set theory principles.
2. Examples of random variables, CDF and PDF.
3. Conditional distributions and density functions and properties.
4. Mean, variance, MGF and characteristic functions of Rayleigh random variable.
5. Interval and point conditioning of distribution and density function.
6. Unequal Distribution and Equal Distributions.
7. Linear transformation of Gaussian random variables.
8. Distinguish between random variables and random process.
9. Product Device Response to a random signal.
10. Average Noise Figures, Average Noise Figure of cascaded networks.