
Lecture 1 20240304v5

The document discusses random processes and probability theory. It covers several methods for calculating pi, including geometric methods, infinite series, probabilistic methods using Monte Carlo simulations, and computational algorithms. It also provides examples of problems involving randomness in electrical engineering applications.


Special Topics in Communication Networks
Lecture 1

Why is the understanding of randomness important?
Why should we study probability?
How can we find π?
1. Geometric Methods
• Circumscribed and Inscribed Polygons: Archimedes used this method with a 96-sided polygon to estimate π as lying between 3.1408 and 3.1429.
2. Infinite Series
• Leibniz Formula for π: π = 4 × (1 − 1/3 + 1/5 − 1/7 + 1/9 − ⋯), a simple yet slow-converging series.
• Nilakantha's Series: π = 3 + 4/(2×3×4) − 4/(4×5×6) + 4/(6×7×8) − ⋯, a faster-converging series from the 15th century.
3. Probabilistic Methods
• Monte Carlo Simulations: Randomly generate points in a unit square; the ratio of points that fall inside the quarter circle to the total number of points approaches π/4.
4. Analytical Methods
• Buffon's Needle: a probability-based method involving dropping needles on a surface marked with parallel lines and observing how often the needles cross the lines.
5. Computational Algorithms
• Machin-like Formulas: formulas that express π as sums of arctangents, e.g. π/4 = 4 atan(1/5) − atan(1/239), which can be computed to high precision using Taylor series expansions.
• BBP (Bailey–Borwein–Plouffe) Formula: allows the calculation of the nth binary digit of π without needing to calculate the preceding n−1 digits, facilitating high-precision calculations of π on computers.
6. Using Built-in Mathematical Functions
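The Monte Carlo method in item 3 can be sketched in a few lines of Python (a minimal sketch; the function name and sample count are illustrative):

```python
import random

def estimate_pi(n_points: int, seed: int = 0) -> float:
    """Estimate pi by sampling uniform points in the unit square and
    counting the fraction that lands inside the quarter circle x^2 + y^2 <= 1."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_points):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # The quarter circle has area pi/4, the unit square has area 1.
    return 4.0 * inside / n_points

print(estimate_pi(1_000_000))  # close to 3.14159
```

The error shrinks only like 1/√n, which is why Monte Carlo is the slowest of the listed methods for computing π itself, even though it illustrates the probabilistic idea well.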
Buffon's Needle

• Suppose we have a floor made of parallel strips of wood, each the same width, and we drop a needle onto the floor. What is the probability that the needle will lie across a line between two strips?
Solution: Buffon's Needle
• Let x be the distance from the center of the needle to the closest parallel line, and let θ be the acute angle between the needle and one of the parallel lines. Let t be the strip width and l the needle length.
• The uniform probability density function (PDF) of x between 0 and t/2 is
  f_X(x) = 2/t for 0 ≤ x ≤ t/2, and 0 otherwise.
• The uniform PDF of θ between 0 and π/2 is
  f_Θ(θ) = 2/π for 0 ≤ θ ≤ π/2, and 0 otherwise.
• The two random variables x and θ are independent, so the joint PDF is the product:
  f_{X,Θ}(x, θ) = 4/(tπ) for 0 ≤ x ≤ t/2 and 0 ≤ θ ≤ π/2, and 0 otherwise.
• The needle crosses a line if x ≤ (l/2) sin θ.
• When l ≤ t,
  p = ∫_{θ=0}^{π/2} ∫_{x=0}^{(l/2) sin θ} 4/(tπ) dx dθ = (2/π)·(l/t)
• How about l > t?
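The short-needle result p = (2/π)(l/t) can be checked by direct simulation of the (x, θ) model above (a sketch; names and sample sizes are illustrative):

```python
import math
import random

def buffon_crossing_probability(l: float, t: float, n: int, seed: int = 0) -> float:
    """Simulate dropping a needle of length l on strips of width t (l <= t)
    and return the observed crossing frequency."""
    rng = random.Random(seed)
    crossings = 0
    for _ in range(n):
        x = rng.uniform(0, t / 2)            # distance from needle center to nearest line
        theta = rng.uniform(0, math.pi / 2)  # acute angle between needle and the lines
        if x <= (l / 2) * math.sin(theta):
            crossings += 1
    return crossings / n

l, t = 1.0, 2.0
print(buffon_crossing_probability(l, t, 500_000))  # close to 2/pi * (l/t) = 1/pi
```

Historically this was run in reverse: counting crossings of real needles gives an empirical estimate of π via π ≈ 2l/(t·p̂).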
How to define the capacity of trunk lines?

• Blocked calls cleared
• Calls arrive as determined by a Poisson process
• The duration of the time that a user occupies a channel is exponentially distributed
• How many users can be supported at 2% blocking probability with ten trunked channels in a blocked-calls-cleared system?
• How many users can be supported at 2% blocking probability with ten trunked channels, given an average calling time of 2 minutes per hour?
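Under the Poisson-arrival, exponential-holding-time, blocked-calls-cleared assumptions above, the blocking probability is given by the Erlang B formula. A sketch in Python, using the standard Erlang B recursion and a bisection search for the capacity (function names are mine):

```python
def erlang_b(traffic: float, channels: int) -> float:
    """Blocking probability for `traffic` Erlangs offered to `channels`
    trunked channels (blocked calls cleared), via the standard recursion
    B_k = A*B_{k-1} / (k + A*B_{k-1}), B_0 = 1."""
    b = 1.0
    for k in range(1, channels + 1):
        b = traffic * b / (k + traffic * b)
    return b

def capacity(channels: int, gos: float) -> float:
    """Largest offered traffic whose blocking probability stays below the
    grade of service `gos`, found by bisection (blocking is monotone in traffic)."""
    lo, hi = 0.0, 10.0 * channels
    for _ in range(100):
        mid = (lo + hi) / 2
        if erlang_b(mid, channels) < gos:
            lo = mid
        else:
            hi = mid
    return lo

a = capacity(10, 0.02)       # about 5.08 Erlangs for 10 channels at 2% GOS
per_user = 2 / 60            # 2 minutes per hour = 1/30 Erlang per user
print(round(a, 2), int(a / per_user))
```

With each user offering 2/60 ≈ 0.033 Erlang, roughly 150 users can share the ten channels at 2% blocking, which is the point of trunking: far more users than channels.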
Bertrand Paradox

• Consider an equilateral triangle inscribed in a circle. Suppose a chord of the circle is chosen at random. What is the probability that the chord is longer than a side of the triangle?

1. A third
2. A half
3. A quarter
Solution 1: Bertrand Paradox
• Choose a point on the circumference and rotate the triangle so that the point is at one vertex.
• Choose another point on the circle and draw the chord joining it to the first point.
• For points on the arc between the endpoints of the side opposite the first point, the chord is longer than a side of the triangle.
• The length of that arc is one third of the circumference of the circle; therefore the probability that a random chord is longer than a side of the inscribed triangle is one third.
Solution 2: Bertrand Paradox
• Choose a radius of the circle and rotate the triangle so a side is perpendicular to the radius.
• Choose a point on the radius and construct the chord whose midpoint is the chosen point.
• The chord is longer than a side of the triangle if the chosen point is nearer the center of the circle than the point where the side of the triangle intersects the radius.
• Since the side of the triangle bisects the radius, it is equally probable that the chosen point is nearer or farther.
• Therefore the probability that a random chord is longer than a side of the inscribed triangle is one half.
Solution 3: Bertrand Paradox
• Choose a point anywhere within the circle and construct a chord with the chosen point as its midpoint.
• The chord is longer than a side of the inscribed triangle if the chosen point falls within a concentric circle of radius 1/2.
• The area of the smaller circle is one fourth the area of the larger circle; therefore the probability that a random chord is longer than a side of the inscribed triangle is one fourth.
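The paradox is that all three answers are correct for their respective sampling schemes: "at random" is ambiguous until the chord-selection procedure is fixed. A quick simulation of the three procedures makes this concrete (a sketch; function names are illustrative):

```python
import math
import random

rng = random.Random(1)
R = 1.0
side = math.sqrt(3) * R   # side length of the inscribed equilateral triangle
n = 200_000

def chord_from_endpoints():
    """Solution 1: chord between two uniform points on the circumference."""
    a, b = rng.uniform(0, 2 * math.pi), rng.uniform(0, 2 * math.pi)
    d = abs(a - b)
    delta = min(d, 2 * math.pi - d)          # angular separation in (0, pi]
    return 2 * R * math.sin(delta / 2)

def chord_from_radius_point():
    """Solution 2: chord through a uniform point on a random radius."""
    d = rng.uniform(0, R)                    # distance of the chord from the center
    return 2 * math.sqrt(R * R - d * d)

def chord_from_midpoint():
    """Solution 3: chord whose midpoint is uniform in the disk."""
    d = R * math.sqrt(rng.random())          # radius of a uniform point in the disk
    return 2 * math.sqrt(R * R - d * d)

results = {}
for method in (chord_from_endpoints, chord_from_radius_point, chord_from_midpoint):
    results[method.__name__] = sum(method() > side for _ in range(n)) / n

print(results)  # close to 1/3, 1/2, and 1/4 respectively
```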
Our Interest and Goals

• Study special topics in communications regarding probabilities and stochastic processes
  • Reinforcement learning
  • Queueing, etc.
• Study related tools to characterize non-deterministic signals
  • Random processes, statistics
• Study tools to characterize uncertainty
  • Probability theory, random events, random variables
Random versus Deterministic

[Figure: two plots over 0 ≤ x ≤ 10 — random noise and the deterministic line Y = 2X + 1]
Randomness around us

• At a bus stop
  • Waiting time for a bus
  • The number of people taking a bus
• At a bank
  • Waiting time for service
  • Service time
  • The number of customers waiting
• Bet/Game
  • Toss a coin: Head or Tail
  • Roll a die: 1, 2, 3, 4, 5, 6
• Shooting game
Typical Electrical Engineering Problems (1/3)
• A noisy binary communication channel
  • The channel can be twisted pair, coaxial cable, fiber optic cable, or a wireless medium.
  • The channel introduces noise and thereby bit errors.

  Station A --x(n)--> Channel h(n) --y(n)--> Station B
  Sent:     00110001010...
  Received: 00100001010...
Typical Electrical Engineering Problems (2/3)
• Desired target signal is buried in noise:
  x(t) = A(t) cos(ωt + φ(t)) + n(t)
  where A(t) cos(ωt + φ(t)) is the signal and n(t) is the noise.
• Determine the presence or absence of the desired signal.
• Filter the signal out of the noise.
• Demodulate the signal.

[Figure: clean waveform A(t)cos(ωt+φ(t)) and noisy waveform A(t)cos(ωt+φ(t))+n(t) versus time (×10⁻³ sec)]
Typical Electrical Engineering Problems (3/3)
• In large computer networks, there are limited resources (e.g., bandwidth, routers, switches, printers, and other devices) that need to be shared by the users.
  • User jobs/packets are queued and assigned service based on predefined criteria.
  • Demand is uncertain, and service time is also uncertain.
  • Delay from the time the service is requested to the time it is completed is uncertain.
• Similar considerations exist for telephone networks, multiuser computer networks, and other communication networks.
Random Variables and Random Processes
• We will study these in detail later.

[Figure: a random variable maps each sample point s in the sample space to a real number, X(s) = x; a random process X(s, t) maps each sample point to a function of time]
Probability Space

• A probability space (S, F(S), P) is a triple made up of three elements:
  1. S: the sample space
  2. F(S): a collection of subsets of S, the event space
  3. P: the probabilities P(E) for each E∈F(S)
     P(·): F(S) → [0, 1]
Sample Space (S, F(S), P)

• Experimental outcomes are unpredictable
• The set of all possible outcomes is known
• The sample space S of an experiment is defined as the set of all possible outcomes of the experiment.
  • The flipping of a coin: S = {H, T}
  • The rolling of a die: S = {1,2,3,4,5,6}
  • The flipping of two coins: S = {(H,H),(H,T),(T,H),(T,T)}
  • The lifetime of a car: S = [0,∞)
• An event is any subset E of the sample space S (E⊂S or E∈F(S))
  • E = {H}, E = {1}, E = {1,3,5}, E = {(H,H),(T,T)}
Basic Set Theory Definitions

• The set containing all possible elements of interest is called the universe, universal set, or space S.
• The set containing no elements is called the empty set or null set, ∅.
• For any two events E and F ⊂ S, we define the events
  • E∪F: the event E∪F occurs if either E or F occurs (the union of E and F)
  • E∩F (or EF): the event EF consists of all outcomes that are both in E and in F
  • E = {1,3,5}, F = {1,2,3}: E∪F = ? E∩F = ?
  • If EF = ∅, then E and F are said to be mutually exclusive.
  • For more than two events, we can define unions and intersections such as ⋃_{n=1}^∞ E_n and ⋂_{n=1}^∞ E_n, where the E_n are events.
• E^c: the complement of E; E^c occurs if and only if E does not occur
  • S = {1,2,3,4,5,6}, E = {1,3,5}, E^c = {2,4,6}, S^c = ∅ (i.e., {})
• Two sets E and F are equal if they contain exactly the same elements.
Event Space (S, F(S), P)

• Intuitively, F(S) is the collection of events whose probabilities we are interested in computing.
• Mathematically, F(S) is a family of subsets of the sample space S closed under certain set operations, as below:
  1. If E∈F(S), then E^c∈F(S)
  2. If E1, E2∈F(S), then E1∪E2∈F(S)
  3. If E1, E2, E3, … ∈F(S), then ⋃_{i=1}^∞ Ei∈F(S)
• A family of sets satisfying these three properties is called a σ-field
Probabilities Defined on Events (S, F(S), P)
• For each event E of a sample space S, we assume that a number P(E) is defined and satisfies the following three conditions:
  1. 0 ≤ P(E) ≤ 1
  2. P(S) = 1
  3. For any sequence of events E1, E2, … that are mutually exclusive, that is, events for which EnEm = ∅ when n ≠ m,
     P(⋃_{n=1}^∞ En) = ∑_{n=1}^∞ P(En)
• We refer to P(E) as the probability of the event E.
  • E.g., the coin flipping: P({H}) = P({T}) = 1/2
  • The die rolling:
    P({1}) = P({2}) = P({3}) = P({4}) = P({5}) = P({6}) = 1/6
    P({1,3,5}) = P({1}) + P({3}) + P({5}) = 1/2
Properties of Probability (1/2)

• P(S) = 1
  • P(E) + P(E^c) = P(S)
  • P(E) = 1 − P(E^c)
• P(∅) = 0
  • If EF = ∅, then P(EF) = 0
• P(E∪F) = P(E) + P(F) − P(EF)
• If E⊂F, then P(E) ≤ P(F)
• P(E∪F∪G)
  = P(E) + P(F) − P(EF) + P(G) − P(EG∪FG)
  = P(E) + P(F) − P(EF) + P(G) − P(EG) − P(FG) + P(EFG)
  = P(E) + P(F) + P(G) − P(EF) − P(EG) − P(FG) + P(EFG)
• P(E1∪E2∪E3∪…∪En)
  = ∑_i P(Ei) − ∑_{i<j} P(EiEj) + ∑_{i<j<k} P(EiEjEk) − … + (−1)^{n+1} P(E1E2…En)
The Principle of Total Probability
Let A1, A2, …, An be a set of mutually exclusive and collectively exhaustive events:
  AkAj = ∅ for k ≠ j
  ⋃_{j=1}^n Aj = S, so ∑_{j=1}^n Pr[Aj] = 1
Now let B be any event in S. Then
  Pr[B] = Pr[BA1] + Pr[BA2] + … + Pr[BAn]
Conditional Probability

• If we let E and F denote two events, then the probability P(E|F) is the conditional probability that E occurs given that F has occurred.
  • Because we know that F has occurred, F becomes our new sample space, and the probability that E occurs is the probability of EF relative to the probability of F. That is,
    P(E|F) = P(EF)/P(F)   (1.5)
• Ex) A family has two children. What is the conditional probability that both children are boys given that at least one of them is a boy? Assume that the sample space S is given by S = {(b,b),(b,g),(g,b),(g,g)}, and all outcomes are equally likely.
  • P(B|A) = P(BA)/P(A) = P({(b,b)})/P({(b,b),(b,g),(g,b)}) = (1/4)/(3/4) = 1/3
Independent Events

• Two events E and F are said to be independent if P(EF) = P(E)P(F)
• By (1.5) this implies that E and F are independent if P(E|F) = P(E)
  • P(E|F) = P(EF)/P(F) = P(E)P(F)/P(F) = P(E)
• Two events E and F that are not independent are said to be dependent
• The definition of independence can be extended to more than two events. The events E1, E2, …, En are said to be independent if for every subset E1, E2, …, Er (r ≤ n) of these events
  P(E1E2…Er) = P(E1)P(E2)…P(Er)
• Ex) Let E = {1,2}, F = {1,3}, G = {1,4}, with P({1}) = P({2}) = P({3}) = P({4}) = 1/4
  • P(EF) = P(E)P(F) = 1/4
  • P(EG) = P(E)P(G) = 1/4
  • P(FG) = P(F)P(G) = 1/4 (pairwise independent)
  • However, 1/4 = P(EFG) ≠ P(E)P(F)P(G) = 1/8. Hence, the events E, F, G are not jointly independent.
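Because the sample space here is tiny, the pairwise-but-not-joint independence can be verified exactly by enumeration (a sketch using exact fractions; variable names follow the slide):

```python
from fractions import Fraction
from itertools import combinations

# Sample space {1, 2, 3, 4}, each outcome with probability 1/4.
S = {1, 2, 3, 4}
P = lambda event: Fraction(len(event), len(S))

E, F, G = {1, 2}, {1, 3}, {1, 4}

# Pairwise independent: P(XY) == P(X)P(Y) for every pair.
for X, Y in combinations((E, F, G), 2):
    assert P(X & Y) == P(X) * P(Y)

# ...but not jointly independent: P(EFG) != P(E)P(F)P(G).
print(P(E & F & G), P(E) * P(F) * P(G))  # 1/4 vs 1/8
```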
Bayes’ Formula

• Suppose that F1, F2, …, Fn are mutually exclusive events such that S = ⋃_{i=1}^n Fi. Then
  • E = ES = E(⋃_{i=1}^n Fi) = ⋃_{i=1}^n EFi
  • P(E) = ∑_{i=1}^n P(EFi) = ∑_{i=1}^n P(E|Fi)P(Fi)   from (1.5)   (1.8)
  • P(Fi|E) = P(EFi)/P(E)   from (1.5)
            = P(E|Fi)P(Fi) / ∑_{i=1}^n P(E|Fi)P(Fi)   from (1.8)   (1.9)
• Equation (1.9) is known as Bayes’ Formula
• Ex) F∪F^c = S
  • P(E) = P(ES) = P(E(F∪F^c)) = P(EF∪EF^c) = P(EF) + P(EF^c)
        = P(E|F)P(F) + P(E|F^c)P(F^c) = P(E|F)P(F) + P(E|F^c)(1 − P(F))
  • P(F|E) = P(E|F)P(F)/P(E) = P(E|F)P(F)/(P(E|F)P(F) + P(E|F^c)P(F^c))
Example of Bayes’ Formula

• Ex. 1.13] In answering a question on a multiple-choice test, a student either knows the answer or guesses. Let p be the probability that she knows the answer and 1−p the probability that she guesses. Assume that a student who guesses at the answer will be correct with probability 1/m, where m is the number of multiple-choice alternatives. What is the conditional probability that a student knew the answer to a question given that she answered it correctly?
  • Let C and K denote respectively the event that the student answers the question correctly and the event that she actually knows the answer. Then
    P(K|C) = P(KC)/P(C)
           = P(C|K)P(K)/(P(C|K)P(K) + P(C|K^c)P(K^c))
           = p/(p + (1/m)(1 − p))
           = mp/(1 + (m − 1)p)
  • Thus, for example, if m = 5 and p = 1/2, then the probability that a student knew the answer to a question she correctly answered is 5/6.
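The closed form mp/(1 + (m − 1)p) is easy to sanity-check numerically (a sketch; the function name is mine):

```python
def prob_knew_given_correct(p: float, m: int) -> float:
    """P(K|C) = P(C|K)P(K) / (P(C|K)P(K) + P(C|K^c)P(K^c))
    with P(C|K) = 1 and P(C|K^c) = 1/m."""
    return p / (p + (1 / m) * (1 - p))

print(prob_knew_given_correct(0.5, 5))  # 5/6 = 0.8333...
```

As expected, the posterior rises with m: the more alternatives there are, the less likely a correct answer was a lucky guess.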
Example 1 of Probability

• Consider an experiment with a coin and a die. We define event A as an outcome with "head" of the coin and an even number of the die. Event B is defined as an outcome with a number greater than 3. Find the following:
  • P(A)
  • P(B)
  • P(AB)
  • P(A|B)
  • P(B|A)
Answer: Example 1

• P(A) = 3/12 = 1/4
• P(B) = 6/12 = 1/2
• AB = {H4, H6}, P(AB) = 2/12 = 1/6
• P(A|B) = P(AB)/P(B) = 1/3
• P(B|A) = P(AB)/P(A) = 2/3
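These answers follow from enumerating the 12 equally likely (coin, die) outcomes, which can be done mechanically (a sketch using exact fractions; variable names follow the example):

```python
from fractions import Fraction

# Sample space: coin face x die face, 12 equally likely outcomes.
S = [(c, d) for c in "HT" for d in range(1, 7)]
P = lambda event: Fraction(len(event), len(S))

A = [(c, d) for (c, d) in S if c == "H" and d % 2 == 0]  # head and even number
B = [(c, d) for (c, d) in S if d > 3]                    # number greater than 3
AB = [o for o in A if o in B]                            # {H4, H6}

print(P(A), P(B), P(AB))           # 1/4 1/2 1/6
print(P(AB) / P(B), P(AB) / P(A))  # P(A|B) = 1/3, P(B|A) = 2/3
```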
Example 2 of Probability

• Let A1 (A0) and B1 (B0) be the event that 1 (0) is sent and the event that 1 (0) is received, respectively.
• Assumptions
  • P(A0) = 0.8, P(A1) = 1 − P(A0) = 0.2
  • The probability of error, i.e., p = P(B1|A0) = P(B0|A1), is 0.1

  Channel transition probabilities:
  P(B0|A0) = 0.9, P(B1|A0) = 0.1
  P(B1|A1) = 0.9, P(B0|A1) = 0.1

• Find
  • The error probability at the receiver
  • The probability that 1 is sent when the receiver decides 1
Answer: Example 2

• Parameters
  • P(A0) = 0.8, P(A1) = 1 − P(A0) = 0.2, P(B1|A0) = P(B0|A1) = 0.1
• The error probability at the receiver
  • P(error) = P(A0B1) + P(A1B0) = P(B1|A0)P(A0) + P(B0|A1)P(A1) = 0.1
• The probability that 1 is sent when the receiver decides 1
  • P(Am|B) = P(B|Am)P(Am)/P(B) = P(B|Am)P(Am) / ∑_{n=1}^N P(B|An)P(An)
  • P(A1|B1) = P(B1|A1)P(A1)/P(B1) = P(B1|A1)P(A1)/(P(B1|A1)P(A1) + P(B1|A0)P(A0))
             = (0.9×0.2)/(0.9×0.2 + 0.1×0.8) = 0.18/0.26 ≈ 0.69
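The same two answers fall out directly from total probability and Bayes' formula in code (a sketch; variable names follow the example):

```python
# Priors and crossover probability from the example.
p_a0, p_a1 = 0.8, 0.2
p_err = 0.1                      # P(B1|A0) = P(B0|A1)

# Error probability at the receiver (total probability over what was sent).
p_error = p_err * p_a0 + p_err * p_a1
print(round(p_error, 10))        # 0.1

# Posterior P(A1|B1) by Bayes' formula.
p_b1 = (1 - p_err) * p_a1 + p_err * p_a0   # total probability of receiving a 1
p_a1_given_b1 = (1 - p_err) * p_a1 / p_b1
print(round(p_a1_given_b1, 4))   # 0.6923
```

Note that although the channel flips only 10% of bits, a received 1 is sent as 1 with only ~69% probability, because 0s are four times more common a priori.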
