Mathematics for Computer Science - III: Functions and Tools

The document summarizes key concepts from a lecture on random variables and probability distributions. It discusses tools for bounding the deviation of a random variable from its expected value, including: 1. Markov's Inequality, which provides an upper bound on the probability that a nonnegative random variable exceeds a given value, in terms of the expected value. 2. Important properties of Markov's Inequality, such as only applying to nonnegative random variables, requiring the threshold value to exceed the expected value, and providing a loose bound. 3. Markov's Inequality serving as a building block for deriving other powerful inequalities like Chebyshev's Inequality.


Mathematics for Computer Science - III

CS203B

Lecture 11:

• Function of random variable


• Tools for bounding deviation of a random variable

Homework

From the last class
Computing a random permutation of [𝟎, 𝒏 − 𝟏]

in an efficient manner.
𝑨 : a bit array indexed 0 … 𝑛 − 1, initialized to all 𝟎’s.

𝑆 ← empty string;
For 𝑖 = 1 to 𝑛 do
{
    Repeat
        𝑋 ← a uniformly random number in the range [0, 𝑛 − 1];
    until 𝑨[𝑋] = 𝟎;
    𝑆 ← 𝑆 ∷ 𝑋;
    𝑨[𝑋] ← 𝟏;
}
return 𝑆;

Question: What is the expected no. of calls to the “random number generator” this algorithm makes?
Homework: Prove that the algorithm indeed computes a uniformly random permutation.
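The pseudocode above can be sketched in Python (the language is my choice, not the course's; names mirror the slide):

```python
import random

def random_permutation(n):
    """Sketch of the slide's algorithm: keep drawing uniform numbers in
    [0, n-1], rejecting any index already used, and append each fresh
    index to the output sequence."""
    A = [0] * n          # A[X] = 1 once X has been placed in the permutation
    S = []               # the permutation built so far (slide uses a string S)
    for _ in range(n):
        while True:
            X = random.randrange(n)   # uniform in [0, n-1]
            if A[X] == 0:             # repeat ... until A[X] = 0
                break
        S.append(X)
        A[X] = 1
    return S

perm = random_permutation(10)
print(perm)   # some ordering of 0..9, each index exactly once
```

Note that the rejection loop is exactly a coupon-collector process, which is the key to the expected-calls question.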
It was a huge disappointment to find that only 2 or 3 students attempted these homework problems.
Homework 1

To be solved in the next class

Expectation of a random number of random variables
Elevator problem

The number of people that enter an elevator at the ground floor
is a Poisson random variable with mean 10.
There are 𝑁 floors above the ground floor.
Each person is equally likely to get off at any one of the 𝑁 floors, independently.
Compute the expected number of stops that the elevator will make before
discharging all of its passengers.
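Not from the slides: a Monte Carlo sketch for sanity-checking an answer numerically (it only estimates; the analytic derivation is the exercise). 𝑁 = 12 is an arbitrary illustrative choice, and since Python's stdlib has no Poisson sampler, Knuth's product-of-uniforms method is used:

```python
import math
import random

def poisson_sample(lam):
    # Knuth's method: multiply uniforms until the running product drops
    # below e^(-lam); the number of factors, minus one, is Poisson(lam).
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

def expected_stops_estimate(N, lam=10, trials=20_000):
    total = 0
    for _ in range(trials):
        m = poisson_sample(lam)                           # passengers boarding
        floors = {random.randrange(N) for _ in range(m)}  # each picks a floor uniformly
        total += len(floors)                              # distinct floors = stops
    return total / trials

est = expected_stops_estimate(N=12)
print(est)
```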
Homework 2

To be solved in the next class

Expected value of the product of independent random variables
𝑿 : a random variable defined over a probability space (𝛀,P).
𝒀 : a random variable defined over the same probability space.
𝑿 and 𝒀 are independent.

For all 𝑏 ∈ 𝒀, 𝑎 ∈ 𝑿 :  𝐏[𝑿 = 𝑎 | 𝒀 = 𝑏] = 𝐏[𝑿 = 𝑎]

Theorem:
    𝐄[𝑿 ⋅ 𝒀] = 𝐄[𝑿] ⋅ 𝐄[𝒀]
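Not in the slides: an exact check of the theorem on a small made-up pair of independent random variables. Independence means the joint probability factors: 𝐏(𝑿 = 𝑎 and 𝒀 = 𝑏) = 𝐏(𝑿 = 𝑎) ⋅ 𝐏(𝒀 = 𝑏).

```python
from itertools import product

# Two arbitrary illustrative distributions (value -> probability).
X = {0: 0.3, 1: 0.5, 2: 0.2}
Y = {1: 0.6, 4: 0.4}

E_X = sum(a * p for a, p in X.items())
E_Y = sum(b * q for b, q in Y.items())
# Independence: weight each pair (a, b) by p * q.
E_XY = sum(a * b * p * q for (a, p), (b, q) in product(X.items(), Y.items()))

print(E_XY, E_X * E_Y)   # the two agree
```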
Functions of random variables

What do these expressions mean ?

𝑿 : a random variable defined over a probability space (𝛀,P).

Random variable      Value taken for elementary event 𝝎

𝑎𝑿 + 𝑏               𝑎𝑿(𝝎) + 𝑏

𝑿²                   (𝑿(𝝎))²

𝟏/𝑿                  𝟏/𝑿(𝝎)

𝒆^𝑿                  𝒆^𝑿(𝝎)
Under what conditions ?

𝑿 : a random variable defined over a probability space (𝛀,P).


Let 𝒇 be any function defined on a subset of the real numbers,
    𝑦 = 𝒇(𝑥)
and let 𝒀 = 𝒇(𝑿).

𝒀 can be viewed as a random variable, provided:

    Range of 𝒇 ⊆ a subset of the real numbers

    Domain of 𝒇 ⊇ Range of 𝑿

That is, the requirement for 𝒀 = 𝒇(𝑿) to be a random variable is that 𝒇 is defined for each value taken by 𝑿.
Examples

𝑎𝑿 + 𝑏

𝒀 = 𝑎𝑿 + 𝑏

Question: What is 𝐄[𝒀] ?

𝐄[𝒀] = Σ_𝝎 𝒀(𝝎) ⋅ 𝐏(𝝎)

     = Σ_𝝎 (𝑎𝑿(𝝎) + 𝑏) ⋅ 𝐏(𝝎)

     = Σ_𝝎 (𝑎𝑿(𝝎) ⋅ 𝐏(𝝎)) + Σ_𝝎 (𝑏 ⋅ 𝐏(𝝎))

     = 𝑎 Σ_𝝎 𝑿(𝝎) ⋅ 𝐏(𝝎) + 𝑏 Σ_𝝎 𝐏(𝝎)

     = 𝑎 𝐄[𝑿] + 𝑏
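The derivation above can be checked exactly on a small made-up distribution (values, probabilities, and the constants 𝑎, 𝑏 below are illustrative only):

```python
# Exact numeric check of E[aX + b] = a*E[X] + b.
X = {-1: 0.2, 0: 0.3, 3: 0.5}   # value -> probability, sums to 1
a, b = 4, 7

E_X = sum(x * p for x, p in X.items())
E_Y = sum((a * x + b) * p for x, p in X.items())   # E[aX + b] from the definition

print(E_Y, a * E_X + b)   # the two agree
```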
𝑿²

𝒀 = 𝑿²

Suppose 𝑿 : Bernoulli random variable with parameter 𝒑.

Question: What is 𝐄[𝒀] ?

𝐄[𝒀] = Σ_𝝎 𝒀(𝝎) ⋅ 𝐏(𝝎)

     = Σ_𝝎 𝑿(𝝎)² ⋅ 𝐏(𝝎)

     = Σ_{𝒊∈𝑿} 𝒊² ⋅ 𝐏(𝑿 = 𝒊)

     = 𝟏² ⋅ 𝐏(𝑿 = 𝟏) + 𝟎² ⋅ 𝐏(𝑿 = 𝟎)

     = 𝑝
𝑿²

𝒀 = 𝑿²

Suppose 𝑿 : Binomial random variable with parameters 𝒏 and 𝒑.

Question: What is 𝐄[𝒀] ?

𝐄[𝒀] = Σ_𝝎 𝑿(𝝎)² ⋅ 𝐏(𝝎)

     = Σ_{𝒊∈𝑿} 𝒊² ⋅ 𝐏(𝑿 = 𝒊)

     = Σ_𝒊 𝒊² ⋅ (𝑛 choose 𝑖) 𝑝^𝑖 (1 − 𝑝)^(𝑛−𝑖)

Instead of evaluating this sum directly, write
    𝑿 = Σ_{𝟏≤𝒊≤𝒏} 𝑿𝒊
where the 𝑿𝒊 ’s are 𝒏 independent Bernoulli random variables.
𝑿²

𝒀 = 𝑿²

Suppose 𝑿 : Binomial random variable with parameters 𝒏 and 𝒑.

Question: What is 𝐄[𝒀] ?

𝐄[𝒀] = Σ_𝝎 𝑿(𝝎)² ⋅ 𝐏(𝝎)

     = Σ_𝝎 (𝑿𝟏(𝝎) + 𝑿𝟐(𝝎) + ⋯ + 𝑿𝒏(𝝎))² ⋅ 𝐏(𝝎)

     = Σ_{1≤𝑖≤𝑛} Σ_𝝎 𝑿𝑖(𝝎)² ⋅ 𝐏(𝝎)  +  Σ_{𝑖≠𝑗} Σ_𝝎 𝑿𝑖(𝝎) ⋅ 𝑿𝑗(𝝎) ⋅ 𝐏(𝝎)

     = 𝑛𝑝 + 𝑛(𝑛 − 1)𝑝²

Each term of the first sum is 𝐄[𝑿𝑖²] = 𝑝, the expected value of a Bernoulli r.v. with parameter 𝑝 (since 𝑿𝑖² = 𝑿𝑖).
Each term of the second sum is 𝐄[𝑿𝑖 ⋅ 𝑿𝑗] = 𝑝², the expected value of the product of 2 independent Bernoulli random variables.
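Not in the slides: the answer 𝐄[𝑿²] = 𝑛𝑝 + 𝑛(𝑛 − 1)𝑝² can be checked against the defining sum directly (𝑛 and 𝑝 below are chosen only for illustration):

```python
from math import comb

# Evaluate E[X^2] = sum_i i^2 * C(n, i) p^i (1-p)^(n-i) for X ~ Binomial(n, p)
# and compare with the closed form derived above.
n, p = 10, 0.3
E_X2 = sum(i**2 * comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1))

print(E_X2, n * p + n * (n - 1) * p**2)   # the two agree
```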
Homework
Suppose 𝑿 : Geometric random variable with parameter 𝒑.

What is 𝐄[𝑿²] ?

What is 𝐄[𝟏/𝑿] ?
How to bound

P[𝑿 ≥ (𝟏 + 𝝐)𝐄[𝑿]] ?
P[𝑿 ≤ (𝟏 − 𝝐)𝐄[𝑿]] ?
Tools

• Markov’s Inequality

• Chebyshev’s Inequality

• Chernoff bound
Markov’s Inequality
Theorem: Suppose 𝑿 is a random variable defined over a probability space (𝛀, P)
such that 𝑿(𝝎) ≥ 0 for each 𝝎 ∈ 𝛀.
Then for any positive real number 𝒂,
    P(𝑿 ≥ 𝒂) ≤ 𝑬[𝑿]/𝒂

Important point 1:
• Applicable only to a nonnegative random variable.

The proof of the theorem crucially exploits this fact.
Markov’s Inequality

Important point 2:
• Makes sense only for 𝒂 > 𝑬[𝑿].

Otherwise the bound 𝑬[𝑿]/𝒂 is at least 1, and since every probability is at most 1, the inequality carries no information.
Markov’s Inequality

Important point 3:
• It cannot be used to bound P(𝑿 ≤ 𝒂). Try on your own to see why;
the proof of the theorem should also convince you.
Markov’s Inequality

Important point 4:
• It gives a very loose bound, so by itself it is often not useful.

Suppose 𝑿 : no. of heads in tossing a fair coin 𝒏 times. Then 𝑬[𝑿] = 𝒏/𝟐, and Markov’s Inequality gives only

    P(𝑿 ≥ (𝟑/𝟒)𝒏) ≤ (𝒏/𝟐) / ((𝟑/𝟒)𝒏) = 𝟐/𝟑

But it plays a key role in deriving other powerful inequalities,
like Chebyshev’s Inequality and the Chernoff bound.
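Not in the slides: the looseness is easy to see numerically. For 𝒏 = 100 fair-coin tosses (𝒏 chosen only for illustration), the exact tail probability is tiny while Markov still gives 2/3:

```python
from math import comb

# Exact P(X >= 3n/4) for X ~ Binomial(n, 1/2), versus the Markov bound.
n = 100
exact_tail = sum(comb(n, i) for i in range(3 * n // 4, n + 1)) / 2**n
markov_bound = (n / 2) / (3 * n / 4)   # E[X] / a = 2/3

print(markov_bound, exact_tail)   # the bound dwarfs the true tail
```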
Markov’s Inequality

Proof: 𝐄[𝑿] = Σ_{𝑡∈𝑿} 𝑡 ⋅ 𝐏[𝑿 = 𝑡]

            = Σ_{𝑡∈𝑿, 𝑡<𝒂} 𝑡 ⋅ 𝐏[𝑿 = 𝑡] + Σ_{𝑡∈𝑿, 𝑡≥𝒂} 𝑡 ⋅ 𝐏[𝑿 = 𝑡]

            ≥ Σ_{𝑡∈𝑿, 𝑡≥𝒂} 𝑡 ⋅ 𝐏[𝑿 = 𝑡]          (since 𝑿(𝝎) ≥ 0, the discarded sum is nonnegative)

            ≥ Σ_{𝑡∈𝑿, 𝑡≥𝒂} 𝒂 ⋅ 𝐏[𝑿 = 𝑡]

            = 𝒂 Σ_{𝑡∈𝑿, 𝑡≥𝒂} 𝐏[𝑿 = 𝑡]

            = 𝒂 ⋅ 𝐏(𝑿 ≥ 𝒂)

Hence P(𝑿 ≥ 𝒂) ≤ 𝑬[𝑿]/𝒂.

Try to internalize this proof fully.
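As a sanity check on the inequality just proved (not part of the slides), one can verify it exhaustively on a small made-up nonnegative distribution:

```python
# For every threshold a, the exact tail P(X >= a) never exceeds E[X] / a.
X = {0: 0.1, 1: 0.4, 2: 0.2, 5: 0.2, 10: 0.1}   # value -> probability
E_X = sum(t * p for t, p in X.items())

for a in (1, 2, 3, 5, 8, 10):
    tail = sum(p for t, p in X.items() if t >= a)
    assert tail <= E_X / a + 1e-12     # Markov's Inequality holds
    print(a, tail, E_X / a)
```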
