
Probability and Statistics

CEC217

Lecture 10

Dr. Tarık Adnan


Email: [email protected]
Office: Kat.1, #104
Solutions will be discussed during the class.
Expanding the definition of Expected Value

Cont.
• Let us consider a new random variable 𝑔(𝑋) that depends on 𝑋; that is, each value of 𝑔(𝑋) is determined by the value of 𝑋. For instance, 𝑔(𝑋) might be 𝑋² or 3𝑋 − 1, and whenever 𝑋 assumes the value 2, 𝑔(𝑋) assumes the value 𝑔(2).
• In particular, if 𝑋 is a discrete random variable with probability distribution 𝑓(𝑥), for 𝑥 = −1, 0, 1, 2, and 𝑔(𝑋) = 𝑋², then:

𝑔(−1) = (−1)² = 1,  𝑔(0) = 0² = 0,  𝑔(1) = 1² = 1,  𝑔(2) = 2² = 4.

Cont.
• So the probability distribution of 𝑔(𝑋) may be written by attaching the probability 𝑓(𝑥) to each value 𝑔(𝑥).
• By the definition of the expected value of a random variable, we obtain

𝜇_{𝑔(𝑋)} = 𝐸[𝑔(𝑋)] = ∑ₓ 𝑔(𝑥) 𝑓(𝑥)
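As a quick illustration, here is a minimal Python sketch of this computation, assuming a hypothetical distribution 𝑓(𝑥) on 𝑥 = −1, 0, 1, 2 (the actual probabilities are not shown on the slide):

```python
# Expected value of g(X) = X^2 for a discrete random variable X:
# E[g(X)] = sum over x of g(x) * f(x).
# The probabilities below are hypothetical placeholders; the slide's
# actual distribution f(x) is not reproduced here.
f = {-1: 0.125, 0: 0.25, 1: 0.5, 2: 0.125}   # assumed pmf, sums to 1

def g(x):
    return x ** 2

expected_g = sum(g(x) * p for x, p in f.items())
print(expected_g)   # 1.125 with the assumed pmf
```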

Example
• Suppose that the number of cars 𝑋 that pass through a car
wash between 4:00 P.M. and 5:00 P.M. on any sunny Friday
has the following probability distribution

• Let 𝑔(𝑋) = 2𝑋 − 1 represent the amount of money, in dollars, paid to the worker by the manager. Find the worker’s expected earnings for this particular time period.

Solution
• The worker can expect to receive
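For a concrete check, a small Python sketch follows, assuming a hypothetical probability table for the number of cars (the slide's table is not reproduced here); the computation is 𝐸(2𝑋 − 1) = ∑ₓ (2𝑥 − 1) 𝑓(𝑥):

```python
# Expected earnings E[2X - 1] for the car-wash example.
# The pmf below is an assumed placeholder, not the table from the slide.
f = {4: 1/12, 5: 1/12, 6: 1/4, 7: 1/4, 8: 1/6, 9: 1/6}   # assumed pmf

expected_earnings = sum((2 * x - 1) * p for x, p in f.items())
print(round(expected_earnings, 2))   # about 12.67 dollars for the assumed pmf
```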

Example
• Let 𝑋 be a random variable with density function

• Find the expected value of 𝑔(𝑋) = 4𝑋 + 3.

Solution?
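A numerical sketch in Python, assuming a hypothetical density (the slide's 𝑓(𝑥) is not reproduced here), for example 𝑓(𝑥) = 𝑥²/3 for −1 < 𝑥 < 2 and 0 elsewhere:

```python
from scipy.integrate import quad

# E[4X + 3] = integral of (4x + 3) * f(x) over the support of X.
# The density below is an assumed placeholder, not necessarily the one
# on the slide: f(x) = x^2 / 3 for -1 < x < 2, and 0 elsewhere.
def f(x):
    return x ** 2 / 3 if -1 < x < 2 else 0.0

expected_value, _ = quad(lambda x: (4 * x + 3) * f(x), -1, 2)
print(expected_value)   # approximately 8.0 for the assumed density
```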

Mean (Expected Value) in Joint Distribution

Example
• Let 𝑋 and 𝑌 be the random variables with the joint probability distribution indicated in the table below (the table is taken from Lecture 9, slide 10). Find the expected value of 𝑔(𝑋, 𝑌) = 𝑋𝑌.

Solution?
Solution
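A minimal Python sketch of the computation 𝐸(𝑋𝑌) = ∑ₓ ∑ᵧ 𝑥𝑦 𝑓(𝑥, 𝑦), assuming a hypothetical joint table (the table from Lecture 9, slide 10 is not reproduced here):

```python
# E[XY] for a jointly distributed discrete pair (X, Y):
# E[g(X, Y)] = sum over (x, y) of g(x, y) * f(x, y).
# The joint pmf below is a hypothetical placeholder for the missing table.
joint_pmf = {
    (0, 0): 3/28, (0, 1): 3/14, (0, 2): 1/28,
    (1, 0): 9/28, (1, 1): 3/14,
    (2, 0): 3/28,
}   # assumed values, sum to 1

e_xy = sum(x * y * p for (x, y), p in joint_pmf.items())
print(e_xy)   # 3/14 ≈ 0.214; only the (1, 1) cell contributes for this assumed pmf
```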

Example
• Let the joint density function 𝑓(𝑥, 𝑦) be:

and 𝑔(𝑋, 𝑌) = 𝑌/𝑋.

Find the expected value 𝐸[𝑔(𝑋, 𝑌)].

Solution?

Solution
• Since 𝑋 and 𝑌 are continuous random variables, we have

𝐸[𝑔(𝑋, 𝑌)] = ∬ 𝑔(𝑥, 𝑦) 𝑓(𝑥, 𝑦) 𝑑𝑥 𝑑𝑦
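A numerical sketch with scipy, assuming a hypothetical joint density (the slide's 𝑓(𝑥, 𝑦) is not reproduced here), for example 𝑓(𝑥, 𝑦) = 𝑥(1 + 3𝑦²)/4 on 0 < 𝑥 < 2, 0 < 𝑦 < 1:

```python
from scipy.integrate import dblquad

# E[g(X, Y)] = double integral of g(x, y) * f(x, y) over the support.
# The joint density below is an assumed placeholder:
# f(x, y) = x * (1 + 3y^2) / 4 on 0 < x < 2, 0 < y < 1.
def f(x, y):
    return x * (1 + 3 * y ** 2) / 4

def integrand(y, x):          # dblquad integrates over the first argument innermost
    return (y / x) * f(x, y)  # g(x, y) = y / x

# dblquad(func, a, b, gfun, hfun): x runs from a to b, y from gfun(x) to hfun(x)
result, _ = dblquad(integrand, 0, 2, lambda x: 0, lambda x: 1)
print(result)   # ≈ 0.625 = 5/8 for the assumed density
```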

Variance and Covariance of Random
Variables

• The mathematical expectation (the mean, or expected value) of a random variable 𝑋 is of special importance in statistics because it describes where the probability distribution is centered.
• By itself, however, the mean does not give an adequate description of the shape of the distribution. We also need to characterize the variability in the distribution.

Cont.
Figure 4.1: Distributions with equal means and unequal dispersions

• In Figure 4.1, we have the histograms of two discrete probability distributions that have the same mean, 𝜇 = 2, but differ considerably in variability, or in the dispersion of their observations about the mean.
Cont.
• The most important measure of variability of a random variable 𝑋 is obtained by applying a theorem which we have seen before (Theorem 4.1), with 𝑔(𝑋) = (𝑋 − 𝜇)².

Theorem 4.1: 𝜇_{𝑔(𝑋)} = 𝐸[𝑔(𝑋)] = ∑ₓ 𝑔(𝑥) 𝑓(𝑥) if 𝑋 is discrete, and 𝐸[𝑔(𝑋)] = ∫ 𝑔(𝑥) 𝑓(𝑥) 𝑑𝑥 if 𝑋 is continuous.

The Variance
• The quantity 𝐸[(𝑋 − 𝜇)²] is referred to as the variance of the random variable 𝑋, or the variance of the probability distribution of 𝑋, and is denoted by 𝑉𝑎𝑟(𝑋), by the symbol 𝜎²_𝑋, or simply by 𝜎² when it is clear to which random variable we refer.
Definition 4.3: 𝜎² = 𝐸[(𝑋 − 𝜇)²] = ∑ₓ (𝑥 − 𝜇)² 𝑓(𝑥) if 𝑋 is discrete, and 𝜎² = ∫ (𝑥 − 𝜇)² 𝑓(𝑥) 𝑑𝑥 if 𝑋 is continuous.

• The quantity 𝑥 − 𝜇 in Definition 4.3 is called the deviation of an observation from its mean.
Example
• Let the random variable 𝑋 represent the number of automobiles that are used for official business purposes on any given workday. The probability distribution 𝑓(𝑥) for company A [Figure 4.1(a)] is:

  𝑥      1    2    3
  𝑓(𝑥)   0.3  0.4  0.3

and that for company B [Figure 4.1(b)] is:

  𝑥      0    1    2    3    4
  𝑓(𝑥)   0.2  0.1  0.3  0.3  0.1

• Show that the variance of the probability distribution for company B is greater than that for company A.
Solution
• For company A, we find the mean (expected value):

𝜇_𝐴 = 𝐸(𝑋) = (1)(0.3) + (2)(0.4) + (3)(0.3) = 2.0

and then

𝜎²_𝐴 = ∑_{𝑥=1}^{3} (𝑥 − 2)² 𝑓(𝑥) = (1 − 2)²(0.3) + (2 − 2)²(0.4) + (3 − 2)²(0.3) = 0.6

• For company B, we have

𝜇_𝐵 = 𝐸(𝑋) = (0)(0.2) + (1)(0.1) + (2)(0.3) + (3)(0.3) + (4)(0.1) = 2.0

and then:
Cont.
𝜎²_𝐵 = ∑_{𝑥=0}^{4} (𝑥 − 2)² 𝑓(𝑥)
     = (0 − 2)²(0.2) + (1 − 2)²(0.1) + (2 − 2)²(0.3) + (3 − 2)²(0.3) + (4 − 2)²(0.1) = 1.6

• Clearly, the variance of the number of automobiles that are used for official business purposes is greater for company B than for company A.
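A short Python check of these two calculations, using exactly the probabilities above:

```python
# Mean and variance for the two company distributions used above.
f_A = {1: 0.3, 2: 0.4, 3: 0.3}
f_B = {0: 0.2, 1: 0.1, 2: 0.3, 3: 0.3, 4: 0.1}

def mean(pmf):
    return sum(x * p for x, p in pmf.items())

def variance(pmf):
    mu = mean(pmf)
    return sum((x - mu) ** 2 * p for x, p in pmf.items())

print(mean(f_A), variance(f_A))   # approximately 2.0 and 0.6
print(mean(f_B), variance(f_B))   # approximately 2.0 and 1.6 (floating-point rounding aside)
```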

• An alternative and preferred formula for finding 𝜎², which often simplifies the calculations, is stated in the following theorem:

𝜎² = 𝐸(𝑋²) − 𝜇²

Example
• Let the random variable 𝑋 represent the number of defective
parts for a machine when 3 parts are sampled from a production
line and tested. The following is the probability distribution of 𝑋.

Find the variance 𝜎².


Solution

• We first need to obtain 𝜇 and 𝐸(𝑋²):
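A minimal Python sketch of the shortcut formula 𝜎² = 𝐸(𝑋²) − 𝜇², assuming a hypothetical distribution for the number of defective parts (the slide's table is not reproduced here):

```python
# Variance via the shortcut formula: Var(X) = E[X^2] - mu^2.
# The pmf below is an assumed placeholder for the defective-parts example.
f = {0: 0.51, 1: 0.38, 2: 0.10, 3: 0.01}   # assumed pmf, sums to 1

mu = sum(x * p for x, p in f.items())           # E[X]
e_x2 = sum(x ** 2 * p for x, p in f.items())    # E[X^2]
variance = e_x2 - mu ** 2

print(mu, e_x2, variance)   # about 0.61, 0.87, 0.4979 with the assumed pmf
```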

Notice that:
• The variance or standard deviation has meaning only when
we compare two or more distributions that have the same
units of measurement.
• Therefore, we could compare the variances of the
distributions of contents, measured in liters, of bottles of
orange juice from two companies, and the larger value
would indicate the company whose product was more
variable or less uniform.
• It would not be meaningful to compare the variance of a distribution of heights to the variance of a distribution of ability scores (because the two are not measured in the same units).
Expanding the definition of Variance
• Let us now extend the concept of the variance of a random
variable 𝑋 to include random variables related to 𝑋.
• For the random variable 𝑔(𝑋), the variance is denoted by 𝜎²_{𝑔(𝑋)} and is calculated by means of the following theorem:

𝜎²_{𝑔(𝑋)} = 𝐸{[𝑔(𝑋) − 𝜇_{𝑔(𝑋)}]²} = ∑ₓ [𝑔(𝑥) − 𝜇_{𝑔(𝑋)}]² 𝑓(𝑥) if 𝑋 is discrete

Example
• Calculate the variance of 𝑔(𝑋) = 2𝑋 + 3, where 𝑋 is a random
variable with probability distribution:

Solution
• The mean of the random variable 𝑔(𝑋) = 2𝑋 + 3 is

𝜇_{2𝑋+3} = 𝐸(2𝑋 + 3) = ∑_{𝑥=0}^{3} (2𝑥 + 3) 𝑓(𝑥) = 6

Now:

𝜎²_{𝑔(𝑋)} = 𝜎²_{2𝑋+3} = 𝐸{[(2𝑋 + 3) − 𝜇_{2𝑋+3}]²} = 𝐸{(2𝑋 + 3 − 6)²}
          = 𝐸[(2𝑋 − 3)²] = 𝐸(4𝑋² − 12𝑋 + 9) = ∑_{𝑥=0}^{3} (4𝑥² − 12𝑥 + 9) 𝑓(𝑥) = 4

• In the case of a continuous random variable (density function), the sum ∑ is replaced by the integral ∫.
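A small Python sketch of this calculation; the pmf below is an assumed placeholder (the slide's table is not reproduced here), chosen so that it reproduces the values above:

```python
# Variance of g(X) = 2X + 3 for a discrete X.
# The pmf is hypothetical; with these values the result matches the slide:
# mean 6 and variance 4.
f = {0: 0.25, 1: 0.125, 2: 0.5, 3: 0.125}   # assumed pmf, sums to 1

def g(x):
    return 2 * x + 3

mu_g = sum(g(x) * p for x, p in f.items())                 # E[g(X)]
var_g = sum((g(x) - mu_g) ** 2 * p for x, p in f.items())  # E[(g(X) - mu_g)^2]

print(mu_g, var_g)   # 6.0 4.0 with the assumed pmf
```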
Covariance
• The covariance describes the linear relationship between two random variables. Therefore, if the covariance between 𝑋 and 𝑌 is zero, 𝑋 and 𝑌 may still have a nonlinear relationship, which means that they are not necessarily independent.
• The covariance between two random variables is a measure of the nature of the association between the two variables.

𝜎_{𝑋𝑌} = 𝐸[(𝑋 − 𝜇_𝑋)(𝑌 − 𝜇_𝑌)] = 𝐸(𝑋𝑌) − 𝜇_𝑋 𝜇_𝑌, where 𝜇_𝑋 = ∑ₓ 𝑥 𝑔(𝑥) and 𝑔(𝑥) is the marginal distribution of 𝑋.

Example
• Consider a previous example in which we randomly select two ballpoint pens from a certain box:
• 𝑋 was the number of blue pens
• 𝑌 represented the number of red pens.
The following is the joint probability distribution.

Find the covariance of 𝑋 and 𝑌.


Solution: We need 𝐸(𝑋𝑌), 𝜇_𝑋, and 𝜇_𝑌. First, we calculate 𝐸(𝑋𝑌)

• Now we find 𝜇_𝑋 and 𝜇_𝑌:

Therefore,
• In your textbook, after this example, you can see how the covariance is calculated for continuous random variables.
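A Python sketch of the full covariance calculation, reusing the hypothetical joint table from the earlier 𝐸(𝑋𝑌) sketch (the slide's table for the pen example is not reproduced here):

```python
# Covariance of X and Y: cov(X, Y) = E[XY] - mu_X * mu_Y.
# The joint pmf is an assumed placeholder for the missing table.
joint_pmf = {
    (0, 0): 3/28, (0, 1): 3/14, (0, 2): 1/28,
    (1, 0): 9/28, (1, 1): 3/14,
    (2, 0): 3/28,
}   # assumed values, sum to 1

e_xy = sum(x * y * p for (x, y), p in joint_pmf.items())
mu_x = sum(x * p for (x, y), p in joint_pmf.items())   # marginal mean of X
mu_y = sum(y * p for (x, y), p in joint_pmf.items())   # marginal mean of Y

cov_xy = e_xy - mu_x * mu_y
print(cov_xy)   # -9/56 ≈ -0.161 for the assumed pmf
```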
Thank you
Feel free to ask questions

