Random Variables-2
• Definition of probability, terminology, discrete and continuous events and sample spaces,
• Using set theory for understanding probability, probability axioms, discrete probability
law, discrete uniform probability law, properties of probability laws, pseudorandom
numbers, simulated experiments, models and reality
• Conditional probability, its definition and examples, multiplication rule, total probability
theorem, Bayes’ rule, Independence, Counting
• Random variables, PMF, mean and variance of random variables, common random
variables,
• Probabilistic models often involve several random variables of interest. For example, in a
medical diagnosis context, the results of several tests may be significant. Due to the
possibility of false-positive (and false-negative) test results, the results are modeled
as random variables.
• All of these random variables are associated with the same experiment, sample space,
and probability law, and their values may relate in interesting ways. This motivates us to
consider probabilities involving simultaneously the numerical values of several random
variables and to investigate their mutual couplings.
• So, we’ll see joint PMFs of random variables, their expectations, independence, and so on!
PAM-266 Probability and Random Variables Lecture: Random Variables 3
Random Variables
Joint PMF
• In a group of 60 people, the numbers who do or do not smoke and do or do not have
cancer are reported as shown in the Table:
• Let Ω be the sample space consisting of these 60 people. A person is chosen at random
from the group. Let C(ω) = 1 if this person has cancer and 0 if not, and S(ω) = 1 if this
person smokes and 0 if not. Let’s calculate the joint PMF of (C, S).
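The table itself is not reproduced in these notes, but the computation is mechanical: divide each cell count by the group size. A minimal sketch, with hypothetical placeholder counts standing in for the missing table:

```python
# Joint PMF of (C, S) from a contingency table of 60 people.
# The counts below are HYPOTHETICAL placeholders (the original table is
# not reproduced here); substitute the real cell counts from the slide.
counts = {
    # (cancer, smoker): number of people
    (0, 0): 30,
    (0, 1): 18,
    (1, 0): 4,
    (1, 1): 8,
}
total = sum(counts.values())                      # 60 in this example
joint_pmf = {cs: n / total for cs, n in counts.items()}

for (c, s), p in sorted(joint_pmf.items()):
    print(f"P(C={c}, S={s}) = {p:.4f}")
```

Whatever the actual counts, the resulting probabilities must sum to 1, which is a useful sanity check on any hand-built joint PMF.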
Joint PMF
• What is the probability that a randomly chosen person is a smoker but does not have cancer, i.e., P(C = 0, S = 1)?
Marginal PMF
Joint PMF
• Generalizing the discussion, consider two discrete random variables X and Y associated
with the same experiment. The joint PMF of X and Y is defined by p_{X,Y}(x, y) = P(X = x, Y = y).
• The joint PMF determines the probability of any event that can be specified in terms of the
random variables X and Y. For example, if A is the set of all pairs (x, y) that have a
certain property, then P((X, Y) ∈ A) = Σ_{(x,y) ∈ A} p_{X,Y}(x, y).
Joint PMF
• We can calculate the PMFs of X and Y (also known as marginal PMFs) using
p_X(x) = Σ_y p_{X,Y}(x, y) and p_Y(y) = Σ_x p_{X,Y}(x, y).
• The joint PMF of three random variables X, Y, and Z is defined in analogy with the above,
for all possible triplets of numerical values (x, y, z): p_{X,Y,Z}(x, y, z) = P(X = x, Y = y, Z = z).
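The marginalization formulas above can be sketched in a few lines: given a joint PMF stored as a dictionary, summing out one variable yields the marginals. The joint PMF used here is illustrative, not the one from the slides.

```python
from collections import defaultdict

def marginals(joint_pmf):
    """Compute p_X and p_Y from a joint PMF given as {(x, y): prob}."""
    p_x, p_y = defaultdict(float), defaultdict(float)
    for (x, y), p in joint_pmf.items():
        p_x[x] += p          # p_X(x) = sum over y of p_{X,Y}(x, y)
        p_y[y] += p          # p_Y(y) = sum over x of p_{X,Y}(x, y)
    return dict(p_x), dict(p_y)

# Small illustrative joint PMF (values chosen to be exact in binary):
joint = {(0, 0): 0.125, (0, 1): 0.125, (1, 0): 0.25, (1, 1): 0.5}
p_x, p_y = marginals(joint)
print(p_x)   # {0: 0.25, 1: 0.75}
print(p_y)   # {0: 0.375, 1: 0.625}
```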
Example
• Consider two random variables X and Y described by the joint PMF shown:
• Find the PMF of Z.
• Calculate E[Z].
Conditioning
• Conditional probabilities are like ordinary probabilities (they satisfy the three axioms) except
that they refer to a new universe in which the conditioning event is known to have occurred.
• The conditional PMF of a random variable X, conditioned on a particular event A with P(A)
> 0, is defined by p_{X|A}(x) = P(X = x | A) = P({X = x} ∩ A) / P(A).
• Note that the events {X = x} ∩ A are disjoint for different values of x and their union is A;
therefore, Σ_x p_{X|A}(x) = 1, so p_{X|A} is a legitimate PMF.
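A small sketch of this definition: restrict the PMF to the conditioning event A and renormalize by P(A). The die example below is illustrative.

```python
def conditional_pmf(pmf, event):
    """p_{X|A}(x) = P({X = x} ∩ A) / P(A), where A is a set of x-values."""
    p_a = sum(p for x, p in pmf.items() if x in event)
    if p_a == 0:
        raise ValueError("conditioning event has probability 0")
    return {x: p / p_a for x, p in pmf.items() if x in event}

# X uniform on {1,...,6} (a fair die); condition on A = "roll is even".
pmf = {x: 1 / 6 for x in range(1, 7)}
cond = conditional_pmf(pmf, {2, 4, 6})
print(cond)   # each even value gets probability 1/3
```

Note how the result automatically sums to 1, matching the observation above that p_{X|A} is a legitimate PMF.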
Conditioning
Example: A population of college students has heights H and weights W, which are
grouped into ranges as shown in the Table.
Example (Cont’d.): Find the probability that a randomly chosen student has
(a.) weight in the range 130–160 lbs; (b.) weight in the range 130–160 lbs, given that
the student has height less than 6′; (c.) weight in the range 130–160 lbs, given that
he/she has height greater than 6′.
Independence
• The independence of a random variable from an event is similar to the independence of two
events. The idea is that knowing the occurrence of the conditioning event provides no new
information on the value of the random variable.
Formally, X is independent of the event A if p_{X|A}(x) = p_X(x) for all x, which is the same as requiring that the two events {X = x} and A be independent for any choice of x.
Independence
• The notion of independence of two random variables is similar to the independence of a random
variable from an event. We say that two random variables X and Y are independent if
p_{X,Y}(x, y) = p_X(x) p_Y(y) for all x and y.
• Alternatively, X and Y are independent if p_{X|Y}(x | y) = p_X(x) for all x and all y with p_Y(y) > 0.
• Intuitively, independence means that the value of Y provides no information on the value of X.
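A direct way to test the definition numerically: compute both marginals and compare the product p_X(x) p_Y(y) with p_{X,Y}(x, y) at every pair. The two joint PMFs below are illustrative, not from the slides.

```python
from itertools import product

def is_independent(joint, tol=1e-12):
    """Check p_{X,Y}(x, y) == p_X(x) * p_Y(y) for every pair (x, y)."""
    xs = {x for x, _ in joint}
    ys = {y for _, y in joint}
    p_x = {x: sum(joint.get((x, y), 0.0) for y in ys) for x in xs}
    p_y = {y: sum(joint.get((x, y), 0.0) for x in xs) for y in ys}
    return all(abs(joint.get((x, y), 0.0) - p_x[x] * p_y[y]) <= tol
               for x, y in product(xs, ys))

# Two fair, unrelated bits: independent.
independent = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
# Y is always equal to X: knowing Y pins down X, so not independent.
dependent = {(0, 0): 0.5, (1, 1): 0.5}
print(is_independent(independent))  # True
print(is_independent(dependent))    # False
```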
Independence
• There is a similar notion of conditional independence of two random variables, given an event A
with P(A) > 0.
• The conditioning event A defines a new universe, and all probabilities (or PMFs) have to be
replaced by their conditional counterparts. For example, X and Y are said to be conditionally
independent, given a positive probability event A, if p_{X,Y|A}(x, y) = p_{X|A}(x) p_{Y|A}(y) for all x and y.
Independence
• This is equivalent to p_{X|Y,A}(x | y) = p_{X|A}(x) for all x and all y with p_{Y|A}(y) > 0.
• If X and Y are independent random variables, then for any functions g and h, the random variables g(X) and h(Y) are also independent.
• Note: Conditional independence may not imply unconditional independence and vice versa.
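The first direction of this note can be seen with a classic mixture example (my construction, not from the slides): pick one of two coin biases at random, then toss the chosen coin twice. Given which coin was chosen (the conditioning event), the two tosses are independent; unconditionally, they are not, because the first toss carries information about which coin is in play.

```python
# Conditional independence does NOT imply unconditional independence.
# Choose a coin with bias 0.1 or 0.9, each with probability 1/2, then
# toss it twice; X and Y are the two outcomes (1 = heads).
biases = [0.1, 0.9]

def joint(x, y):
    # Average over the coin choice; GIVEN the coin, tosses are independent,
    # so each term factors as P(X = x | coin) * P(Y = y | coin).
    return sum(0.5 * (b if x else 1 - b) * (b if y else 1 - b)
               for b in biases)

p_x1 = joint(1, 0) + joint(1, 1)   # marginal P(X = 1), comes out to 0.5
p_y1 = joint(0, 1) + joint(1, 1)   # marginal P(Y = 1), comes out to 0.5
print(joint(1, 1))                 # approx. 0.41
print(p_x1 * p_y1)                 # approx. 0.25 -> X and Y are dependent
```

Seeing the first head makes the 0.9-bias coin more likely, which is exactly the information leak that destroys unconditional independence.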
Practice
• A stock market trader buys 100 shares of stock A and 200 shares of stock B. Let X and Y be the
price changes of A and B, respectively, over a certain time period. Assume that the joint PMF of X
and Y is uniform over the set of integers x and y satisfying
(a) Find the joint and marginal PMFs and the means of X and Y.
Practice
• Professor May B. Right often has her facts wrong, and answers each of her students' questions
incorrectly with probability 1/4, independent of other questions. In each lecture, May is asked 0, 1,
or 2 questions with equal probability 1/3. Let X and Y be the number of questions May is asked
and the number of questions she answers wrong in a given lecture, respectively.
a) Construct the joint PMF p_{X,Y}(x, y). [Calculate the probability P(X = x, Y = y) for all
combinations of values of x and y.]
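A sketch for checking a hand-built table: since P(X = x) = 1/3 and, given X = x, the number of wrong answers Y is binomial with parameters x and 1/4, the joint PMF can be enumerated exactly with fractions.

```python
from fractions import Fraction as F
from math import comb

# Enumerate p_{X,Y}(x, y): X uniform on {0, 1, 2} questions asked,
# and given X = x, Y ~ Binomial(x, 1/4) questions answered wrong.
p_wrong = F(1, 4)
pmf = {}
for x in range(3):                       # P(X = x) = 1/3 for x = 0, 1, 2
    for y in range(x + 1):               # Y can be at most x
        pmf[(x, y)] = (F(1, 3) * comb(x, y)
                       * p_wrong**y * (1 - p_wrong)**(x - y))

for (x, y), p in sorted(pmf.items()):
    print(f"p(X={x}, Y={y}) = {p}")
print("total:", sum(pmf.values()))       # total: 1
```

Using `Fraction` keeps every entry exact, so the table can be compared term by term with a handwritten solution.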
Practice
• (a). A fair coin is tossed repeatedly and independently until two consecutive heads or two
consecutive tails appear. Find the PMF, the expected value, and the variance of the
number of tosses.
• (b). Assume now that the coin is tossed until we obtain a tail that is immediately preceded
by a head. Find the PMF and the expected value of the number of tosses.
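Closed-form answers here can be sanity-checked by Monte Carlo; the sketch below simulates both stopping rules and estimates the expected number of tosses, which you can compare against your derived answers (they should come out to 3 for part (a) and 4 for part (b)).

```python
import random

def tosses_until(stop):
    """Toss a fair coin until stop(prev, cur) is True; return the count."""
    prev = random.choice("HT")
    n = 1
    while True:
        cur = random.choice("HT")
        n += 1
        if stop(prev, cur):
            return n
        prev = cur

random.seed(0)                     # fixed seed for reproducibility
trials = 200_000
# (a) stop at two consecutive equal tosses (HH or TT).
mean_a = sum(tosses_until(lambda p, c: p == c)
             for _ in range(trials)) / trials
# (b) stop at a tail immediately preceded by a head (pattern HT).
mean_b = sum(tosses_until(lambda p, c: p == "H" and c == "T")
             for _ in range(trials)) / trials
print(round(mean_a, 2))            # close to 3
print(round(mean_b, 2))            # close to 4
```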
Assignment
• Solve all examples and end-of-chapter problems of Chapter 2 of the book (Introduction to Probability)
• Solve and submit handwritten reports of the following questions (you may need to visit Chapter 1
for the solution of certain questions). Before exploring online resources, try yourself!
1. An urn contains five red, three orange, and two blue balls. Two balls are randomly selected.
What is the sample space of this experiment? Let X represent the number of orange balls
selected. What are the possible values of X? Calculate P{X = 0}.
2. Let X represent the difference between the number of heads and the number of tails
obtained when a coin is tossed n times. What are the possible values of X?
Assignment (Cont’d.)
4. Suppose a die is rolled twice. What are the possible values that the following random
variables can take on?
(a) The maximum value to appear in the two rolls. (b) The minimum value to appear in the
two rolls. (c) The sum of the two rolls. (d) The value of the first roll minus the value of the
second roll.
5. If the die in the above exercise is assumed fair, calculate the probabilities associated with
the random variables in parts (a)–(d).
Assignment (Cont’d.)
6. Suppose five fair coins are tossed. Let E be the event that all coins land heads. Define
the random variable I_E to equal 1 if E occurs and 0 otherwise (the indicator of E).
For what outcomes in the original sample space does I_E equal 1? What is P{I_E = 1}?
7. Suppose a coin having probability 0.7 of coming up heads is tossed three times. Let X
denote the number of heads that appear in the three tosses. Determine the probability
mass function of X.
Assignment (Cont’d.)
8. Suppose three fair dice are rolled. What is the probability that at most one six appears?
9. On a multiple-choice exam with three possible answers for each of the five questions,
what is the probability that a student would get four or more correct answers just by
guessing?
10. Suppose X has a binomial distribution with parameters 6 and 1/2. Show that X = 3 is the
most likely outcome.
Next activity:
Assignment