Probability AEM

The document is an introduction to probability and statistics, covering fundamental concepts such as experiments, sample spaces, events, and their probabilities. It explains the relationships between events, including unions, intersections, and complements, as well as the rules for calculating probabilities. Additionally, it discusses random variables, probability distributions, and key statistical measures like mean and standard deviation.

Introduction to Probability and Statistics, Twelfth Edition
Robert J. Beaver, Barbara M. Beaver, William Mendenhall
Presentation designed and written by: Barbara M. Beaver
Copyright ©2006 Brooks/Cole, a division of Thomson Learning, Inc.

What is Probability?
• In Chapters 2 and 3, we used graphs and numerical measures to describe data sets, which were usually samples.
• We measured "how often" using
  Relative frequency = f/n
• As n gets larger,
  Sample → Population
  Relative frequency → Probability

Basic Concepts
• An experiment is the process by which an observation (or measurement) is obtained.
  - Experiment: Record an age
  - Experiment: Toss a die
  - Experiment: Record an opinion (yes, no)
  - Experiment: Toss two coins

Basic Concepts
• A simple event is the outcome that is observed on a single repetition of the experiment.
  - It is the basic element to which probability is applied.
  - One and only one simple event can occur when the experiment is performed.
• A simple event is denoted by E with a subscript.

Basic Concepts
• Each simple event will be assigned a probability, measuring "how often" it occurs.
• The set of all simple events of an experiment is called the sample space, S.

Example
• The die toss:
  Simple events: E1 (observe a 1), E2 (observe a 2), ..., E6 (observe a 6)
  Sample space: S = {E1, E2, E3, E4, E5, E6}

Basic Concepts
• An event is a collection of one or more simple events.
• The die toss:
  - A: an odd number        A = {E1, E3, E5}
  - B: a number > 2         B = {E3, E4, E5, E6}

Basic Concepts
• Two events are mutually exclusive if, when one event occurs, the other cannot, and vice versa.
• Experiment: Toss a die
  - A: observe an odd number
  - B: observe a number greater than 2
  - C: observe a 6
  - D: observe a 3
  For example, C and D are mutually exclusive, while B and C are not.

The Probability of an Event
• The probability of an event A measures "how often" we think A will occur. We write P(A).
• Suppose that an experiment is performed n times. The relative frequency for an event A is
  (Number of times A occurs) / n = f/n
• If we let n get infinitely large, the relative frequency f/n approaches the probability P(A).

The Probability of an Event
• P(A) must be between 0 and 1.
  - If event A can never occur, P(A) = 0.
  - If event A always occurs when the experiment is performed, P(A) = 1.
• The sum of the probabilities for all simple events in S equals 1.
• The probability of an event A is found by adding the probabilities of all the simple events contained in A.

Finding Probabilities
• Probabilities can be found using
  - Estimates from empirical studies
  - Common sense estimates based on equally likely events.
• Examples:
  - Toss a fair coin. P(Head) = 1/2
  - 10% of the U.S. population has red hair. Select a person at random. P(Red hair) = .10

Example
• Toss a fair coin twice. What is the probability of observing at least one head?
  Listing 1st coin, then 2nd coin, the simple events are HH, HT, TH, TT, each with probability 1/4.
  P(at least 1 head) = P(HH) + P(HT) + P(TH) = 1/4 + 1/4 + 1/4 = 3/4
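The "relative frequency → probability" idea above can be checked by simulation. The Python sketch below is not part of the original slides; the function name, seed, and trial count are arbitrary illustrative choices. It estimates P(at least one head in two fair tosses) by relative frequency and should come out close to the exact value 3/4.

```python
import random

def at_least_one_head(n_trials=100_000, seed=1):
    """Estimate P(at least one head in two fair coin tosses) by relative frequency."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        first = rng.choice("HT")   # outcome of the 1st coin
        second = rng.choice("HT")  # outcome of the 2nd coin
        if first == "H" or second == "H":
            hits += 1
    return hits / n_trials  # f/n, which approaches P(A) = 3/4 as n grows

print(at_least_one_head())  # prints a value near 0.75
```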
Example
• A bowl contains three M&Ms®, one red, one blue and one green. A child selects two M&Ms at random. What is the probability that at least one is red?
  Listing 1st M&M, then 2nd M&M, the equally likely simple events are RB, RG, BR, BG, GR, GB, each with probability 1/6.
  P(at least 1 red) = P(RB) + P(BR) + P(RG) + P(GR) = 4/6 = 2/3

Event Relations
• The union of two events, A and B, is the event that either A or B or both occur when the experiment is performed. We write A ∪ B.

Event Relations
• The intersection of two events, A and B, is the event that both A and B occur when the experiment is performed. We write A ∩ B.
• If two events A and B are mutually exclusive, then P(A ∩ B) = 0.

Event Relations
• The complement of an event A consists of all outcomes of the experiment that do not result in event A. We write A^c.

Calculating Probabilities for Unions and Complements
• There are special rules that will allow you to calculate probabilities for composite events.
• The Additive Rule for Unions: For any two events, A and B, the probability of their union, P(A ∪ B), is
  P(A ∪ B) = P(A) + P(B) - P(A ∩ B)

Example: Additive Rule
• Suppose that there were 120 students in the classroom, and that they could be classified as follows:

             Brown hair   Not brown
  Male           20           40
  Female         30           30

  A: brown hair    P(A) = 50/120
  B: female        P(B) = 60/120
  P(A ∪ B) = P(A) + P(B) - P(A ∩ B) = 50/120 + 60/120 - 30/120 = 80/120 = 2/3
  Check: P(A ∪ B) = (20 + 30 + 30)/120 = 80/120 = 2/3

A Special Case
• When two events A and B are mutually exclusive, P(A ∩ B) = 0 and P(A ∪ B) = P(A) + P(B).
  A: male with brown hair      P(A) = 20/120
  B: female with brown hair    P(B) = 30/120
  P(A ∪ B) = P(A) + P(B) = 20/120 + 30/120 = 50/120

Calculating Probabilities for Complements
• We know that for any event A: P(A ∩ A^c) = 0.
• Since either A or A^c must occur, P(A ∪ A^c) = 1,
  so that P(A ∪ A^c) = P(A) + P(A^c) = 1 and
  P(A^c) = 1 - P(A)

Example
• Select a student at random from the classroom. Define:
  A: male      P(A) = 60/120
  B: female
  A and B are complementary, so that
  P(B) = 1 - P(A) = 1 - 60/120 = 60/120 = 1/2

Calculating Probabilities for Intersections
• In the previous example, we found P(A ∩ B) directly from the table. Sometimes this is impractical or impossible. The rule for calculating P(A ∩ B) depends on the idea of independent and dependent events.
• Two events, A and B, are said to be independent if and only if the probability that event A occurs does not change, depending on whether or not event B has occurred.

Conditional Probabilities
• The probability that A occurs, given that event B has occurred, is called the conditional probability of A given B and is defined as
  P(A|B) = P(A ∩ B) / P(B)   if P(B) ≠ 0

Example 1
• Toss a fair coin twice. Define
  - A: head on second toss
  - B: head on first toss
  P(A|B) = 1/2 and P(A|not B) = 1/2.
  P(A) does not change, whether B happens or not: A and B are independent!

Example 2
• A bowl contains five M&Ms®, two red and three blue. Randomly select two candies, and define
  - A: second candy is red.
  - B: first candy is blue.
  P(A|B) = P(2nd red | 1st blue) = 2/4 = 1/2
  P(A|not B) = P(2nd red | 1st red) = 1/4
  P(A) does change, depending on whether B happens or not: A and B are dependent!

Defining Independence
• We can redefine independence in terms of conditional probabilities:
  Two events A and B are independent if and only if
  P(A|B) = P(A)   or   P(B|A) = P(B)
  Otherwise, they are dependent.
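The classroom table above also gives a small numerical check of the Additive Rule and of the conditional-probability definition of independence. The sketch below is not from the slides; the counts are the ones given in the Additive Rule example, and the helper name prob and the predicate style are illustrative choices only.

```python
# Classroom counts from the Additive Rule example (120 students in total)
counts = {
    ("male", "brown"): 20, ("male", "not_brown"): 40,
    ("female", "brown"): 30, ("female", "not_brown"): 30,
}
n = sum(counts.values())  # 120

def prob(event):
    """P(event), where event is a predicate on (sex, hair) pairs."""
    return sum(c for key, c in counts.items() if event(key)) / n

A = lambda k: k[1] == "brown"    # A: brown hair
B = lambda k: k[0] == "female"   # B: female

p_union = prob(A) + prob(B) - prob(lambda k: A(k) and B(k))   # Additive Rule
p_A_given_B = prob(lambda k: A(k) and B(k)) / prob(B)         # conditional probability

print(round(p_union, 4))       # 0.6667, i.e. 2/3
print(round(p_A_given_B, 4))   # 0.5, while P(A) = 50/120 ≈ 0.4167, so A and B are dependent
```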
• Once you've decided whether or not two events are independent, you can use the following rule to calculate their intersection.

The Multiplicative Rule for Intersections
• For any two events, A and B, the probability that both A and B occur is
  P(A ∩ B) = P(A) P(B given that A occurred) = P(A) P(B|A)
• If the events A and B are independent, then the probability that both A and B occur is
  P(A ∩ B) = P(A) P(B)

Example 1
• In a certain population, 10% of the people can be classified as being high risk for a heart attack. Three people are randomly selected from this population. What is the probability that exactly one of the three is high risk?
  Define H: high risk, N: not high risk.
  P(exactly one high risk) = P(HNN) + P(NHN) + P(NNH)
    = P(H)P(N)P(N) + P(N)P(H)P(N) + P(N)P(N)P(H)
    = (.1)(.9)(.9) + (.9)(.1)(.9) + (.9)(.9)(.1) = 3(.1)(.9)² = .243

Example 2
• Suppose we have additional information in the previous example. We know that only 49% of the population are female. Also, of the female patients, 8% are high risk. A single person is selected at random. What is the probability that it is a high-risk female?
  Define H: high risk, F: female.
  From the example, P(F) = .49 and P(H|F) = .08.
  Use the Multiplicative Rule:
  P(high-risk female) = P(H ∩ F) = P(F) P(H|F) = .49(.08) = .0392

Random Variables
• A quantitative variable x is a random variable if the value that it assumes, corresponding to the outcome of an experiment, is a chance or random event.
• Random variables can be discrete or continuous.
• Examples:
  - x = SAT score for a randomly selected student
  - x = number of people in a room at a randomly selected time of day
  - x = number on the upper face of a randomly tossed die

Probability Distributions for Discrete Random Variables
• The probability distribution for a discrete random variable x resembles the relative frequency distributions we constructed in Chapter 1. It is a graph, table or formula that gives the possible values of x and the probability p(x) associated with each value.
• We must have
  0 ≤ p(x) ≤ 1   and   Σ p(x) = 1

Example
• Toss a fair coin three times and define x = number of heads.

  Simple event   HHH  HHT  HTH  THH  HTT  THT  TTH  TTT
  Probability    1/8  1/8  1/8  1/8  1/8  1/8  1/8  1/8
  x               3    2    2    2    1    1    1    0

  P(x = 0) = 1/8
  P(x = 1) = 3/8
  P(x = 2) = 3/8
  P(x = 3) = 1/8

Probability Distributions
• Probability distributions can be used to describe the population, just as we described samples in Chapter 1.
  - Shape: symmetric, skewed, mound-shaped...
  - Outliers: unusual or unlikely measurements
  - Center and spread: mean and standard deviation. A population mean is called μ and a population standard deviation is called σ.

The Mean and Standard Deviation
• Let x be a discrete random variable with probability distribution p(x). Then the mean, variance and standard deviation of x are given as
  Mean: μ = Σ x p(x)
  Variance: σ² = Σ (x - μ)² p(x)
  Standard deviation: σ = √σ²

Example
• Toss a fair coin 3 times and record x, the number of heads.
  μ = Σ x p(x) = 0(1/8) + 1(3/8) + 2(3/8) + 3(1/8) = 1.5
  σ² = Σ (x - μ)² p(x) = (0 - 1.5)²(1/8) + (1 - 1.5)²(3/8) + (2 - 1.5)²(3/8) + (3 - 1.5)²(1/8) = .75
  σ = √.75 ≈ .87
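The mean and standard deviation just computed by hand can be reproduced with a few lines of code. This sketch is not part of the slides; it simply applies the formulas μ = Σ x p(x) and σ² = Σ (x - μ)² p(x) to the coin-toss distribution given above.

```python
from math import sqrt

# p(x) for x = number of heads in three fair coin tosses (from the example above)
dist = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}

mu = sum(x * p for x, p in dist.items())               # mean: sum of x * p(x)
var = sum((x - mu) ** 2 * p for x, p in dist.items())  # variance: sum of (x - mu)^2 * p(x)
sigma = sqrt(var)                                      # standard deviation

print(mu, var, round(sigma, 3))  # 1.5 0.75 0.866
```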
Key Concepts
I. Experiments and the Sample Space
1. Experiments, events, mutually exclusive events, simple events
2. The sample space
3. Venn diagrams, tree diagrams, probability tables

Key Concepts
II. Probabilities
1. Relative frequency definition of probability
2. Properties of probabilities
   a. Each probability lies between 0 and 1.
   b. Sum of all simple-event probabilities equals 1.
3. P(A), the sum of the probabilities for all simple events in A

Example
• The probability distribution for x, the number of heads in tossing 3 fair coins.
  - Shape? Symmetric, mound-shaped.
  - Outliers? None.
  - Center? μ = 1.5

Key Concepts
III. Counting Rules
1. mn Rule; extended mn Rule
2. Permutations: P(n, r) = n! / (n - r)!
3. Combinations: C(n, r) = n! / (r! (n - r)!)

Key Concepts
IV. Event Relations
1. Unions and intersections
2. Events
   a. Disjoint or mutually exclusive: P(A ∩ B) = 0
   b. Complementary: P(A) = 1 - P(A^c)

Key Concepts
3. Conditional probability: P(A|B) = P(A ∩ B) / P(B)
4. Independent and dependent events
5. Additive Rule of Probability: P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
6. Multiplicative Rule of Probability: P(A ∩ B) = P(A) P(B|A)
7. Law of Total Probability
8. Bayes' Rule

Key Concepts
V. Discrete Random Variables and Probability Distributions
1. Random variables, discrete and continuous
2. Properties of probability distributions: 0 ≤ p(x) ≤ 1 and Σ p(x) = 1
3. Mean or expected value of a discrete random variable: μ = Σ x p(x)
4. Variance and standard deviation of a discrete random variable: σ² = Σ (x - μ)² p(x), σ = √σ²
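The Key Concepts list ends with the Law of Total Probability and Bayes' Rule, which the excerpted slides do not work through. As a hedged sketch only, the numbers from the heart-attack examples above (P(H) = .10, P(F) = .49, P(H|F) = .08) can be combined, under the assumption that the 10% high-risk rate applies to the whole population, to illustrate both rules:

```python
# Assumed inputs, taken from the Multiplicative Rule examples above
p_H = 0.10          # P(high risk) for the whole population (Example 1)
p_F = 0.49          # P(female) (Example 2)
p_H_given_F = 0.08  # P(high risk | female) (Example 2)

# Bayes' Rule: P(F | H) = P(H | F) * P(F) / P(H)
p_F_given_H = p_H_given_F * p_F / p_H
print(round(p_F_given_H, 3))  # 0.392

# Law of Total Probability: P(H) = P(H|F)P(F) + P(H|M)P(M),
# so the implied high-risk rate among males (M = not female) is
p_M = 1 - p_F
p_H_given_M = (p_H - p_H_given_F * p_F) / p_M
print(round(p_H_given_M, 4))  # about 0.1192
```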
