Maths Project

Uploaded by abhinav13byq
Probability

Basic Definitions

Random Experiment
- An experiment whose possible outcomes are all known in advance, but whose outcome on any specific performance cannot be predicted before the experiment is completed.
- E.g. tossing a coin.

Sample Space
- The set of all possible outcomes associated with a random experiment, denoted by S.
- E.g. in the experiment of throwing a die, if we are interested in the number that shows on the top face, the sample space is S = {1, 2, 3, 4, 5, 6}.

Experiment or Trial
- A series of actions whose outcomes are always uncertain.
- E.g. tossing a coin, selecting a card from a deck of cards, throwing a die.

Event
- A subset of the sample space.
- In any sample space we may be interested in the occurrence of certain events rather than in the occurrence of a specific element of the sample space.

Simple Event
- An event that is a set containing exactly one element of the sample space.

Compound Event
- An event that can be represented as a union of sample points.
- E.g. the event of drawing a heart from a deck of cards is the subset A = {heart} of the sample space S = {heart, spade, club, diamond}, so A is a simple event. The event B of drawing a red card is a compound event, since B = {heart} ∪ {diamond} = {heart, diamond}.

Probability
- If a random experiment can result in any one of N different equally likely outcomes, and exactly n of these outcomes are favourable to A, then the probability of event A is
  P(A) = n/N, i.e. number of favourable cases / total number of cases.
- Remarks:
  - If the probability of an event is one, it does not mean the event is going to happen with certainty; it only predicts that the event is most likely to occur in comparison with other events. Predictions depend on past information and, of course, on the way the information at hand is analysed.
  - Similarly, if the probability of an event is zero, it does not mean the event can never occur.
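The classical definition P(A) = n/N can be illustrated with a small enumeration. This is a minimal sketch (the die/even-number event is my own illustrative choice, not from the notes):

```python
from fractions import Fraction

def classical_probability(sample_space, event):
    """P(A) = favourable cases / total cases, assuming equally likely outcomes."""
    favourable = [o for o in sample_space if o in event]
    return Fraction(len(favourable), len(sample_space))

die = {1, 2, 3, 4, 5, 6}   # sample space S
even = {2, 4, 6}           # event A: an even number shows
p = classical_probability(die, even)
print(p)  # 1/2
```

`Fraction` keeps the answer exact, matching the way the notes work with ratios rather than decimals.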
Mutually Exclusive Events
- Two events are mutually exclusive if they cannot occur simultaneously.

Independent Events
- Events are independent if the occurrence or non-occurrence of one does not affect the occurrence or non-occurrence of the other.

Exhaustive Events
- A set of events is exhaustive if every performance of the random experiment results in the occurrence of at least one of them.

Conditional Probability
- The conditional probability of an event B is the probability that B will occur given the knowledge that an event A has already occurred. It is written P(B|A), the notation for "the probability of B given A".
- If events A and B are independent (A has no effect on the probability of B), the conditional probability of B given A is simply the probability of B, that is P(B).
- If events A and B are not independent, the probability of the intersection of A and B (the probability that both events occur) is defined by P(A and B) = P(A) · P(B|A).
- If E and F are two events associated with the same sample space of a random experiment, the conditional probability of the event E given that F has occurred is
  P(E|F) = P(E ∩ F) / P(F), provided P(F) ≠ 0.
- Example: If P(A) = 7/13, P(B) = 9/13 and P(A ∩ B) = 4/13, evaluate P(A|B).
  P(A|B) = P(A ∩ B) / P(B) = (4/13) / (9/13) = 4/9.

Properties of Conditional Probability
Let E and F be events of a sample space S of an experiment. Then we have:

Property 1: P(S|F) = P(F|F) = 1
We know that
  P(S|F) = P(S ∩ F) / P(F) = P(F) / P(F) = 1.
Also
  P(F|F) = P(F ∩ F) / P(F) = P(F) / P(F) = 1.
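The defining formula P(E|F) = P(E ∩ F) / P(F) and the worked example above can be sketched as:

```python
from fractions import Fraction

def conditional(p_intersection, p_condition):
    """P(E|F) = P(E ∩ F) / P(F), provided P(F) != 0."""
    if p_condition == 0:
        raise ValueError("P(F) must be nonzero")
    return p_intersection / p_condition

# Worked example from the notes: P(A) = 7/13, P(B) = 9/13, P(A ∩ B) = 4/13
p_a_given_b = conditional(Fraction(4, 13), Fraction(9, 13))
print(p_a_given_b)  # 4/9
```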
Thus P(S|F) = P(F|F) = 1.

Property 2: If A and B are any two events of a sample space S, and F is an event of S such that P(F) ≠ 0, then
  P((A ∪ B)|F) = P(A|F) + P(B|F) − P((A ∩ B)|F).
In particular, if A and B are disjoint events,
  P((A ∪ B)|F) = P(A|F) + P(B|F).
Proof: We have
  P((A ∪ B)|F) = P[(A ∪ B) ∩ F] / P(F)
               = P[(A ∩ F) ∪ (B ∩ F)] / P(F)    (by the distributive law of union of sets over intersection)
               = [P(A ∩ F) + P(B ∩ F) − P(A ∩ B ∩ F)] / P(F)
               = P(A ∩ F)/P(F) + P(B ∩ F)/P(F) − P((A ∩ B) ∩ F)/P(F)
               = P(A|F) + P(B|F) − P((A ∩ B)|F).
When A and B are disjoint events, P((A ∩ B)|F) = 0, so
  P((A ∪ B)|F) = P(A|F) + P(B|F).

Property 3: P(E′|F) = 1 − P(E|F)
From Property 1, we know that P(S|F) = 1
  ⇒ P((E ∪ E′)|F) = 1        (since S = E ∪ E′)
  ⇒ P(E|F) + P(E′|F) = 1     (since E and E′ are disjoint events)
Thus P(E′|F) = 1 − P(E|F).

Multiplication Theorem on Probability
- Let E and F be two events associated with a sample space S.
- The conditional probability of event E, given that F has occurred, is denoted P(E|F) and is given by
  P(E|F) = P(E ∩ F) / P(F), P(F) ≠ 0.
- From this result, we can write
  P(E ∩ F) = P(F) · P(E|F)   … (1)
- Also, we know that
  P(F|E) = P(F ∩ E) / P(E) = P(E ∩ F) / P(E), P(E) ≠ 0   (since E ∩ F = F ∩ E)
  Thus P(E ∩ F) = P(E) · P(F|E)   … (2)
- Combining (1) and (2), we find that
  P(E ∩ F) = P(E) · P(F|E) = P(F) · P(E|F), provided P(E) ≠ 0 and P(F) ≠ 0.
- The above result is known as the multiplication rule of probability.
- Example: An urn contains 10 black and 5 white balls. Two balls are drawn from the urn one after the other without replacement. What is the probability that both drawn balls are black?
- Solution:
  - Let E and F denote respectively the events that the first and second balls drawn are black.
  - P(E) = P(black ball in first draw) = 10/15.
  - Given that the first ball drawn is black, i.e. event E has occurred, there are now 9 black balls and 5 white balls left in the urn.
  - Therefore, the probability that the second ball drawn is black, given that the ball in the first draw is black, is the conditional probability of F given that E has occurred:
    P(F|E) = 9/14.
  - By the multiplication rule of probability,
    P(E ∩ F) = P(E) · P(F|E) = (10/15) × (9/14) = 3/7.

Note:
- The multiplication rule of probability extends to more than two events. If E, F and G are three events of a sample space,
  P(E ∩ F ∩ G) = P(E) · P(F|E) · P(G|(E ∩ F)).
- Similarly, the multiplication rule of probability can be extended to four or more events.

Independent Events
- If E and F are two events such that the probability of occurrence of one of them is not affected by the occurrence of the other, the events are called independent events.
- Let E and F be two events associated with the same random experiment. Then E and F are said to be independent if
  P(E ∩ F) = P(E) · P(F).

Remarks
- Two events E and F are said to be dependent if they are not independent, i.e. if P(E ∩ F) ≠ P(E) · P(F).
- Independent events are sometimes confused with mutually exclusive events:
  - The term "independent" is defined in terms of the probabilities of events, whereas "mutually exclusive" is defined in terms of the events themselves (subsets of the sample space).
  - Mutually exclusive events never have a common outcome, but independent events may have a common outcome.
  - Two independent events having nonzero probabilities of occurrence cannot be mutually exclusive; conversely, two mutually exclusive events having nonzero probabilities of occurrence cannot be independent.
- Two experiments are said to be independent if, for every pair of events E and F, where E is associated with the first experiment and F with the second, the probability of the simultaneous occurrence of E and F when the two experiments are performed is the product of P(E) and P(F) calculated separately on the basis of the two experiments, i.e. P(E ∩ F) = P(E) ·
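The urn example can be checked directly with the multiplication rule. A minimal sketch using the numbers from the notes:

```python
from fractions import Fraction

# Urn example: 10 black and 5 white balls, two draws without replacement.
p_first_black = Fraction(10, 15)         # P(E): 10 black of 15 balls
p_second_black_given = Fraction(9, 14)   # P(F|E): 9 black of 14 remaining
p_both_black = p_first_black * p_second_black_given  # multiplication rule
print(p_both_black)  # 3/7
```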
P(F).
- Three events A, B and C are said to be mutually independent if
  P(A ∩ B) = P(A) · P(B),
  P(A ∩ C) = P(A) · P(C),
  P(B ∩ C) = P(B) · P(C), and
  P(A ∩ B ∩ C) = P(A) · P(B) · P(C).
  If at least one of the above does not hold for three given events, we say that the events are not independent.
- Example: A die is thrown. If E is the event "the number appearing is a multiple of 3" and F is the event "the number appearing is even", determine whether E and F are independent.
  Solution: We know that the sample space is S = {1, 2, 3, 4, 5, 6}.
  Now E = {3, 6}, F = {2, 4, 6} and E ∩ F = {6}. Then
    P(E) = 2/6 = 1/3
    P(F) = 3/6 = 1/2
    P(E ∩ F) = 1/6
  Clearly P(E ∩ F) = P(E) · P(F). Hence E and F are independent events.

Bayes' Theorem: Description
- Also called the inverse probability theorem.
- Consider two bags, I and II:
  - Bag I contains 2 white and 3 red balls.
  - Bag II contains 4 white and 5 red balls.
  - One ball is drawn at random from one of the bags.
- We can find the probability of selecting either bag (i.e. 1/2), or the probability of drawing a ball of a particular colour (say white) from a particular bag (say Bag I).
- We can also find the probability that the ball drawn is of a particular colour, given the bag from which it was drawn.
- To find the probability that the ball was drawn from a particular bag (say Bag II), given the colour of the ball, we have to find the reverse probability that Bag II was selected, when an event that occurred after the selection is known.
- The mathematician Thomas Bayes solved this problem of finding reverse probability by using conditional probability.
- Hence the result is named Bayes' theorem; it was published posthumously in 1763.

Definitions: Partition of a Sample Space
- A set of events E1, E2, …, En is said to represent a partition of the sample space S if
  - Ei ∩ Ej = ∅ for i ≠ j, i, j = 1, 2, 3, …, n,
  - E1 ∪ E2 ∪ … ∪ En = S, and
  - P(Ei) > 0 for all i = 1, 2, …, n.
- That is, the events E1, E2, …, En represent a partition of the sample space S if they are pairwise disjoint, exhaustive, and have nonzero probabilities.
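The die example above amounts to counting outcomes and comparing P(E ∩ F) with P(E) · P(F). A short sketch:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}   # sample space of one die throw
E = {3, 6}               # multiple of 3
F = {2, 4, 6}            # even number

def prob(event):
    """Classical probability of an event on the die's sample space."""
    return Fraction(len(event), len(S))

independent = prob(E & F) == prob(E) * prob(F)
print(prob(E), prob(F), prob(E & F), independent)  # 1/3 1/2 1/6 True
```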
Theorem of Total Probability
Let {E1, E2, …, En} be a partition of the sample space S, and suppose that each of the events E1, E2, …, En has nonzero probability of occurrence. Let A be any event associated with S. Then
  P(A) = P(E1) P(A|E1) + P(E2) P(A|E2) + … + P(En) P(A|En)
       = Σⱼ P(Ej) P(A|Ej).
Proof: Given that E1, E2, …, En is a partition of the sample space S, we have
  S = E1 ∪ E2 ∪ … ∪ En and Ei ∩ Ej = ∅ for i ≠ j.
Now, for any event A,
  A = A ∩ S = A ∩ (E1 ∪ E2 ∪ … ∪ En) = (A ∩ E1) ∪ (A ∩ E2) ∪ … ∪ (A ∩ En).
Also, A ∩ Ei and A ∩ Ej are respectively subsets of Ei and Ej. Since Ei and Ej are disjoint for i ≠ j, A ∩ Ei and A ∩ Ej are also disjoint for all i ≠ j.
Thus
  P(A) = P[(A ∩ E1) ∪ (A ∩ E2) ∪ … ∪ (A ∩ En)]
       = P(A ∩ E1) + P(A ∩ E2) + … + P(A ∩ En).
By the multiplication rule of probability, P(A ∩ Ei) = P(Ei) P(A|Ei), since P(Ei) ≠ 0 for i = 1, 2, …, n.
Therefore
  P(A) = P(E1) P(A|E1) + P(E2) P(A|E2) + … + P(En) P(A|En) = Σⱼ P(Ej) P(A|Ej).

Bayes' Theorem: Proof
If E1, E2, …, En are n non-empty events which constitute a partition of the sample space S (i.e. E1, E2, …, En are pairwise disjoint and E1 ∪ E2 ∪ … ∪ En = S) and A is any event of nonzero probability, then
  P(Ei|A) = P(Ei) P(A|Ei) / Σⱼ P(Ej) P(A|Ej), for any i = 1, 2, 3, …, n.
Proof: By the formula of conditional probability, we know that
  P(Ei|A) = P(A ∩ Ei) / P(A)
          = P(Ei) P(A|Ei) / P(A)                      (by the multiplication rule of probability)
          = P(Ei) P(A|Ei) / Σⱼ P(Ej) P(A|Ej)          (by the theorem of total probability).

Remark
The following terminology is generally used when Bayes' theorem is applied:
- The events E1, E2, …, En are called hypotheses.
- The probability P(Ei) is called the a priori probability of the hypothesis Ei.
- The conditional probability P(Ei|A) is called the a posteriori probability of the hypothesis Ei.
- Bayes' theorem is also called the formula for the probability of "causes". Since the Ei's form a partition of the sample space S, one and only one of the events Ei occurs (i.e. one of the events Ei must occur, and only one can occur).
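Both theorems can be exercised on the two-bag example described earlier (Bag I: 2 white, 3 red; Bag II: 4 white, 5 red; each bag chosen with probability 1/2). A minimal sketch:

```python
from fractions import Fraction

def total_probability(priors, likelihoods):
    """Theorem of total probability: P(A) = sum_j P(E_j) P(A|E_j)."""
    return sum(p * l for p, l in zip(priors, likelihoods))

def bayes(priors, likelihoods, i):
    """Bayes' theorem: P(E_i|A) = P(E_i) P(A|E_i) / sum_j P(E_j) P(A|E_j)."""
    return priors[i] * likelihoods[i] / total_probability(priors, likelihoods)

priors = [Fraction(1, 2), Fraction(1, 2)]   # P(Bag I), P(Bag II)
white = [Fraction(2, 5), Fraction(4, 9)]    # P(white | Bag I), P(white | Bag II)

print(total_probability(priors, white))  # 19/45  -- P(white ball)
print(bayes(priors, white, 1))           # 10/19  -- P(Bag II | white ball)
```

The second print is the "reverse probability" the description asks for: the chance the ball came from Bag II once we see it is white.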
Hence, the above formula gives us the probability of a particular Ei, given that the event A has occurred.

Random Variables and Probability Distributions
In most random experiments we are interested not in the particular outcome that occurs, but in some number associated with that outcome, as in the following experiments:
- In tossing two dice, we may be interested in the sum of the numbers on the two dice.
- In tossing a coin 50 times, we may want the number of heads obtained.
- In the experiment of taking out four articles (one after the other) at random from a lot of 20 articles of which 6 are defective, we want to know the number of defectives in the sample of four, not the particular sequence of defective and non-defective articles.

In all the above experiments, we have a rule which assigns to each outcome of the experiment a single real number. This number may vary with different outcomes of the experiment, so it is a variable. Since its value depends on the outcome of a random experiment, it is called a random variable. A random variable is usually denoted by X.

A random variable can take any real value, so its co-domain is the set of real numbers. Hence a random variable can be defined as follows:
- A random variable is a real-valued function whose domain is the sample space of a random experiment.
- E.g. consider the experiment of tossing a coin two times in succession.
  - The sample space of the experiment is S = {HH, HT, TH, TT}.
  - If X denotes the number of heads obtained, then X is a random variable, and for each outcome its value is:
    X(HH) = 2, X(HT) = 1, X(TH) = 1, X(TT) = 0.
  - Let Y denote the number of heads minus the number of tails for each outcome of the above sample space S. Then
    Y(HH) = 2, Y(HT) = 0, Y(TH) = 0, Y(TT) = −2.
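The coin-toss example makes the "real-valued function on the sample space" idea concrete; X and Y below are the two random variables from the notes:

```python
# Random variables as real-valued functions on the sample space of two tosses.
S = ["HH", "HT", "TH", "TT"]

def X(outcome):
    """Number of heads in the outcome."""
    return outcome.count("H")

def Y(outcome):
    """Number of heads minus number of tails."""
    return outcome.count("H") - outcome.count("T")

print([X(o) for o in S])  # [2, 1, 1, 0]
print([Y(o) for o in S])  # [2, 0, 0, -2]
```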
Hence, X and Y are two different random variables defined on the same sample space.
- Note: more than one random variable can be defined on the same sample space.

Probability Distribution of a Random Variable
- The description giving the values of a random variable along with the corresponding probabilities is called the probability distribution of the random variable X.
- In general, the probability distribution of a random variable X is defined as follows. The probability distribution of X is the system of numbers

    X:    x1   x2   …   xn
    P(X): p1   p2   …   pn

  where pi > 0 and Σ pi = 1, i = 1, 2, …, n.
  The real numbers x1, x2, …, xn are the possible values of the random variable X, and pi (i = 1, 2, …, n) is the probability of the random variable X taking the value xi, i.e. P(X = xi) = pi.
- All elements of the sample space are covered by the possible values of the random variable X. Hence the sum of all the probabilities in a probability distribution must be one.
- If xi is one of the possible values of a random variable X, the statement X = xi is true only at some point(s) of the sample space. Hence the probability that X takes the value xi is always nonzero, i.e. P(X = xi) ≠ 0.

Mean of a Random Variable
- The mean is a measure of location or central tendency, in the sense that it roughly locates a middle or average value of the random variable.
- Let X be a random variable whose possible values x1, x2, x3, …, xn occur with probabilities p1, p2, p3, …, pn respectively. The mean of X, denoted by μ, is the number
  μ = Σ xi pi,
  i.e. the mean of X is the weighted average of the possible values of X, each value weighted by the probability with which it occurs.
- The mean of a random variable X is also called the expectation of X, denoted by E(X). Thus
  E(X) = μ = Σ xi pi = x1 p1 + x2 p2 + … + xn pn.
- The mean or expectation of a random variable X is the sum of the products of all possible values of X with their respective probabilities.
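The weighted-average formula E(X) = Σ xi pi can be sketched in a few lines. The distribution used below (number of heads in two fair coin tosses) is my own illustrative choice, consistent with the coin-toss sample space above:

```python
from fractions import Fraction

def expectation(values, probs):
    """E(X) = sum of x_i * p_i; the probabilities must sum to 1."""
    assert sum(probs) == 1, "not a probability distribution"
    return sum(x * p for x, p in zip(values, probs))

# Number of heads in two fair tosses: P(0) = 1/4, P(1) = 1/2, P(2) = 1/4
xs = [0, 1, 2]
ps = [Fraction(1, 4), Fraction(1, 2), Fraction(1, 4)]
print(expectation(xs, ps))  # 1
```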
Variance of a Random Variable
- The mean of a random variable does not give us information about the variability in the values of the random variable.
- If the variance is small, the values of the random variable are close to the mean. Random variables with different probability distributions can have equal means, as shown in the following distributions of X and Y:

    X:    1    2    3    4
    P(X): 1/8  2/8  3/8  2/8

    Y:    -1   0    4    5    6
    P(Y): 1/8  2/8  3/8  1/8  1/8

  E(X) = 1 × 1/8 + 2 × 2/8 + 3 × 3/8 + 4 × 2/8 = 22/8 = 2.75
  E(Y) = (−1) × 1/8 + 0 × 2/8 + 4 × 3/8 + 5 × 1/8 + 6 × 1/8 = 22/8 = 2.75

  The variables X and Y are different, yet their means are the same.
  [Figure: bar charts of the two probability distributions, omitted.]

- Let X be a random variable whose possible values x1, x2, …, xn occur with probabilities p(x1), p(x2), …, p(xn) respectively, and let μ = E(X) be the mean of X. The variance of X, denoted by Var(X) or σx², is defined as
  σx² = Var(X) = Σ (xi − μ)² p(xi), or equivalently σx² = E[(X − μ)²].
- The non-negative number σx = √Var(X) is called the standard deviation of the random variable X.
- Another formula to find the variance of a random variable:
  Var(X) = Σ (xi − μ)² p(xi)
         = Σ (xi² + μ² − 2μ xi) p(xi)
         = Σ xi² p(xi) + μ² Σ p(xi) − 2μ Σ xi p(xi)
         = Σ xi² p(xi) + μ² − 2μ²          (since Σ p(xi) = 1 and Σ xi p(xi) = μ)
         = Σ xi² p(xi) − μ²
  or Var(X) = E(X²) − [E(X)]², where E(X²) = Σ xi² p(xi).

Bernoulli Trials and Binomial Distribution

Bernoulli Trials
- Independent trials that have only two outcomes, usually referred to as "success" and "failure", are called Bernoulli trials. The outcome of any trial is independent of the outcome of any other trial, and in each trial the probability of success or failure remains constant.
- Trials of a random experiment are called Bernoulli trials if they satisfy the following conditions:
  - There is a finite number of trials.
  - The trials are independent.
  - Each trial has exactly two outcomes: success or failure.
  - The probability of success remains the same in each trial.
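The shortcut Var(X) = E(X²) − [E(X)]² can be checked on the two equal-mean distributions from the variance section above:

```python
from fractions import Fraction

def mean(values, probs):
    """E(X) = sum of x_i * p(x_i)."""
    return sum(x * p for x, p in zip(values, probs))

def variance(values, probs):
    """Var(X) = E(X^2) - [E(X)]^2."""
    mu = mean(values, probs)
    return sum(x * x * p for x, p in zip(values, probs)) - mu * mu

eighth = Fraction(1, 8)
xs, px = [1, 2, 3, 4], [1 * eighth, 2 * eighth, 3 * eighth, 2 * eighth]
ys, py = [-1, 0, 4, 5, 6], [1 * eighth, 2 * eighth, 3 * eighth, 1 * eighth, 1 * eighth]

print(mean(xs, px), mean(ys, py))          # 11/4 11/4  (both means are 2.75)
print(variance(xs, px), variance(ys, py))  # 15/16 99/16 (very different spreads)
```

Equal means, very different variances: Y's values are spread much further from 2.75 than X's.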
Example: Six balls are drawn successively from an urn containing 7 red and 9 black balls. Tell whether or not the trials of drawing balls are Bernoulli trials when, after each draw, the ball drawn is (i) replaced, (ii) not replaced in the urn.
Solution:
(i) The number of trials is finite. When the drawing is done with replacement, the probability of success (say, a red ball) is p = 7/16, which is the same for all six trials (draws). Hence the draws with replacement are Bernoulli trials.
(ii) When the drawing is done without replacement, the probability of success (i.e. a red ball) in the first trial is 7/16; in the second trial it is 6/15 if the first ball drawn was red, or 7/15 if the first ball drawn was black, and so on. Clearly, the probability of success is not the same for all trials, so the trials are not Bernoulli trials.

Binomial Distribution
- The probability distribution of the number of successes in an experiment consisting of n Bernoulli trials may be obtained from the binomial expansion of (q + p)ⁿ. Hence the distribution of the number of successes X can be written as

    X:    0       1            2            …   x            …   n
    P(X): nC0 qⁿ  nC1 qⁿ⁻¹ p   nC2 qⁿ⁻² p²  …   nCx qⁿ⁻ˣ pˣ  …   nCn pⁿ

- The above probability distribution is known as the binomial distribution with parameters n and p, because for given values of n and p we can find the complete probability distribution.
- The probability of x successes, P(X = x), is also denoted by P(x) and is given by
  P(x) = nCx qⁿ⁻ˣ pˣ, x = 0, 1, …, n, where q = 1 − p.
- This P(x) is called the probability function of the binomial distribution.
- A binomial distribution with n Bernoulli trials and probability of success p in each trial is denoted by B(n, p).
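The probability function P(x) = nCx qⁿ⁻ˣ pˣ can be sketched directly, reusing the with-replacement urn example (6 draws, p = 7/16 for a red ball):

```python
from math import comb
from fractions import Fraction

def binomial_pmf(n, p, x):
    """P(X = x) = C(n, x) * q^(n-x) * p^x for B(n, p), with q = 1 - p."""
    q = 1 - p
    return comb(n, x) * q ** (n - x) * p ** x

# Number of red balls in 6 draws with replacement, p = 7/16
p = Fraction(7, 16)
dist = [binomial_pmf(6, p, x) for x in range(7)]
print(sum(dist))  # 1  -- the probabilities sum to (q + p)^6 = 1
```

Summing the pmf over x = 0, …, n recovers the binomial expansion of (q + p)ⁿ = 1, confirming this is a valid probability distribution.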
