
EA311 1st Part

The document outlines the fundamentals of introductory engineering statistics, covering topics such as probability definitions, sampling distributions, and hypothesis testing. It explains various probability interpretations, sample spaces, events, and the axioms of probability. Additionally, it includes assessment criteria for the course and examples to illustrate statistical concepts.

Uploaded by

jamesenesi1000
INTRODUCTORY ENGINEERING STATISTICS (3 UNITS)

Axiomatic definitions of probability. Basic rules of probability. Bayes' formula. Permutations and combinations. Tabular and graphical representation of samples. Measures of central tendency and dispersion. Random variables. Mean and variance of a distribution. Probability distributions: rectangular, hypergeometric, binomial, Poisson, and normal. Mathematical expectation. Bivariate distributions: joint, marginal, and conditional distributions. Covariance; correlation coefficients. Regression and correlation. Method of least squares. Regression curves. Random sampling. Sampling distributions. Expected value. Standard error. The central limit theorem. Student's t, chi-square, and F distributions. Confidence intervals for mean, proportion, and ratio of variances. Elements of tests of hypothesis: critical region, significance level, Type I and Type II errors.

Assessment: Attendance (10%); Test (20%); Examination (70%)

ELEMENTS OF PROBABILITY

3.1 INTRODUCTION

The concept of the probability of a particular event of an experiment is subject to various meanings or interpretations. For instance, if a geologist is quoted as saying that "there is a 60 percent chance of oil in a certain region," we all probably have some intuitive idea as to what is being said. Indeed, most of us would probably interpret this statement in one of two possible ways: either by imagining that

1. the geologist feels that, over the long run, in 60 percent of the regions whose outward environmental conditions are very similar to the conditions that prevail in the region under consideration, there will be oil; or, by imagining that

2. the geologist believes that it is more likely that the region will contain oil than it is that it will not; and in fact .6 is a measure of the geologist's belief in the hypothesis that the region will contain oil.
The two foregoing interpretations of the probability of an event are referred to as being the frequency interpretation and the subjective (or personal) interpretation of probability. In the frequency interpretation, the probability of a given outcome of an experiment is considered as being a "property" of that outcome. It is imagined that this property can be operationally determined by continual repetition of the experiment: the probability of the outcome will then be observable as being the proportion of the experiments that result in the outcome. This is the interpretation of probability that is most prevalent among scientists.

In the subjective interpretation, the probability of an outcome is not thought of as being a property of the outcome but rather is considered a statement about the beliefs of the person who is quoting the probability, concerning the chance that the outcome will occur. Thus, in this interpretation, probability becomes a subjective or personal concept and has no meaning outside of expressing one's degree of belief. This interpretation of probability is often favored by philosophers and certain economic decision makers.

Regardless of which interpretation one gives to probability, however, there is a general consensus that the mathematics of probability are the same in either case. For instance, if you think that the probability that it will rain tomorrow is .3 and you feel that the probability that it will be cloudy but without any rain is .2, then you should feel that the probability that it will either be cloudy or rainy is .5, independently of your individual interpretation of the concept of probability. In this chapter, we present the accepted rules, or axioms, used in probability theory. As a preliminary to this, however, we need to study the concepts of the sample space and the events of an experiment.
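The limiting-frequency idea behind the first interpretation is easy to see by simulation. The sketch below is not part of the text; the flip counts are arbitrary and a fair coin (heads probability .5) is assumed.

```python
import random

random.seed(1)  # fixed seed only to make the illustration reproducible

def proportion_heads(n_flips):
    """Flip a simulated fair coin n_flips times; return the fraction of heads."""
    heads = sum(random.random() < 0.5 for _ in range(n_flips))
    return heads / n_flips

# The proportion of heads settles near .5 as the number of flips grows,
# which is the value the frequency interpretation calls the probability.
for n in (100, 10_000, 1_000_000):
    print(n, proportion_heads(n))
```

The printed proportions wander for small n and stabilize for large n, mirroring the "constant limiting frequency" described in Section 3.4 below.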
3.2 SAMPLE SPACE AND EVENTS

Consider an experiment whose outcome is not predictable with certainty in advance. Although the outcome of the experiment will not be known in advance, let us suppose that the set of all possible outcomes is known. This set of all possible outcomes of an experiment is known as the sample space of the experiment and is denoted by S. Some examples are the following.

1. If the outcome of an experiment consists in the determination of the sex of a newborn child, then

S = {g, b}

where the outcome g means that the child is a girl and b that it is a boy.

2. If the experiment consists of the running of a race among the seven horses having post positions 1, 2, 3, 4, 5, 6, 7, then

S = {all orderings of (1, 2, 3, 4, 5, 6, 7)}

The outcome (2, 3, 1, 6, 5, 4, 7) means, for instance, that the number 2 horse is first, then the number 3 horse, then the number 1 horse, and so on.

3. Suppose we are interested in determining the amount of dosage that must be given to a patient until that patient reacts positively. One possible sample space for this experiment is to let S consist of all the positive numbers. That is, let

S = (0, ∞)

where the outcome would be x if the patient reacts to a dosage of value x but not to any smaller dosage.

Any subset E of the sample space is known as an event. That is, an event is a set consisting of possible outcomes of the experiment. If the outcome of the experiment is contained in E, then we say that E has occurred. Some examples of events are the following.

In Example 1, if E = {g}, then E is the event that the child is a girl. Similarly, if F = {b}, then F is the event that the child is a boy.

In Example 2, if

E = {all outcomes in S starting with a 3}

then E is the event that the number 3 horse wins the race.

For any two events E and F of a sample space S, we define the new event E ∪ F, called the union of the events E and F, to consist of all outcomes that are either in E or in F or in both E and F.
That is, the event E ∪ F will occur if either E or F occurs. For instance, in Example 1, if E = {g} and F = {b}, then E ∪ F = {g, b}. That is, E ∪ F would be the whole sample space S. In Example 2, if E = {all outcomes starting with 6} is the event that the number 6 horse wins and F = {all outcomes having 6 in the second position} is the event that the number 6 horse comes in second, then E ∪ F is the event that the number 6 horse comes in either first or second.

Similarly, for any two events E and F, we may also define the new event EF, called the intersection of E and F, to consist of all outcomes that are in both E and F. That is, the event EF will occur only if both E and F occur. For instance, in Example 3, if E = (0, 5) is the event that the required dosage is less than 5 and F = (2, 10) is the event that it is between 2 and 10, then EF = (2, 5) is the event that the required dosage is between 2 and 5. In Example 2, if E = {all outcomes ending in 5} is the event that horse number 5 comes in last and F = {all outcomes starting with 5} is the event that horse number 5 comes in first, then the event EF does not contain any outcomes and hence cannot occur. To give such an event a name, we shall refer to it as the null event and denote it by ∅. Thus ∅ refers to the event consisting of no outcomes. If EF = ∅, implying that E and F cannot both occur, then E and F are said to be mutually exclusive.

For any event E, we define the event E^c, referred to as the complement of E, to consist of all outcomes in the sample space S that are not in E. That is, E^c will occur if and only if E does not occur. In Example 1, if E = {b} is the event that the child is a boy, then E^c = {g} is the event that it is a girl. Also note that since the experiment must result in some outcome, it follows that S^c = ∅.

For any two events E and F, if all of the outcomes in E are also in F, then we say that E is contained in F and write E ⊂ F (or equivalently, F ⊃ E).
Thus if E ⊂ F, then the occurrence of E necessarily implies the occurrence of F. If E ⊂ F and F ⊂ E, then we say that E and F are equal (or identical) and we write E = F.

We can also define unions and intersections of more than two events. In particular, the union of the events E_1, E_2, ..., E_n, denoted either by E_1 ∪ E_2 ∪ ... ∪ E_n or by ∪ E_i, is defined to be the event consisting of all outcomes that are in E_i for at least one i = 1, 2, ..., n. Similarly, the intersection of the events E_i, i = 1, 2, ..., n, denoted by E_1 E_2 ... E_n, is defined to be the event consisting of those outcomes that are in all of the events E_i, i = 1, 2, ..., n. In other words, the union of the E_i occurs when at least one of the events E_i occurs; the intersection occurs when all of the events E_i occur.

3.3 VENN DIAGRAMS AND THE ALGEBRA OF EVENTS

A graphical representation of events that is very useful for illustrating logical relations among them is the Venn diagram. The sample space S is represented as consisting of all the points in a large rectangle, and the events E, F, G, ... are represented as consisting of all the points in given circles within the rectangle. Events of interest can then be indicated by shading appropriate regions of the diagram. For instance, in the three Venn diagrams shown in Figure 3.1, the shaded areas represent, respectively, the events E ∪ F, EF, and E^c. The Venn diagram of Figure 3.2 indicates that E ⊂ F.

The operations of forming unions, intersections, and complements of events obey certain rules not dissimilar to the rules of algebra. We list a few of these.

Commutative law:   E ∪ F = F ∪ E    EF = FE
Associative law:   (E ∪ F) ∪ G = E ∪ (F ∪ G)    (EF)G = E(FG)
Distributive law:  (E ∪ F)G = EG ∪ FG    EF ∪ G = (E ∪ G)(F ∪ G)

These relations are verified by showing that any outcome that is contained in the event on the left side of the equality is also contained in the event on the right side, and vice versa. One way of showing this is by means of Venn diagrams.
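These laws can also be spot-checked computationally by representing events as finite sets. The sketch below is not from the text: it draws random events from a small sample space and verifies the distributive law together with DeMorgan's laws; the sample-space size and trial count are arbitrary.

```python
import random

random.seed(0)
S = set(range(20))  # a small finite sample space

def random_event():
    """An event: a random subset of S."""
    return {x for x in S if random.random() < 0.5}

laws_hold = True
for _ in range(200):
    E, F, G = random_event(), random_event(), random_event()
    # Distributive law: (E ∪ F)G = EG ∪ FG
    distributive = (E | F) & G == (E & G) | (F & G)
    # DeMorgan's laws: (E ∪ F)^c = E^c F^c and (EF)^c = E^c ∪ F^c
    demorgan1 = S - (E | F) == (S - E) & (S - F)
    demorgan2 = S - (E & F) == (S - E) | (S - F)
    laws_hold = laws_hold and distributive and demorgan1 and demorgan2
```

Random spot checks of course do not constitute a proof; the element-by-element argument or a Venn diagram does.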
For instance, the distributive law may be verified by the sequence of diagrams shown in Figure 3.3.

FIGURE 3.1  Venn diagrams: (a) shaded region: E ∪ F; (b) shaded region: EF; (c) shaded region: E^c.

FIGURE 3.2  A Venn diagram indicating that E ⊂ F.

FIGURE 3.3  Proving the distributive law: (a) shaded region: FG; (b) shaded region: (E ∪ F)G; (c) shaded region: EG; (d) shaded region: EG ∪ FG.

The following useful relationships between the three basic operations of forming unions, intersections, and complements of events are known as DeMorgan's laws:

(E ∪ F)^c = E^c F^c
(EF)^c = E^c ∪ F^c

3.4 AXIOMS OF PROBABILITY

It appears to be an empirical fact that if an experiment is continually repeated under the exact same conditions, then for any event E, the proportion of time that the outcome is contained in E approaches some constant value as the number of repetitions increases. For instance, if a coin is continually flipped, then the proportion of flips resulting in heads will approach some value as the number of flips increases. It is this constant limiting frequency that we often have in mind when we speak of the probability of an event.

From a purely mathematical viewpoint, we shall suppose that for each event E of an experiment having a sample space S there is a number, denoted by P(E), that is in accord with the following three axioms.

AXIOM 1:  0 ≤ P(E) ≤ 1

AXIOM 2:  P(S) = 1

AXIOM 3:  For any sequence of mutually exclusive events E_1, E_2, ...,
P(E_1 ∪ E_2 ∪ ...) = P(E_1) + P(E_2) + ...

As an illustration, suppose that each of n people is equally likely to have any of the 365 days of the year as his or her birthday. The probability that no two of them have the same birthday is then

(365)(364) ... (365 − n + 1) / 365^n

and for n ≥ 23 this probability is less than 1/2. That is, if there are 23 or more people in a room, then the probability that at least two of them have the same birthday exceeds 1/2. Many people are initially surprised by this result, since 23 seems so small in relation to 365, the number of days of the year. However, every pair of individuals has probability 1/365 of having the same birthday, and in a group of 23 people there are (23 choose 2) = 253 different pairs of individuals. Looked at this way, the result no longer seems so surprising. ■

3.6 CONDITIONAL PROBABILITY

In this section, we introduce one of the most important concepts in all of probability theory: that of conditional probability.
Its importance is twofold. In the first place, we are often interested in calculating probabilities when some partial information concerning the result of the experiment is available, or in recalculating them in light of additional information. In such situations, the desired probabilities are conditional ones. Second, as a kind of bonus, it often turns out that the easiest way to compute the probability of an event is to first "condition" on the occurrence or nonoccurrence of a secondary event.

As an illustration of a conditional probability, suppose that one rolls a pair of dice. The sample space S of this experiment can be taken to be the following set of 36 outcomes:

S = {(i, j) : i = 1, 2, 3, 4, 5, 6, j = 1, 2, 3, 4, 5, 6}

where we say that the outcome is (i, j) if the first die lands on side i and the second on side j. Suppose now that each of the 36 possible outcomes is equally likely to occur and thus has probability 1/36. (In such a situation we say that the dice are fair.) Suppose further that we observe that the first die lands on side 3. Then, given this information, what is the probability that the sum of the two dice equals 8? To calculate this probability, we reason as follows. Given that the initial die is a 3, there can be at most 6 possible outcomes of our experiment, namely, (3, 1), (3, 2), (3, 3), (3, 4), (3, 5), and (3, 6). In addition, because each of these outcomes originally had the same probability of occurring, they should still have equal probabilities. That is, given that the first die is a 3, the (conditional) probability of each of the outcomes (3, 1), (3, 2), (3, 3), (3, 4), (3, 5), (3, 6) is 1/6, whereas the (conditional) probability of the other 30 points in the sample space is 0. Hence, the desired probability is 1/6.
If we let E and F denote, respectively, the event that the sum of the dice is 8 and the event that the first die is a 3, then the probability just obtained is called the conditional probability of E given that F has occurred, and is denoted by P(E|F).

A general formula for P(E|F) that is valid for all events E and F is derived in the same manner as just described. Namely, if the event F occurs, then in order for E to occur it is necessary that the actual occurrence be a point in both E and F; that is, it must be in EF. Now, since we know that F has occurred, it follows that F becomes our new (reduced) sample space, and hence the probability that the event EF occurs will equal the probability of EF relative to the probability of F. That is,

P(E|F) = P(EF) / P(F)    (3.6.1)

Note that Equation 3.6.1 is well defined only when P(F) > 0, and hence P(E|F) is defined only when P(F) > 0. (See Figure 3.5.)

FIGURE 3.5  P(E|F).

The definition of conditional probability given by Equation 3.6.1 is consistent with the interpretation of probability as being a long-run relative frequency. To see this, suppose that a large number n of repetitions of the experiment are performed. Then, since P(F) is the long-run proportion of experiments in which F occurs, it follows that F will occur approximately nP(F) times. Similarly, in approximately nP(EF) of these experiments, both E and F will occur. Hence, of the approximately nP(F) experiments whose outcome is in F, approximately nP(EF) of them will also have their outcome in E. That is, for those experiments whose outcome is in F, the proportion whose outcome is also in E is approximately

nP(EF) / nP(F) = P(EF) / P(F)

Since this approximation becomes exact as n becomes larger and larger, it follows that (3.6.1) gives the appropriate definition of the conditional probability of E given that F has occurred.
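For the dice illustration, Equation 3.6.1 and the reduced-sample-space argument can be checked against each other by direct enumeration; a minimal sketch, not part of the text:

```python
from fractions import Fraction

# All 36 equally likely outcomes (i, j) of rolling two fair dice.
S = [(i, j) for i in range(1, 7) for j in range(1, 7)]

E = {(i, j) for (i, j) in S if i + j == 8}  # sum of the dice is 8
F = {(i, j) for (i, j) in S if i == 3}      # first die lands on 3

def prob(event):
    """Probability of an event when all outcomes in S are equally likely."""
    return Fraction(len(event), len(S))

# Equation 3.6.1: P(E|F) = P(EF)/P(F).
p_formula = prob(E & F) / prob(F)

# Reduced sample space: among the 6 outcomes in F, count those also in E.
p_reduced = Fraction(len(E & F), len(F))
```

Both routes give 1/6, agreeing with the argument in the text.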
EXAMPLE 3.6a  A bin contains 5 defective transistors (that immediately fail when put in use), 10 partially defective (that fail after a couple of hours of use), and 25 acceptable transistors. A transistor is chosen at random from the bin and put into use. If it does not immediately fail, what is the probability it is acceptable?

SOLUTION  Since the transistor did not immediately fail, we know that it is not one of the 5 defectives, and so the desired probability is

P(acceptable | not defective) = P(acceptable, not defective) / P(not defective)
                              = P(acceptable) / P(not defective)

where the last equality follows since the transistor will be both acceptable and not defective if it is acceptable. Hence, assuming that each of the 40 transistors is equally likely to be chosen, we obtain that

P(acceptable | not defective) = (25/40) / (35/40) = 5/7

It should be noted that we could also have derived this probability by working directly with the reduced sample space. That is, since we know that the chosen transistor is not defective, the problem reduces to computing the probability that a transistor chosen at random from a bin of 25 acceptable and 10 partially defective transistors is acceptable, namely 25/35 = 5/7. ■

For events F_1, F_2, ..., F_n, of which one and only one must occur,

P(E) = P(E|F_1)P(F_1) + ... + P(E|F_n)P(F_n)    (3.7.2)

Thus, Equation 3.7.2 shows how, for given events F_1, F_2, ..., F_n of which one and only one must occur, we can compute P(E) by first "conditioning" on which one of the F_i occurs. That is, it states that P(E) is equal to a weighted average of P(E|F_i), each term being weighted by the probability of the event on which it is conditioned.

Suppose now that E has occurred and we are interested in determining which one of the F_j also occurred. By Equation 3.7.2, we have that

P(F_j|E) = P(EF_j) / P(E)
         = P(E|F_j)P(F_j) / [P(E|F_1)P(F_1) + ... + P(E|F_n)P(F_n)]    (3.7.3)

Equation 3.7.3 is known as Bayes' formula, after the English philosopher Thomas Bayes.
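Equations 3.7.2 and 3.7.3 can be illustrated numerically. In the sketch below, which is not from the text, the priors P(F_i) and conditional probabilities P(E|F_i) are hypothetical values chosen only for illustration:

```python
from fractions import Fraction

# Mutually exclusive, exhaustive events F_1, F_2, F_3 with hypothetical
# priors P(F_i) (summing to 1) and likelihoods P(E|F_i).
priors = [Fraction(1, 2), Fraction(1, 3), Fraction(1, 6)]
likelihoods = [Fraction(1, 4), Fraction(1, 2), Fraction(3, 4)]

# Equation 3.7.2: P(E) is the weighted average of the P(E|F_i).
p_e = sum(l * q for l, q in zip(likelihoods, priors))

# Equation 3.7.3 (Bayes' formula): posterior probabilities P(F_j|E).
posteriors = [l * q / p_e for l, q in zip(likelihoods, priors)]
```

Because the F_j are exhaustive and mutually exclusive, the posteriors necessarily sum to 1, which is a useful sanity check on any Bayes computation.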
If we think of the events F_i as being possible "hypotheses" about some subject matter, then Bayes' formula may be interpreted as showing us how opinions about these hypotheses held before the experiment [that is, the P(F_i)] should be modified by the evidence of the experiment.

EXAMPLE 3.7f  A plane is missing, and it is presumed that it was equally likely to have gone down in any of three possible regions. Let 1 − α_i denote the probability the plane will be found upon a search of the ith region when the plane is, in fact, in that region, i = 1, 2, 3. (The constants α_i are called overlook probabilities because they represent the probability of overlooking the plane; they are generally attributable to the geographical and environmental conditions of the regions.) What is the conditional probability that the plane is in the ith region, given that a search of region 1 is unsuccessful, i = 1, 2, 3?

SOLUTION  Let R_i, i = 1, 2, 3, be the event that the plane is in region i, and let E be the event that a search of region 1 is unsuccessful. From Bayes' formula, we obtain

P(R_1|E) = P(ER_1) / P(E)
         = P(E|R_1)P(R_1) / [P(E|R_1)P(R_1) + P(E|R_2)P(R_2) + P(E|R_3)P(R_3)]
         = (α_1)(1/3) / [(α_1)(1/3) + (1)(1/3) + (1)(1/3)]
         = α_1 / (α_1 + 2)

For j = 2, 3,

P(R_j|E) = P(E|R_j)P(R_j) / P(E)
         = (1)(1/3) / [(α_1/3) + 1/3 + 1/3]
         = 1 / (α_1 + 2)

Thus, for instance, if α_1 = .4, then the conditional probability that the plane is in region 1, given that a search of that region did not uncover it, is 1/6. ■

3.8 INDEPENDENT EVENTS

The previous examples in this chapter show that P(E|F), the conditional probability of E given F, is not generally equal to P(E), the unconditional probability of E.
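Example 3.7f can be reproduced numerically; a minimal sketch, where α_1 = 2/5 corresponds to the text's closing remark with α_1 = .4:

```python
from fractions import Fraction

alpha1 = Fraction(2, 5)   # overlook probability for region 1
prior = Fraction(1, 3)    # the plane is equally likely to be in each region

# E = "a search of region 1 is unsuccessful":
# P(E|R_1) = alpha1 (plane there but overlooked); P(E|R_2) = P(E|R_3) = 1.
p_e = alpha1 * prior + 1 * prior + 1 * prior

posterior_r1 = alpha1 * prior / p_e   # = alpha1 / (alpha1 + 2)
posterior_r2 = 1 * prior / p_e        # = 1 / (alpha1 + 2); same for region 3
```

With α_1 = 2/5 this gives P(R_1|E) = 1/6, matching the text, and the three posteriors sum to 1.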
In other words, knowing that F has occurred generally changes the chances of E's occurrence. In the special cases where P(E|F) does in fact equal P(E), we say that E is independent of F. That is, E is independent of F if knowledge that F has occurred does not change the probability that E occurs.

Since P(EF) = P(E|F)P(F), we see that E is independent of F if

P(EF) = P(E)P(F)    (3.8.1)

Since this equation is symmetric in E and F, it shows that whenever E is independent of F, so is F of E. We thus have the following definition.

Definition  Two events E and F are said to be independent if Equation 3.8.1 holds. Two events E and F that are not independent are said to be dependent.

EXAMPLE 3.8a  A card is selected at random from an ordinary deck of 52 playing cards. If A is the event that the selected card is an ace and H is the event that it is a heart, then A and H are independent, since P(AH) = 1/52, while P(A) = 4/52 and P(H) = 13/52. ■

EXAMPLE 3.8b  If we let E denote the event that the next president is a Republican and F the event that there will be a major earthquake within the next year, then most people would probably be willing to assume that E and F are independent. However, there would probably be some controversy over whether it is reasonable to assume that E is independent of G, where G is the event that there will be a recession within the next two years. ■

We now show that if E is independent of F, then E is also independent of F^c.

PROPOSITION 3.8.1  If E and F are independent, then so are E and F^c.

Proof. Assume that E and F are independent. Since E = EF ∪ EF^c, and EF and EF^c are obviously mutually exclusive, we have that

P(E) = P(EF) + P(EF^c)
     = P(E)P(F) + P(EF^c)    by the independence of E and F

or equivalently,

P(EF^c) = P(E)[1 − P(F)] = P(E)P(F^c)

and the result is proven. ■

Thus if E is independent of F, then the probability of E's occurrence is unchanged by information as to whether or not F has occurred. Suppose now that E is independent of F and is also independent of G. Is E then necessarily independent of FG? The answer, somewhat surprisingly, is no. Consider the following example.

EXAMPLE 3.8c  Two fair dice are thrown. Let E_7 denote the event that the sum of the dice is 7. Let F denote the event that the first die equals 4 and let T be the event that the second die equals 3. Now it can be shown (see Problem 36) that E_7 is independent of F and that E_7 is also independent of T; but clearly E_7 is not independent of FT [since P(E_7|FT) = 1]. ■

It would appear to follow from the foregoing example that an appropriate definition of the independence of three events E, F, and G would have to go further than merely assuming that all of the three pairs of events are independent. We are thus led to the following definition.

Definition  The three events E, F, and G are said to be independent if

P(EFG) = P(E)P(F)P(G)
P(EF) = P(E)P(F)
P(EG) = P(E)P(G)
P(FG) = P(F)P(G)

It should be noted that if the events E, F, G are independent, then E will be independent of any event formed from F and G. For instance, E is independent of F ∪ G, since

P(E(F ∪ G)) = P(EF ∪ EG)
            = P(EF) + P(EG) − P(EFG)
            = P(E)P(F) + P(E)P(G) − P(E)P(FG)
            = P(E)[P(F) + P(G) − P(FG)]
            = P(E)P(F ∪ G)

Of course, we may also extend the definition of independence to more than three events. The events E_1, E_2, ..., E_n are said to be independent if, for every subset E_{i1}, E_{i2}, ..., E_{ir}, r ≤ n, of these events,

P(E_{i1} E_{i2} ... E_{ir}) = P(E_{i1}) P(E_{i2}) ... P(E_{ir})

It is sometimes the case that the probability experiment under consideration consists of performing a sequence of subexperiments. For instance, if the experiment consists of continually tossing a coin, then we may think of each toss as being a subexperiment. In many cases it is reasonable to assume that the outcomes of any group of the subexperiments have no effect on the probabilities of the outcomes of the other subexperiments. If such is the case, then we say that the subexperiments are independent.

FIGURE 3.7  A parallel system: functions if current flows from A to B.

EXAMPLE 3.8d  A system composed of n separate components is said to be a parallel system if it functions when at least one of the components functions. (See Figure 3.7.) For such a system, if component i, independent of other components, functions with probability p_i, i = 1, ..., n, what is the probability the system functions?
SOLUTION  Let A_i denote the event that component i functions. Then

P(system functions) = 1 − P(system does not function)
                    = 1 − P(all components do not function)
                    = 1 − P(A_1^c A_2^c ... A_n^c)
                    = 1 − (1 − p_1)(1 − p_2) ... (1 − p_n)    by independence ■

EXAMPLE 3.8e  A set of k coupons, each of which is independently a type j coupon with probability p_j, Σ_j p_j = 1, is collected. Find the probability that the set contains a type j coupon given that it contains a type i coupon, i ≠ j.

SOLUTION  Let A_r be the event that the set contains a type r coupon. Then

P(A_j|A_i) = P(A_i A_j) / P(A_i)

To compute P(A_i) and P(A_i A_j), consider the probabilities of their complements:

P(A_i) = 1 − P(A_i^c)
       = 1 − P(no coupon is type i)
       = 1 − (1 − p_i)^k

P(A_i A_j) = 1 − P(A_i^c ∪ A_j^c)
           = 1 − [P(A_i^c) + P(A_j^c) − P(A_i^c A_j^c)]
           = 1 − (1 − p_i)^k − (1 − p_j)^k + P(no coupon is type i or type j)
           = 1 − (1 − p_i)^k − (1 − p_j)^k + (1 − p_i − p_j)^k

where the final equality follows because each of the k coupons is, independently, neither of type i nor of type j with probability 1 − p_i − p_j. Consequently,

P(A_j|A_i) = [1 − (1 − p_i)^k − (1 − p_j)^k + (1 − p_i − p_j)^k] / [1 − (1 − p_i)^k] ■
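The two closing examples can be cross-checked numerically. In the sketch below the component and coupon probabilities are hypothetical (not values from the text), and the coupon formula of Example 3.8e is verified by brute-force enumeration of all type sequences:

```python
from fractions import Fraction
from itertools import product

# Example 3.8d: a parallel system of independent components with working
# probabilities probs functions unless every component fails.
def parallel_reliability(probs):
    fail_all = Fraction(1)
    for p_i in probs:
        fail_all *= 1 - p_i
    return 1 - fail_all

# Example 3.8e, closed form: P(set contains type j | it contains type i)
# for k independently drawn coupons with type probabilities p.
def coupon_conditional(p, k, i, j):
    qi, qj, qij = 1 - p[i], 1 - p[j], 1 - p[i] - p[j]
    return (1 - qi**k - qj**k + qij**k) / (1 - qi**k)

# Brute-force cross-check: enumerate every length-k sequence of coupon
# types and sum the probabilities of the relevant sequences.
def coupon_conditional_brute(p, k, i, j):
    p_i = p_ij = Fraction(0)
    for seq in product(range(len(p)), repeat=k):
        w = Fraction(1)
        for t in seq:
            w *= p[t]
        if i in seq:
            p_i += w
            if j in seq:
                p_ij += w
    return p_ij / p_i

p = [Fraction(1, 2), Fraction(1, 3), Fraction(1, 6)]  # hypothetical p_j
```

Exact rational arithmetic makes the closed form and the enumeration agree identically rather than only to floating-point tolerance.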
