Week 3 - EDA 1

Week 3 Term: 1st Semester: 1st Academic Year: 2020-2021

Subject Code: EDA 1 Course Title: Engineering Data Analysis


III. MODULE LEARNING OUTCOMES (PERFORMANCE INDICATORS)
1. Determine the independence of events and use independence to calculate probabilities.
2. Calculate the probabilities of joint events such as unions and intersections from the probabilities of individual events.
3. Understand random variables.
4. Determine probabilities from probability mass functions and the reverse.
IV. CONTENT TOPIC DISCUSSION (attach detailed content theories/applications and specific learning objectives)
TOPIC 1: Independent Events
TOPIC 2: Laws of Probability
 Addition Rule
 Multiplication Rule
TOPIC 3: Random variable
TOPIC 4: Probability Distributions and Probability Mass Functions
V. TEACHING LEARNING ACTIVITIES (TLA's) (with TLA guides; must be doable through online delivery)
 Face-to-face lecture discussion, autodidactic learning, online research for further ideas relevant to the topic, and assessment.
 Reading and self-comprehension
VI. ASSESSMENT TASK (AT’s) / EVALUATION/RUBRICS (WITH ASSESSMENT guides)
 Formative Quiz
 Hands-on Quiz
VII. ASSIGNMENT (include here the target module for the preceding MODULE guides )
 Control Structures
VIII. REFERENCES
 Applied Statistics and Probability for Engineers, 6th Edition, by D. C. Montgomery, 2014
 Basic Probability and Statistics, by Z. Garambas, 2011
 Introduction to Probability and Statistics, by J. S. Milton

TOPIC 1: Independent Events


 If P(B|A) = P(B), i.e., the probability of B occurring is not affected by the occurrence or non-occurrence of A, then we say that A and B are independent events. This is equivalent to any of the following:

𝑷(𝑨 ∩ 𝑩) = 𝑷(𝑨)𝑷(𝑩)

𝑷(𝑨|𝑩) = 𝑷(𝑨)

𝑷(𝑩|𝑨) = 𝑷(𝑩)

EXAMPLE: Let A and B be events in a sample space S. Show that if A and B are independent, then so are A and B′ (the complement of B).

Answer:

𝑃(𝐴) = 𝑃(𝐴 ∩ 𝐵) + 𝑃(𝐴 ∩ 𝐵′)

𝑃(𝐴 ∩ 𝐵′) = 𝑃(𝐴) − 𝑃(𝐴 ∩ 𝐵) = 𝑃(𝐴) − 𝑃(𝐴)𝑃(𝐵) = 𝑃(𝐴)[1 − 𝑃(𝐵)] = 𝑃(𝐴)𝑃(𝐵′)

Thus, A and B′ are independent.
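
The same conclusion can be checked with concrete numbers. Below is a minimal Python sketch; the setting (a single draw from a standard 52-card deck, with A = "the card is an ace" and B = "the card is a spade") is an assumed illustration rather than part of the example above. It verifies both P(A ∩ B) = P(A)P(B) and P(A ∩ B′) = P(A)P(B′) with exact fractions.

    from fractions import Fraction

    # Assumed illustration: one draw from a standard 52-card deck,
    # A = "card is an ace", B = "card is a spade".
    P_A = Fraction(4, 52)        # four aces in the deck
    P_B = Fraction(13, 52)       # thirteen spades in the deck
    P_A_and_B = Fraction(1, 52)  # only the ace of spades is in both events

    # Complement of B and the corresponding intersection with A.
    P_B_comp = 1 - P_B
    P_A_and_B_comp = P_A - P_A_and_B  # since P(A) = P(A ∩ B) + P(A ∩ B')

    print(P_A_and_B == P_A * P_B)            # True: A and B are independent
    print(P_A_and_B_comp == P_A * P_B_comp)  # True: A and B' are independent too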

TOPIC 2: Laws of Probability


Addition Rule
 The addition rule tells us how to calculate the probability of the union of two events: the probability of the union of events A and B is the sum of the probabilities of events A and B minus the probability of the intersection of events A and B; that is,

P(A∪B) = P(A) + P(B) - P(A∩B)

 More generally, if 𝑨𝟏 , 𝑨𝟐 , 𝑨𝟑 are any three events, then

𝑷(𝑨𝟏 ∪ 𝑨𝟐 ∪ 𝑨𝟑 ) = 𝑷(𝑨𝟏 ) + 𝑷(𝑨𝟐 ) + 𝑷(𝑨𝟑 ) − 𝑷(𝑨𝟏 ∩ 𝑨𝟐 ) − 𝑷(𝑨𝟐 ∩ 𝑨𝟑 ) − 𝑷(𝑨𝟑 ∩ 𝑨𝟏 )
+ 𝑷(𝑨𝟏 ∩ 𝑨𝟐 ∩ 𝑨𝟑 )
Generalizations to n events can also be made.

EXAMPLE 2.1: What is the probability that a card selected from a deck will be either an ace or a spade?

Answer: The probability of an ace is 4/52, the probability of a spade is 13/52, and the probability of getting an ace and a spade (the ace of spades) is 1/52. Therefore, using the addition formula,

(4/52) + (13/52) − (1/52) = 16/52
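
As a quick check, here is the same computation as a minimal Python sketch with exact fractions, using the numbers from Example 2.1.

    from fractions import Fraction

    # Addition rule for "ace or spade" on one draw from a 52-card deck (Example 2.1).
    P_ace = Fraction(4, 52)
    P_spade = Fraction(13, 52)
    P_ace_and_spade = Fraction(1, 52)

    P_ace_or_spade = P_ace + P_spade - P_ace_and_spade
    print(P_ace_or_spade)  # 4/13, the same as 16/52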

Multiplication Rule

 Applies to the intersection of events.


 If, in an experiment, the events A and B can both occur and 𝑃(𝐴) > 0, then

𝑷(𝑨 ∩ 𝑩) = 𝑷(𝑨)𝑷(𝑩|𝑨)

 The probability that events A, B, and C will all occur is given by:

𝑷(𝑨 ∩ 𝑩 ∩ 𝑪) = 𝑷(𝑨)𝑷(𝑩|𝑨)𝑷(𝑪|𝑨 ∩ 𝑩)

EXAMPLE 2.2: The probability that the first stage of a numerically controlled machining operation for high-rpm
pistons meets specifications is 0.90. Failures are due to metal variations, fixture alignment, cutting blade
condition, vibration, and ambient environmental conditions. Given that the first stage meets specifications, the
probability that a second stage of machining meets specifications is 0.95. What is the probability that both stages
meet specifications?

Answer: Let A and B denote the events that the first and second stages meet specifications, respectively. The
probability requested is
𝑃(𝐴 ∩ 𝐵) = 𝑃(𝐴)𝑃(𝐵|𝐴) = 0.90(0.95) = 0.855
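
The same numbers as a short Python sketch, with the values taken directly from Example 2.2:

    # Multiplication rule for the two machining stages in Example 2.2.
    P_A = 0.90          # P(first stage meets specifications)
    P_B_given_A = 0.95  # P(second stage meets specifications | first stage does)

    P_both = P_A * P_B_given_A
    print(P_both)  # 0.855 (up to floating-point rounding)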

TOPIC 3: Random Variable


 We often summarize the outcome from a random experiment by a simple number. In many of the
examples of random experiments that we have considered, the sample space has been a description of
possible outcomes. In some cases, descriptions of outcomes are sufficient, but in other cases, it is useful
to associate a number with each outcome in the sample space. Because the particular outcome of the
experiment is not known in advance, the resulting value of our variable is not known in advance. For this
reason, the variable that associates a number with the outcome of a random experiment is referred to as
a random variable.

 A random variable is a function that assigns a real number to each outcome in the sample space of a random experiment, as the short sketch at the end of this topic illustrates.

 Notation is used to distinguish between the random variable itself and the real number it takes on.

 A random variable is denoted by an uppercase letter such as X. After an experiment is conducted, the measured value of the random variable is denoted by a lowercase letter such as x = 70 milliamperes.

 A discrete random variable is a random variable with a finite (or countably infinite) range.

Examples of discrete random variables: number of scratches on a surface, proportion of defective parts among those tested

 A continuous random variable is a random variable with an interval (either finite or infinite) of real
numbers for its range.

Examples of continuous random variables: electrical current, length, pressure, temperature, time,
voltage
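
As noted above, a random variable is simply a function from outcomes to real numbers. The minimal Python sketch below illustrates this with an assumed experiment (two coin tosses, with X = number of heads), which is not one of the examples listed above.

    # Assumed illustration: two coin tosses; X maps each outcome to the number of heads.
    sample_space = ["HH", "HT", "TH", "TT"]
    X = {outcome: outcome.count("H") for outcome in sample_space}

    print(X)  # {'HH': 2, 'HT': 1, 'TH': 1, 'TT': 0}

    # After the experiment is run, one observed outcome gives a measured value x.
    observed_outcome = "HT"   # hypothetical result of one run
    x = X[observed_outcome]   # x = 1, a lowercase (realized) value of X
    print(x)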

TOPIC 4: Probability Distributions and Probability Mass Functions


Probability Distribution

 The probability distribution of a discrete random variable is a graph, table or formula that specifies the
probability associated with each possible value the random variable can assume.

EXAMPLE 4.1: There is a chance that a bit transmitted through a digital transmission channel is received in
error. Let X equal the number of bits in error in the next four bits transmitted. The possible values for X are {0, 1, 2, 3, 4}. Based on a model for the errors that is presented in the following section, probabilities for these
values will be determined. Suppose that the probabilities are
𝑃(𝑋 = 0) = 0.6561
𝑃(𝑋 = 1) = 0.2916
𝑃(𝑋 = 2) = 0.0486
𝑃(𝑋 = 3) = 0.0036
𝑃(𝑋 = 4) = 0.0001

The probability distribution of X is specified by the possible values along with the probability of each. A graphical
description of the probability distribution of X is shown in Fig. 4

Figure 4. Probability distribution for bits in error

Practical Interpretation: A random experiment can often be summarized with a random variable and its
distribution.
The details of the sample space can often be omitted.

Probability Mass Function

 For a discrete random variable X with possible values x1, x2, …, xn, a probability mass function is a function f such that

(1) f(xi) ≥ 0
(2) f(x1) + f(x2) + ⋯ + f(xn) = 1
(3) f(xi) = P(X = xi)

EXAMPLE 4.2: For the bits in error in Example 4.1, f(0) = 0.6561, f(1) = 0.2916, f(2) = 0.0486, f(3) = 0.0036, and f(4) = 0.0001. Check that the probabilities sum to 1.
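
A short Python sketch that performs this check. It also compares the values against a binomial model with four bits and an assumed per-bit error probability of 0.1; the module only says the underlying error model is presented later, so that value is an assumption made here for illustration.

    from math import comb

    # PMF values from Examples 4.1 and 4.2: number of bits in error among four transmitted bits.
    f = {0: 0.6561, 1: 0.2916, 2: 0.0486, 3: 0.0036, 4: 0.0001}

    # Properties of a probability mass function: non-negative values that sum to 1.
    assert all(p >= 0 for p in f.values())
    print(sum(f.values()))  # 1.0 (up to floating-point rounding)

    # Assumed model: independent bit errors with probability 0.1 per bit.
    for k in range(5):
        print(k, comb(4, k) * 0.1**k * 0.9**(4 - k))  # matches f[k] above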
