Finals (MS)

The document covers fundamental concepts of probability, including definitions of events, sample spaces, and types of probability such as classical, relative frequency, and subjective approaches. It also discusses key statistical measures like expected value, variance, and standard deviation, along with various probability distributions including binomial and Poisson distributions. Additionally, it addresses forecasting methods, decision-making under uncertainty and risk, and criteria for evaluating decisions based on expected values.


UNIT 4: PROBABILITY CONCEPTS

PROBABILITY » the chance that something will happen
• Event » set of outcomes from an experiment
• Experiment » any procedure that can be infinitely repeated
• Sample Space » all the possible outcomes of an experiment
• Sample Points » each outcome in a sample space
• Mutually Exclusive Events » events that cannot occur at the same time

Three Types of Probability
1. Classical Approach
   Probability of an event = (number of outcomes favorable to the occurrence of the event) ÷ (total number of possible outcomes)
2. Relative Frequency Approach
   (a) proportion of times that an event occurs in the long run when conditions are stable; or
   (b) observed relative frequency of an event in a very large number of trials
3. Subjective Approach
   (a) based on personal belief or feelings

Probability Rules
1. One event or another will occur
2. Two or more events will all occur

P(A) + P(Aᶜ) = 1

ADDITION LAW
» a useful relationship when we have two events and are interested in knowing the probability that at least one of the events occurs
• union of events A & B » event containing all sample points belonging to A or B or both » A ∪ B
• intersection of events A & B » event containing the sample points belonging to both A & B » A ∩ B

P(A ∪ B) = P(A) + P(B) − P(A ∩ B)

ADDITION LAW FOR MUTUALLY EXCLUSIVE EVENTS
P(A ∪ B) = P(A) + P(B)

PROBABILITIES UNDER CONDITIONS OF STATISTICAL INDEPENDENCE
• Marginal Probability » simple probability of the occurrence of an event
• Joint Probability » probability that two or more independent events will occur together or in succession

P(AB) = P(A) × P(B)
Where: P(AB) = probability of events A & B occurring together or in succession
       P(A) = marginal probability of event A occurring
       P(B) = marginal probability of event B occurring

CONDITIONAL PROBABILITY
» probability of one event when another related event is known to have occurred

MULTIPLICATION LAW
» used to find the probability of an intersection of two events
» derived from the definition of conditional probability
P(A ∩ B) = P(A|B) P(B)
P(A ∩ B) = P(B|A) P(A)

SPECIAL CASE OF INDEPENDENT EVENTS
» events whose occurrence is not dependent on any other event
P(A ∩ B) = P(A) P(B)

BAYES' THEOREM
» provides a means for revising initial or prior probability estimates for specific events of interest into posterior probabilities after new information is obtained

Prior Probabilities → New Information → Application of Bayes' Theorem → Posterior Probabilities

P(A ∩ B) = P(B) P(A|B)

RANDOM VARIABLE
» numeric description of the outcome of an experiment
ex: in an experiment of selling automobiles for one day at a particular dealership, we could describe the experimental outcomes in terms of the number of cars sold; if x = number of cars sold, x is called a random variable

Classification of Random Variables
a. Discrete Random Variables » a random variable that may assume only a finite or an infinite sequence (e.g., 1, 2, 3, . . .) of values
b. Continuous Random Variables » a random variable that may assume any value in a certain interval or collection of intervals, such as weight, time, & temperature

• Expected Value » weighted average of all possible values of the random variable, where the weights are the probabilities associated with the values
E(x) = μ = Σ x f(x)
• Variance » a measure used to summarize the variability in the values of a random variable
Var(x) = σ² = Σ (x − μ)² f(x)
• Standard Deviation » the positive square root of the variance

• Binomial Probability Distribution » probability distribution associated with the random variable that represents the number of outcomes labeled success in the n trials, x = 0, 1, …, n
f(x) = [n! ÷ (x!(n − x)!)] pˣ (1 − p)ⁿ⁻ˣ
Where: n = number of trials
       p = probability of success on one trial
       x = number of successes in n trials
       f(x) = probability of x successes in n trials
→ the term n! in the preceding expression is referred to as n factorial and is defined as n! = n(n − 1)(n − 2)…(2)(1)
→ the special case of zero factorial is 0! = 1

Expected Value and Variance for the Binomial Distribution
E(x) = μ = Σ x f(x), or μ = np
for the special case of a binomial distribution, the variance of the random variable is σ² = np(1 − p)

• Poisson Probability Distribution » a discrete random variable that is often useful when we are dealing with the number of occurrences of an event over a specified interval of time or space

Applicability of the Poisson Probability Distribution
1. The probability of an occurrence of the event is the same for any two intervals of equal length
2. The occurrence or non-occurrence of the event in any interval is independent of the occurrence or non-occurrence in any other interval

f(x) = (λˣ e⁻λ) ÷ x!   for x = 0, 1, 2, …
Where: λ = mean or average number of occurrences in an interval
       e = 2.71828
       x = number of occurrences in the interval
       f(x) = probability of x occurrences in the interval

Normal Probability Distribution
» the most important probability distribution used to describe a continuous random variable
• Standard Normal Distribution » random variable that has:
→ a normal distribution
→ a mean of 0
→ a standard deviation of 1
» we use the letter z to designate this particular normal random variable
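The discrete-distribution formulas above can be sketched in code. Assuming Python (the notes themselves contain no code; the sample values n = 10, p = 0.3 are illustrative, not from the notes):

```python
from math import comb, exp, factorial

def expected_value(xs, fs):
    # E(x) = sum of x * f(x), weights are the probabilities f(x)
    return sum(x * f for x, f in zip(xs, fs))

def variance(xs, fs):
    # Var(x) = sum of (x - mu)^2 * f(x)
    mu = expected_value(xs, fs)
    return sum((x - mu) ** 2 * f for x, f in zip(xs, fs))

def binomial_pmf(x, n, p):
    # f(x) = [n! / (x!(n-x)!)] * p^x * (1-p)^(n-x)
    return comb(n, x) * p**x * (1 - p)**(n - x)

def poisson_pmf(x, lam):
    # f(x) = lambda^x * e^(-lambda) / x!
    return lam**x * exp(-lam) / factorial(x)

# Binomial example with illustrative values n = 10, p = 0.3
n, p = 10, 0.3
xs = list(range(n + 1))
fs = [binomial_pmf(x, n, p) for x in xs]
print(round(expected_value(xs, fs), 6))  # 3.0, matching mu = n*p
print(round(variance(xs, fs), 6))        # 2.1, matching sigma^2 = n*p*(1-p)
```

The long-hand expected value and variance agree with the binomial shortcuts μ = np and σ² = np(1 − p).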
[Figures: the standard normal distribution; two normal distributions with μ = 50]

• Probabilities for Any Normal Distribution
» formula used to convert any normal random variable x with mean μ and standard deviation σ to the standard normal distribution:
z = (x − μ) ÷ σ

• Exponential Probability Distribution
» a continuous probability distribution that is often useful in describing the time needed to complete a task

UNIT 5: TIME SERIES AND FORECASTING

Management decisions depend on forecasts. (ex: managers study sales forecasts to make decisions on working capital needs, the size of the workforce, inventory levels, and many other management problems.)

FORECASTING » can be classified as qualitative or quantitative
» Qualitative Methods − generally involve the use of expert judgment to develop forecasts
  − such methods are appropriate when historical data on the variable being forecast are either not applicable or unavailable
» Quantitative Forecasting Methods − used when past information about the variable being forecast is available, can be quantified, and we can assume that the pattern of the past will continue into the future

QUANTITATIVE METHODS OF FORECASTING:
a. TIME SERIES PATTERNS » a time series is a sequence of observations on a variable measured at successive points in time or over successive periods of time
● Horizontal Pattern – exists when the data fluctuate randomly around a constant mean over time
● Trend Pattern – shows gradual shifts or movements to relatively higher or lower values over a longer period
● Seasonal Pattern – short-term regular variations in data
● Cyclical Pattern – wavelike variations of more than one year's duration

SELECTING A FORECASTING METHOD
TIME SERIES PLOT » should be one of the first things developed when identifying which forecasting method to use
» if we see a specific pattern, then we need to select a method appropriate for this type of pattern
» similarly, if we observe a trend in the data, then we need to use a forecasting method that can handle trends effectively

FORECAST ACCURACY
FORECAST ERROR » key concept associated with measuring forecast accuracy
forecast error formula: eₜ = Yₜ − Fₜ
where: eₜ = Forecast Error
       Yₜ = Actual Value
       Fₜ = Forecast

Forecast Accuracy Measures
where: eₜ = Forecast Error
       Yₜ = Actual Value
       n = total number of periods
       k = number of periods for which we cannot produce a forecast

TECHNIQUES OF FORECASTING
NAIVE FORECAST » the forecast for any period equals the previous period's actual value: Fₜ₊₁ = Yₜ
AVERAGING TECHNIQUES » techniques that are capable of adapting well to changes in the level of a horizontal pattern
» however, without modification they are not appropriate when considerable trend, cyclical, or seasonal effects are present

• Moving Averages » uses the average of the most recent k data values in the time series as the forecast for the next period
Fₜ₊₁ = (Yₜ + Yₜ₋₁ + … + Yₜ₋ₖ₊₁) ÷ k
where: Fₜ₊₁ = forecast of the time series for period t + 1
       Yₜ = actual value of the time series in period t
       k = number of periods of time series data used to generate the forecast

• Weighted Moving Averages » involves selecting a different weight for each data value in the moving average and then computing a weighted average of the most recent k values as the forecast
Fₜ₊₁ = wₜYₜ + wₜ₋₁Yₜ₋₁ + … + wₜ₋ₖ₊₁Yₜ₋ₖ₊₁
where: Fₜ₊₁ = forecast of the time series for period t + 1
       Yₜ = actual value of the time series in period t
       wₜ = weight applied to the actual time series value for period t
       k = number of periods of time series data used to generate the forecast

• Exponential Smoothing » a special case of the weighted moving averages method in which we select only one weight: the weight for the most recent observation
Fₜ₊₁ = αYₜ + (1 − α)Fₜ
where: Fₜ₊₁ = forecast of the time series for period t + 1
       Yₜ = actual value of the time series in period t
       Fₜ = forecast of the time series for period t
       α = smoothing constant (0 < α < 1)

LINEAR TREND PROJECTION
REGRESSION ANALYSIS » may be used to forecast a time series with a linear trend
Simple Linear Regression – an estimate of a linear relationship between the dependent variable (usually denoted as y) and a single independent variable (usually denoted as x)
Fₜ = b₀ + b₁t
where: t = the time period
       Fₜ = linear trend forecast in period t (i.e., the estimated value of Yₜ in period t)
       b₀ = the Y-intercept of the linear trendline
       b₁ = the slope of the linear trendline

SEASONALITY
• we can obtain the quarterly forecasts for next year by simply computing the average number of items sold in each quarter
• we can model a time series with a seasonal pattern by treating the season as a categorical variable with k − 1 dummy variables (ex: if there are four seasons, we need three dummy variables)
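The averaging techniques above can be sketched in Python (the sales series, weights, and α value are hypothetical examples of my own, not from the notes):

```python
def moving_average_forecast(y, k):
    # F_{t+1} = average of the k most recent actual values
    return sum(y[-k:]) / k

def weighted_moving_average_forecast(y, weights):
    # weights ordered oldest-to-newest over the k most recent values;
    # they should sum to 1
    recent = y[-len(weights):]
    return sum(w * v for w, v in zip(weights, recent))

def exponential_smoothing(y, alpha):
    # F_{t+1} = alpha*Y_t + (1 - alpha)*F_t, seeded with F_1 = Y_1
    f = y[0]
    for actual in y[1:]:
        f = alpha * actual + (1 - alpha) * f
    return f

sales = [17, 21, 19, 23, 18, 16, 20]                    # hypothetical series
print(moving_average_forecast(sales, 3))                # (18+16+20)/3 = 18.0
print(weighted_moving_average_forecast(sales, [1/6, 2/6, 3/6]))
print(round(exponential_smoothing(sales, 0.2), 3))
```

Note how exponential smoothing only needs the most recent actual value and the previous forecast, which is why it is popular when many series must be forecast cheaply.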
UNIT 6: DECISION ANALYSIS USING PROBABILITIES

STEPS IN DECISION MAKING
1. List all the viable alternatives that must be considered in the decision
2. Identify the future events that may occur
3. Construct a payoff table » shows the payoffs (expressed in profits or any other measure of benefit appropriate to the situation) that would result from each possible combination of decision alternative and state of nature

Different Environments in Which Decisions Are Made
1. Decision making under conditions of certainty
2. Decision making under conditions of uncertainty
3. Decision making under conditions of risk

Criteria for Decision Making Under Uncertainty
1. Maximax Criterion
2. Maximin Criterion
3. Minimax Regret Criterion

Criterion of Realism (Middle-Ground Criterion)
Measure of Realism = α (maximum payoff) + (1 − α) (minimum payoff)

DECISION MAKING UNDER CONDITIONS OF RISK: DISCRETE RANDOM VARIABLES
When we make decisions under conditions of risk, we need information that will enable us to provide probabilities for the various possible states of nature.

Three Criteria for Decision Making Under Risk
1. The Expected Value Criterion (Bayes' criterion)
2. Criterion of rationality (principle of insufficient reason)
3. Criterion of maximum likelihood

• Expected Value Criterion – calculate the expected value for each decision alternative (the sum of the weighted payoffs for that alternative, where the weights are the probability values assigned by the decision maker to the states of nature that can happen)
• Expected Profit with Perfect Information – the expected profit given complete & accurate information about the future

Maximum Likelihood Criterion » selecting the state of nature that has the highest probability of occurrence
» then, having assumed that this state will occur, picking the decision alternative that will yield the highest payoff

Criterion of Rationality – used in decision-making situations that show little or no data on past demand

Minimizing Expected Losses – choose the course of action that will minimize the expected value of these losses
Two Types of Losses
1. Obsolescence Losses » caused by stocking too many units
2. Opportunity Losses » caused by being out of stock when buyers want to buy

• Use of Marginal Analysis – the marginal approach avoids excessive computational work
– when an additional unit of an item is bought, there are 2 possible outcomes: the unit will be sold or it will not be sold
p(MP) = (1 − p)(ML)
where: p = probability of selling the additional unit
       MP = marginal profit from selling the additional unit
       ML = marginal loss from not selling the additional unit

USING THE EXPECTED VALUE CRITERION WITH CONTINUOUSLY DISTRIBUTED RANDOM VARIABLES
• Supplying the Numbers – generating the values required using an intuitive approach
• Combining Experience and Numbers – combining intuition and statistical evidence
• Utility as a Decision Criterion – using utility addresses the shortcomings of expected value as a decision criterion
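The expected value criterion and the marginal-analysis break-even point can be sketched in Python (the payoff table, probabilities, and MP/ML values are hypothetical illustrations, not from the notes):

```python
# Hypothetical payoff table: rows = decision alternatives,
# columns = payoffs under each state of nature
payoffs = {
    "small stock":  [50, 50, 50],
    "medium stock": [42, 70, 70],
    "large stock":  [34, 62, 90],
}
state_probs = [0.3, 0.5, 0.2]  # probabilities of the states of nature

def expected_payoff(row, probs):
    # sum of weighted payoffs; weights = state-of-nature probabilities
    return sum(p * v for p, v in zip(probs, row))

best = max(payoffs, key=lambda d: expected_payoff(payoffs[d], state_probs))
for d, row in payoffs.items():
    print(d, expected_payoff(row, state_probs))
print("choose:", best)  # choose: medium stock

# Marginal analysis: stock one more unit while p*MP >= (1 - p)*ML,
# i.e. while the probability of selling satisfies p >= ML / (MP + ML)
MP, ML = 30, 20                # hypothetical marginal profit / marginal loss
p_star = ML / (MP + ML)        # break-even probability of selling
print("stock an extra unit only if P(sell) >=", p_star)  # 0.4
```

Solving p(MP) = (1 − p)(ML) for p gives the break-even probability ML ÷ (MP + ML), which is why the marginal approach avoids recomputing full expected values for every stocking level.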
