Finals (MS)
PROBABILITY » the chance that something will happen
• Event » set of outcomes from an experiment
• Experiment » any procedure that can be infinitely repeated
• Sample Space » all the possible outcomes of an experiment
• Sample Points » each outcome in a sample space
• Mutually Exclusive Events » events that cannot occur at the same time
Three Types of Probability
1. Classical Approach
   Probability of an event = number of outcomes favorable to the occurrence of the event ÷ total number of possible outcomes (a small sketch follows this list)
2. Relative Frequency Approach
   (a) proportion of times that an event occurs in the long run when conditions are stable; or
   (b) observed relative frequency of an event in a very large number of trials
3. Subjective Approach
   (a) based on personal belief or feelings
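For instance, a minimal Python sketch of the classical approach, assuming a fair six-sided die (the die example is illustrative and not from these notes): the event "roll an even number" has 3 favorable outcomes out of 6 possible outcomes.

from fractions import Fraction

# Classical approach: P(event) = favorable outcomes / total possible outcomes
sample_space = [1, 2, 3, 4, 5, 6]                     # all possible outcomes of one roll
favorable = [x for x in sample_space if x % 2 == 0]   # event: roll an even number

p_even = Fraction(len(favorable), len(sample_space))
print(p_even)                                          # 1/2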
Probability Rules
1. One event or another will occur
2. Two or more events will all occur
P(A) + P(Aᶜ) = 1

ADDITION LAW
» useful relationship when we have two events and are interested in knowing the probability that at least one of the events occurs
• union of events A & B » event containing all sample points belonging to A or B or both » A ∪ B
• intersection of events A and B » event containing the sample points belonging to both A & B » A ∩ B
P(A ∪ B) = P(A) + P(B) − P(A ∩ B)

ADDITION LAW FOR MUTUALLY EXCLUSIVE EVENTS
P(A ∪ B) = P(A) + P(B)
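A minimal Python sketch of the addition law; the probabilities below are assumed purely for illustration.

# Addition law with assumed illustrative probabilities (not from the notes)
p_a, p_b, p_a_and_b = 0.4, 0.3, 0.1

# P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
p_a_or_b = p_a + p_b - p_a_and_b
print(p_a_or_b)        # 0.6 (up to floating-point rounding)

# For mutually exclusive events P(A ∩ B) = 0, so P(A ∪ B) = P(A) + P(B)
print(p_a + p_b)       # 0.7 (up to floating-point rounding)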
PROBABILITIES UNDER STATISTICAL INDEPENDENCE
• Marginal Probability » simple probability of the occurrence of an event
• Joint Probability » probability that two or more independent events will occur together or in succession
P(AB) = P(A) × P(B)
Where: P(AB) = probability of events A & B occurring together or in succession
       P(A) = marginal probability of event A occurring
       P(B) = marginal probability of event B occurring

CONDITIONAL PROBABILITY
» probability of one event when another related event is known to have occurred

MULTIPLICATION LAW
» used to find the probability of an intersection of two events
» derived from the definition of conditional probability
P(A ∩ B) = P(A|B) P(B)
P(A ∩ B) = P(B|A) P(A)

SPECIAL CASE OF INDEPENDENT EVENTS
» events whose occurrence is not dependent on any other event
P(A ∩ B) = P(A) P(B)
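A minimal Python sketch of the multiplication law and the independent-events special case; all probabilities are assumed for illustration.

# General multiplication law: P(A ∩ B) = P(A|B) * P(B)
p_b = 0.5              # marginal probability of event B
p_a_given_b = 0.4      # conditional probability of A given that B has occurred
p_a_and_b = p_a_given_b * p_b
print(p_a_and_b)       # 0.2

# Special case of independent events: P(A ∩ B) = P(A) * P(B)
p_a = 0.4
print(p_a * p_b)       # 0.2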
BAYES' THEOREM
» provides a means for revising initial or prior probability estimates for specific events of interest into posterior probabilities after new information is obtained
Prior Probabilities → New Information → Application of Bayes' Theorem → Posterior Probabilities
P(A ∩ B) = P(B) P(A|B)
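A minimal Python sketch of a Bayes' theorem revision from prior to posterior probabilities; the two events A1 and A2 and every number below are assumed for illustration only.

# Prior probabilities P(Ai) and the new information P(B | Ai)
priors = {"A1": 0.6, "A2": 0.4}
likelihoods = {"A1": 0.2, "A2": 0.5}

# P(B) = sum over i of P(Ai) * P(B | Ai)
p_b = sum(priors[a] * likelihoods[a] for a in priors)

# Posterior probabilities: P(Ai | B) = P(Ai) * P(B | Ai) / P(B)
posteriors = {a: priors[a] * likelihoods[a] / p_b for a in priors}
print(posteriors)      # {'A1': 0.375, 'A2': 0.625}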
RANDOM VARIABLE
» numeric description of the outcome of an experiment
ex: in an experiment of selling automobiles for one day at a particular dealership, we could describe the experimental outcomes in terms of the number of cars sold; if x = number of cars sold, x is called a random variable

Classification of Random Variable
a. Discrete Random Variables » random variable that may assume only a finite number of values or an infinite sequence of values (e.g., 1, 2, 3, . . .)
b. Continuous Random Variables » random variable that may assume any value in a certain interval or collection of intervals, such as weight, time, & temperature
• Expected Value » weighted average of the values of the random variable, where the weights are the probabilities associated with the values
E(x) = μ = Σ x f(x)
• Variance » a measure used to summarize the variability in the values of a random variable
Var(x) = σ² = Σ (x − μ)² f(x)
• Standard Deviation » the positive square root of the variance
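A minimal Python sketch of the expected value, variance, and standard deviation of a discrete random variable; the probability table f(x) below is assumed for illustration.

# Assumed values of x and their probabilities f(x) (must sum to 1)
values = [0, 1, 2, 3]
probs = [0.1, 0.3, 0.4, 0.2]

# E(x) = μ = Σ x f(x)
mu = sum(x * p for x, p in zip(values, probs))

# Var(x) = σ² = Σ (x − μ)² f(x)
var = sum((x - mu) ** 2 * p for x, p in zip(values, probs))

# Standard deviation = positive square root of the variance
std = var ** 0.5
print(mu, var, std)    # 1.7 0.81 0.9 (up to floating-point rounding)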
• Binomial Probability Distribution » probability distribution associated with the random variable that represents the number of outcomes labeled success in the n trials
f(x) = [n! / (x!(n − x)!)] p^x (1 − p)^(n−x), for x = 0, 1, …, n
Where: n = number of trials
       p = probability of success on one trial
       x = number of successes in n trials
       f(x) = probability of x successes in n trials
→ the term n! in the preceding expression is referred to as n factorial and is defined as n! = n(n − 1)(n − 2)…(2)(1)
→ the special case of zero factorial is 0! = 1

Expected Value and Variance for the Binomial Distribution
E(x) = μ = Σ x f(x) or μ = np
for the special case of a binomial distribution, the variance of the random variable is σ² = np(1 − p)
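A minimal Python sketch of the binomial formulas, with n and p assumed for illustration.

from math import comb

# f(x) = [n! / (x!(n − x)!)] p^x (1 − p)^(n−x)
def binomial_pmf(x: int, n: int, p: float) -> float:
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

n, p = 10, 0.3
print(binomial_pmf(3, n, p))      # probability of exactly 3 successes in 10 trials
print(n * p)                      # mean μ = np = 3.0
print(n * p * (1 - p))            # variance σ² = np(1 − p) = 2.1 (up to rounding)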
• Poisson Probability Distribution » a discrete random variable that is often useful when we are dealing with the number of occurrences of an event over a specified interval of time or space

Applicability of the Poisson Probability Distribution
1. The probability of an occurrence of the event is the same for any two intervals of equal length
2. The occurrence or non-occurrence of the event in any interval is independent of the occurrence or non-occurrence in any other interval

f(x) = (λ^x e^(−λ)) / x!, for x = 0, 1, 2, …
Where: λ = mean or average number of occurrences in an interval
       e = 2.71828
       x = number of occurrences in the interval
       f(x) = probability of x occurrences in the interval
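A minimal Python sketch of the Poisson formula, with the mean number of occurrences per interval (λ) assumed for illustration.

from math import exp, factorial

# f(x) = (λ^x e^(−λ)) / x!
def poisson_pmf(x: int, lam: float) -> float:
    return lam ** x * exp(-lam) / factorial(x)

lam = 4.0                                            # assumed average occurrences per interval
print(poisson_pmf(2, lam))                           # probability of exactly 2 occurrences
print(sum(poisson_pmf(x, lam) for x in range(25)))   # ≈ 1.0 over x = 0, 1, 2, …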
Normal Probability Distribution
» the most important probability distribution used to describe a continuous random variable
• Standard Normal Distribution » random variable that has:
→ a normal distribution
→ a mean of 0
→ a standard deviation of 1
» the letter z is used to designate this particular normal random variable
[Figures: Standard Normal Distribution; Two Normal Distributions with μ = 50]
• Probabilities for Any Normal Distribution
» formula used to convert any normal random variable x with mean μ and standard deviation σ to the standard normal distribution: z = (x − μ) / σ
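A minimal Python sketch of the z conversion, with μ, σ, and x assumed for illustration; the comparison uses the standard-library statistics.NormalDist.

from statistics import NormalDist

# Convert a normal random variable x with mean μ and standard deviation σ to z
mu, sigma, x = 50.0, 10.0, 65.0
z = (x - mu) / sigma
print(z)                                 # 1.5

# The area below z on the standard normal equals the area below x on the original distribution
print(NormalDist(0, 1).cdf(z))           # ≈ 0.9332
print(NormalDist(mu, sigma).cdf(x))      # same value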
FORECAST ACCURACY
FORECAST ERROR » key concept associated with measuring forecast accuracy
forecast error formula: et = Yt − Ft
where: et = Forecast Error
       Yt = Actual Value
       Ft = Forecast
Forecast Accuracy Measures
• Exponential Smoothing » a special case of the weighted moving averages method in which we select only one weight: the weight for the most recent observation
Ft+1 = αYt + (1 − α)Ft
where: Ft+1 = forecast of the time series for period t+1
       Yt = actual value of the time series in period t
       Ft = forecast of the time series for period t
       α = smoothing constant (0 < α < 1)
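A minimal Python sketch of exponential smoothing together with the forecast errors et = Yt − Ft; the demand series, the value of α, and the starting convention F1 = Y1 are all assumed for illustration.

# Ft+1 = α Yt + (1 − α) Ft
def exponential_smoothing(y, alpha):
    forecasts = [y[0]]                   # assumed starting convention: F1 = Y1
    for t in range(1, len(y)):
        forecasts.append(alpha * y[t - 1] + (1 - alpha) * forecasts[t - 1])
    return forecasts

y = [17, 21, 19, 23, 18, 16, 20]             # assumed actual values Yt
f = exponential_smoothing(y, alpha=0.2)
errors = [yt - ft for yt, ft in zip(y, f)]   # forecast errors et = Yt − Ft
print(f)
print(errors)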