STA 411: PROBABILITY VI (3 UNITS)

Probability spaces, measures and distributions. Distribution of random variables as measurable functions. Product spaces: products of measurable spaces, product probabilities. Independence and expectation of random variables. Convergence of random variables: weak convergence, convergence almost everywhere, convergence in p-th mean. Central limit theorem, laws of large numbers. Characteristic function and inversion formula.

1.0 Probability Spaces, Measures and Distributions

1.1 PROBABILITY SPACE
A probability space consists of three components: a sample space (the set of all possible outcomes), denoted Ω; an event space, also called a σ-algebra or σ-field (a collection of subsets of the sample space), denoted F; and a probability measure (assigning probabilities to events), denoted P. In other words, a probability space is the triple (Ω, F, P). It provides a mathematical framework for modelling uncertainty and randomness in probability theory.

1.2 PROBABILITY MEASURE
A probability measure is a mathematical concept used in probability theory to describe the likelihood of events occurring within a probability space. It assigns a real number between 0 and 1 to each event, with the entire sample space receiving probability 1.

Definition: Let Ω be a sample space, and let F be a σ-algebra of subsets of Ω. A probability measure, denoted P or Pr, is a function P: F → [0, 1] with the following properties:
(i) Non-negativity: P(A) ≥ 0 for all A ∈ F.
(ii) Normalisation: P(Ω) = 1, where Ω is the entire sample space.
(iii) Countable additivity: for any countable sequence of mutually exclusive (disjoint) events {Aₙ} in F (i.e., Aᵢ ∩ Aⱼ = ∅ for i ≠ j), we have P(∪ₙ Aₙ) = Σₙ P(Aₙ).
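The three axioms can be checked numerically for a simple finite example. The sketch below is my own illustration, not part of the notes; the helper `P` and the equally-likely assumption are mine, with the fair-die measure defined on subsets of Ω:

```python
from fractions import Fraction

# Fair six-sided die: every subset of the sample space is an event.
omega = frozenset({1, 2, 3, 4, 5, 6})

def P(event):
    """P(A) = |A| / |Omega|: equally likely outcomes on a fair die."""
    return Fraction(len(event), len(omega))

A, B = {1, 2}, {5, 6}             # two disjoint events
assert P(A) >= 0                  # (i) non-negativity
assert P(omega) == 1              # (ii) P(Omega) = 1
assert P(A | B) == P(A) + P(B)    # (iii) additivity for disjoint events
print(P(A | B))  # 2/3
```

Counting outcomes and dividing by |Ω| is exactly the classical "equally likely outcomes" measure; `Fraction` keeps the arithmetic exact.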
In simpler terms, a probability measure assigns probabilities to events in a way that ensures non-negativity, gives the entire sample space probability 1 (certainty), and makes the probability of a union of disjoint events equal to the sum of their individual probabilities.

For example:

(1) Consider rolling a fair six-sided die. The probability space for this experiment can be defined as follows:
Sample space (Ω): {1, 2, 3, 4, 5, 6} (the possible outcomes when rolling the die)
Event space or σ-algebra (F): {∅, {1}, {2}, ..., {6}, {1, 2}, {1, 3}, ..., {1, 2, 3, 4, 5, 6}} (all possible subsets of the sample space)
Probability measure (P): P({i}) = 1/6 for each i in Ω (assuming a fair die)
This probability space captures the randomness of rolling a die and assigns probabilities to the different outcomes and events.

(2) Coin toss:
Sample space (Ω): {Heads, Tails}
Event space or σ-algebra (F): {∅, {Heads}, {Tails}, {Heads, Tails}}
Probability measure (P): P({Heads}) = P({Tails}) = 0.5

(3) Weather forecast:
Sample space (Ω): {Sunny, Cloudy, Rainy}
Event space or σ-algebra (F): {∅, {Sunny}, {Cloudy}, {Rainy}, {Sunny, Cloudy}, {Sunny, Rainy}, {Cloudy, Rainy}, {Sunny, Cloudy, Rainy}}
Probability measure (P): based on historical weather data or meteorological predictions.

What is a σ-algebra or σ-field?
A σ-algebra (or σ-field), denoted F, is a collection of subsets of a sample space that satisfies certain properties. It is a crucial concept in probability theory because it defines which subsets of the sample space are considered measurable, and thus are events to which probabilities can be assigned.
For a set X, a σ-algebra F is a collection of subsets of X satisfying three properties:
(i) X is in F.
(ii) If A is in F, then the complement of A, denoted Aᶜ, is also in F.
(iii) If A₁, A₂, A₃, ... is a countable sequence of sets in F, then their union is also in F. That is, if A₁, A₂, A₃, ... ∈ F, then ∪ᵢ Aᵢ ∈ F.
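Since the examples above all take the σ-algebra to be the full power set, the three closure properties can be verified directly on a small sample space. A minimal Python sketch (the `power_set` helper is a hypothetical name of mine; countable unions reduce to finite ones on a finite space):

```python
from itertools import chain, combinations

def power_set(X):
    """All subsets of X, as frozensets -- the largest sigma-algebra on X."""
    items = list(X)
    return {frozenset(c) for c in chain.from_iterable(
        combinations(items, r) for r in range(len(items) + 1))}

X = frozenset({"Heads", "Tails"})
F = power_set(X)

assert X in F                                              # (i) X is in F
assert all(frozenset(X - A) in F for A in F)               # (ii) complements
assert all(frozenset(A | B) in F for A in F for B in F)    # (iii) unions
print(len(F))  # 4
```

The four events are ∅, {Heads}, {Tails} and {Heads, Tails}, matching the coin-toss example.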
The elements of F are often referred to as measurable sets, and the σ-algebra serves as the collection of subsets that are considered "measurable" in the context of probability. Measurable spaces are fundamental in probability theory because they provide a way to define events (subsets of the sample space) for which probabilities can be assigned. Random variables, which are functions mapping outcomes to real numbers, are also defined within the framework of measurable spaces. For example, if we have a probability space (Ω, F, P), where Ω is the sample space, F is the σ-algebra of events and P is the probability measure, then (Ω, F) forms a measurable space.

PRODUCT SPACE
In probability theory, a product space is formed when considering the outcomes of multiple independent random experiments; it allows us to model the combined outcomes of these experiments.

Suppose we have two probability spaces (X, F_X, P_X) and (Y, F_Y, P_Y). The product space, denoted X × Y, is formed by taking the Cartesian product of the sample spaces:
X × Y = {(x, y) : x ∈ X, y ∈ Y}
The σ-algebra on the product space, denoted F_{X×Y}, is the product σ-algebra:
F_{X×Y} = σ(F_X × F_Y)
Here, σ(·) denotes the σ-algebra generated by a collection of sets. The elements of F_{X×Y} include sets of the form {(x, y) ∈ X × Y : x ∈ A, y ∈ B}, where A ∈ F_X and B ∈ F_Y.

If P_X and P_Y are the probability measures on (X, F_X) and (Y, F_Y) respectively, then the product measure P_{X×Y} on X × Y is defined by:
P_{X×Y}(A × B) = P_X(A) · P_Y(B) for all A ∈ F_X and B ∈ F_Y.
Product spaces are particularly useful when dealing with multiple independent random variables or experiments, providing a framework for analysing joint probabilities and outcomes.

EXAMPLE 1: Consider two six-sided dice, and let X be the outcome of the first die and Y the outcome of the second die.
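As an illustrative sketch of the product-measure definition above (the names `Px`, `Py` and `P_product` are my own, assuming a fair die paired with a fair coin rather than the notes' examples):

```python
from fractions import Fraction
from itertools import product

Sx = [1, 2, 3, 4, 5, 6]
Sy = ["H", "T"]
Px = {x: Fraction(1, 6) for x in Sx}   # fair die
Py = {y: Fraction(1, 2) for y in Sy}   # fair coin

# Cartesian product of the two sample spaces: 12 joint outcomes
XY = list(product(Sx, Sy))

def P_product(A, B):
    """P_{XxY}(A x B) = P_X(A) * P_Y(B) for a rectangle event A x B."""
    return sum(Px[a] for a in A) * sum(Py[b] for b in B)

print(len(XY))                          # 12
print(P_product({2, 4, 6}, {"H"}))      # 1/4
```

The rectangle event "{even die value} × {Heads}" gets probability (1/2)·(1/2) = 1/4, exactly as the definition P_{X×Y}(A × B) = P_X(A)·P_Y(B) prescribes.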
The sample spaces are S_X = {1, 2, 3, 4, 5, 6} and S_Y = {1, 2, 3, 4, 5, 6}. The product space S_X × S_Y is formed by taking the Cartesian product of the two sample spaces:
S_X × S_Y = {(1, 1), (1, 2), ..., (6, 6)}
which contains 36 joint outcomes. The σ-algebra on the product space, denoted F_{X×Y}, is constructed from the product σ-algebra. For simplicity, assume both dice are fair, so the product measure P_{X×Y} assigns probability 1/36 to each outcome in the product space. For example, the probability of getting a sum of 7, i.e. A = {(1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1)}, is calculated by summing the product probabilities of its six outcomes:
P_{X×Y}(A) = 6 × (1/6 · 1/6) = 6/36 = 1/6
(Note that A is not itself a rectangle A₁ × A₂, so its probability is obtained by summing over its outcomes rather than by a single product.) This illustrates the concept of a product space, where joint probabilities are determined by the probabilities of individual events in each component space.

EXAMPLE 2
Consider two fair six-sided dice rolls, and let X be the outcome of the first die and Y the outcome of the second die. The sample spaces are S_X = {1, 2, 3, 4, 5, 6} and S_Y = {1, 2, 3, 4, 5, 6}, respectively. Let A be the event of getting an even number on the first die (A = {2, 4, 6}) and B the event of getting a number greater than 4 on the second die (B = {5, 6}).
The joint probability of both events occurring is:
P_{X×Y}(A × B) = P_X(A) · P_Y(B) = (3/6) · (2/6) = 1/2 · 1/3 = 1/6
This represents the probability of rolling an even number on the first die and a number greater than 4 on the second die.

EXAMPLE 3
Consider a fair coin toss (Heads or Tails) and a fair six-sided die roll. Let X represent the outcome of the coin toss, and Y the outcome of the die roll.
Sample spaces: S_X = {Heads, Tails} and S_Y = {1, 2, 3, 4, 5, 6}
Joint event: A × B, where A is getting Heads (A = {Heads}) and B is getting an even number on the die (B = {2, 4, 6}).
Joint probability: P_{X×Y}(A × B) = P_X(A) · P_Y(B) = (1/2) · (3/6) = 1/4

2.0 RANDOM VARIABLE
A random variable is a mathematical function that assigns a numerical value to each possible outcome of a random experiment or process.
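Examples 1 and 2 can be checked by enumerating all 36 joint outcomes of the two dice; the enumeration approach and variable names below are my own sketch, not from the notes:

```python
from fractions import Fraction
from itertools import product

# Each of the 36 joint outcomes of two fair dice has probability 1/36.
outcomes = list(product(range(1, 7), repeat=2))
p = Fraction(1, 36)

# Example 1: sum of 7 (not a rectangle event, so sum over its outcomes)
A = [(x, y) for (x, y) in outcomes if x + y == 7]
assert len(A) * p == Fraction(1, 6)

# Example 2: even on the first die AND greater than 4 on the second
B = [(x, y) for (x, y) in outcomes if x % 2 == 0 and y > 4]
assert len(B) * p == Fraction(1, 6)
# ...which matches the rectangle formula Px(A) * Py(B) = (3/6) * (2/6)
assert Fraction(3, 6) * Fraction(2, 6) == Fraction(1, 6)
print(len(A), len(B))  # 6 6
```

Both events happen to contain six outcomes, so both probabilities equal 6/36 = 1/6, but only Example 2 factors as a product of marginal probabilities.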
It provides a way to quantify and analyse uncertainty in the context of probability theory and statistics. There are two types of random variables: discrete and continuous.

Discrete random variable: takes on a countable number of distinct values. Example: the number of heads in two coin tosses.

Continuous random variable: takes on any value within a certain range. Example: the height of a person.

Notation: random variables are typically denoted by uppercase letters, such as X, Y or Z. The behaviour of a random variable is described by its probability distribution, which specifies the probabilities associated with each possible value. The probability distribution can be expressed using a probability mass function (for discrete random variables) or a probability density function (for continuous random variables). For example, if X represents the outcome of rolling a fair six-sided die, then X can take the values 1, 2, 3, 4, 5 or 6, each with probability 1/6; the probability distribution of X is the set {1/6, 1/6, 1/6, 1/6, 1/6, 1/6} over these six possible outcomes.

EXPECTATION OF A RANDOM VARIABLE
The expectation (or expected value) of a random variable is a measure of the "average" or "mean" value that one would expect the variable to take over many repetitions of the random experiment. It is a fundamental concept in probability theory and statistics.
For a discrete random variable X with probability mass function P(X = xᵢ) and corresponding values xᵢ, the expectation E[X] is calculated as:
E[X] = Σᵢ xᵢ P(X = xᵢ)
For a continuous random variable Y with probability density function f_Y(y), the expectation is calculated as:
E[Y] = ∫_{-∞}^{∞} y f_Y(y) dy

Properties of Mathematical Expectations of Random Variables
The mathematical expectation (or expected value) of a random variable has several properties that are important in probability theory and statistics. Let X be a random variable, and let a and b be constants. The following properties hold for the expectation E[X]:
1.
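The discrete expectation formula can be computed directly. A small sketch for the fair-die example above (the `expectation` helper is a hypothetical name of mine):

```python
from fractions import Fraction

def expectation(pmf):
    """E[X] = sum of x * P(X = x) over the support of a discrete pmf."""
    return sum(x * p for x, p in pmf.items())

# Fair six-sided die: each value 1..6 with probability 1/6
die = {x: Fraction(1, 6) for x in range(1, 7)}
print(expectation(die))  # 7/2
```

E[X] = (1 + 2 + 3 + 4 + 5 + 6)/6 = 21/6 = 7/2 = 3.5, even though 3.5 is not itself a possible outcome of the die.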
Linearity: E[aX + b] = a · E[X] + b, where a and b are constants.

2. Expectation of a function:
For discrete X: E[g(X)] = Σᵢ g(xᵢ) P(X = xᵢ)
For continuous Y: E[g(Y)] = ∫_{-∞}^{∞} g(y) f_Y(y) dy
where g(·) is any measurable function.

3. Expectation of a constant: E[c] = c. The expectation of a constant is the constant itself.

4. Sum of random variables: E[X + Y] = E[X] + E[Y]. The expectation of the sum of two random variables is the sum of their individual expectations.

5. Expectation of a constant times a random variable: E[aX] = a · E[X]. The expectation of a constant times a random variable is the constant multiplied by the expectation of the random variable.

EXAMPLE
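Properties 1–3 above can be verified numerically for the fair-die distribution. A short sketch (the helper `E` and the choices a = 2, b = 3 are my own illustration):

```python
from fractions import Fraction

def E(pmf):
    """Expectation of a discrete random variable given as {value: probability}."""
    return sum(x * p for x, p in pmf.items())

die = {x: Fraction(1, 6) for x in range(1, 7)}

# 1. Linearity: E[aX + b] = a*E[X] + b, with a = 2, b = 3
a, b = 2, 3
shifted = {a * x + b: p for x, p in die.items()}
assert E(shifted) == a * E(die) + b

# 2. Expectation of a function: E[g(X)] with g(x) = x**2
assert sum((x ** 2) * p for x, p in die.items()) == Fraction(91, 6)

# 3. Expectation of a constant: E[c] = c
assert E({5: Fraction(1)}) == 5
print(E(die), a * E(die) + b)  # 7/2 10
```

Note that E[X²] = 91/6 ≠ (E[X])² = 49/4: expectation is linear, but it does not commute with nonlinear functions of X.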
