
Chapter 13 Mathematics - Class 12 - Formula - Sheet

The document defines key probability concepts such as events, equally likely events, mutually exclusive events, exhaustive events, and complements of events. It also discusses probability of an event, conditional probability, independent events, Bayes' theorem, probability distributions including the binomial distribution, and key properties of distributions like mean and variance. Events, probabilities, random variables, and distributions are fundamental concepts in probability theory.


Event Definition: An event, within the context of probability, is a subset of a
sample space that corresponds to a random experiment. For instance, when
tossing a coin, the occurrence of a head or a tail constitutes an event.

Equally Likely Events: Events are considered equally likely when there is no
inherent preference for the occurrence of one over the others. An example of this is
rolling an unbiased die, where each of the six faces has an equal chance of landing
face up.

Mutually Exclusive Events: Events are mutually exclusive when the occurrence of
one event excludes the possibility of the others occurring simultaneously. For
example, when a die is rolled, the six outcomes (1 through 6) are mutually
exclusive since the roll of a specific number eliminates the occurrence of the other
five.

Exhaustive Events: A collection of events is exhaustive when the performance of
an experiment guarantees the occurrence of at least one event from the set. In the
scenario of rolling two dice, the exhaustive outcomes total 36 (6^2), encompassing
all possible combinations of the two dice rolls.

Complement of an Event: The complement of an event A in a sample space S
includes all possible outcomes in S that are not part of A. This is denoted by A’ or
A¯ and mathematically represented as A’ = {n : n ∈ S, n ∉ A}.

Additional Notes:

• An experiment refers to any operation yielding well-defined outcomes.
• A random experiment is characterized by variability in outcomes, even when
the experiment is repeated under identical conditions.

Probability of an Event
If a trial results in n exhaustive, mutually exclusive and equally likely cases, and m
of them are favourable to the happening of an event A, then the probability of the
happening of A is given by
P(A) = m / n
where P(A) is the probability of event A, m is the number of favourable outcomes,
and n is the total number of exhaustive, mutually exclusive and equally likely outcomes.
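
As a quick check of this formula, here is a minimal Python sketch for one roll of a fair die; the event of interest (an even number) is chosen purely for illustration:

```python
from fractions import Fraction

# One roll of a fair die: n = 6 exhaustive, mutually exclusive,
# equally likely outcomes.
sample_space = {1, 2, 3, 4, 5, 6}

# Event A: the die shows an even number, so m = 3 favourable cases.
event_a = {2, 4, 6}

# P(A) = m / n
p_a = Fraction(len(event_a), len(sample_space))
print(p_a)  # 1/2
```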

Note:
(i) 0 ≤ P(A) ≤ 1
(ii) The probability of an impossible event is 0.
(iii) The probability of a certain (sure) event is 1.
(iv) P(A ∪ A’) = P(S) = 1
(v) P(A ∩ A’) = P(Φ) = 0
(vi) P((A’)’) = P(A)
(vii) P(A ∪ B) = P(A) + P(B) – P(A ∩ B)
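
A small sketch that verifies property (vii), the addition rule, by direct counting on one die roll; the two events are illustrative choices:

```python
from fractions import Fraction

outcomes = set(range(1, 7))   # sample space of one fair die roll
a = {2, 4, 6}                 # A: an even number shows up
b = {4, 5, 6}                 # B: a number greater than 3 shows up

def p(event):
    # P(E) = favourable cases / total cases
    return Fraction(len(event), len(outcomes))

# (vii) P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
print(p(a | b) == p(a) + p(b) - p(a & b))  # True (both sides equal 2/3)
```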

Conditional Probability: Let E and F be two events associated with the
same sample space of a random experiment. Then the probability of occurrence
of event E, given that event F has already occurred, is called the conditional
probability of E given F and is denoted by P(E/F). It is given by
P(E/F) = P(E ∩ F) / P(F), provided P(F) ≠ 0

Properties of Conditional Probability: If E and F are two events of a sample space
S and G is an event of S which has already occurred such that P(G) ≠ 0, then
(i) P[(E ∪ F)/G] = P(E/G) + P(F/G) – P[(E ∩ F)/G]
(ii) P[(E ∪ F)/G] = P(E/G) + P(F/G), if E and F are disjoint events.
(iii) P(E’/G) = 1 – P(E/G)
(iv) P(S/E) = P(E/E) = 1

Multiplication Theorem: If E and F are two events associated with a sample space
S, then the probability of simultaneous occurrence of the events E and F is
P(E ∩ F) = P(E) . P(F/E), where P(E) ≠ 0
or
P(E ∩ F) = P(F) . P(E/F), where P(F) ≠ 0
This result is known as the multiplication rule of probability.
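
A minimal sketch of the multiplication rule on the familiar example of drawing two cards without replacement (the setup is illustrative, not from the sheet):

```python
from fractions import Fraction

# Draw two cards from a 52-card deck without replacement.
# E: first card is a king; F: second card is a king.
p_e = Fraction(4, 52)           # P(E): 4 kings among 52 cards
p_f_given_e = Fraction(3, 51)   # P(F/E): 3 kings left among 51 cards

# P(E ∩ F) = P(E) . P(F/E)
p_both = p_e * p_f_given_e
print(p_both)  # 1/221
```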

Independent Events: Two events E and F are said to be independent if the
probability of occurrence or non-occurrence of one of the events is not affected
by that of the other. For any two independent events E and F, we have the relations
(i) P(E ∩ F) = P(E) . P(F)
(ii) P(E/F) = P(E), P(F) ≠ 0
(iii) P(F/E) = P(F), P(E) ≠ 0
Also, their complements are independent events:
P(E’ ∩ F’) = P(E’) . P(F’)

Note: If E and F are dependent events, then P(E ∩ F) ≠ P(E) . P(F).
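
Relation (i) can be checked exhaustively for small sample spaces; this sketch does so for two illustrative events on two rolls of a fair die:

```python
from fractions import Fraction
from itertools import product

# Two rolls of a fair die: 36 equally likely outcomes.
outcomes = list(product(range(1, 7), repeat=2))

e = {o for o in outcomes if o[0] % 2 == 0}  # E: first roll is even
f = {o for o in outcomes if o[1] == 5}      # F: second roll is 5

def p(event):
    return Fraction(len(event), len(outcomes))

# Independence: P(E ∩ F) = P(E) . P(F)
print(p(e & f) == p(e) * p(f))  # True (1/12 on both sides)
```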

Three events E, F and G are said to be mutually independent if
(i) P(E ∩ F) = P(E) . P(F)
(ii) P(F ∩ G) = P(F) . P(G)
(iii) P(E ∩ G) = P(E) . P(G)
(iv) P(E ∩ F ∩ G) = P(E) . P(F) . P(G)
If at least one of the above does not hold for three given events, then we say that
the events are not mutually independent.
Note: Independent and mutually exclusive events do not have the same
meaning.
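
Condition (iv) is genuinely stronger than (i)–(iii): events can be pairwise independent without being mutually independent. A minimal sketch of the classic two-coin example (the events are illustrative):

```python
from fractions import Fraction
from itertools import product

outcomes = list(product("HT", repeat=2))   # two tosses of a fair coin

e = {o for o in outcomes if o[0] == "H"}   # E: first toss is heads
f = {o for o in outcomes if o[1] == "H"}   # F: second toss is heads
g = {o for o in outcomes if o[0] == o[1]}  # G: both tosses match

def p(event):
    return Fraction(len(event), len(outcomes))

# Pairwise conditions (i)-(iii) all hold ...
print(p(e & f) == p(e) * p(f))  # True
print(p(f & g) == p(f) * p(g))  # True
print(p(e & g) == p(e) * p(g))  # True
# ... but condition (iv) fails: P(E ∩ F ∩ G) = 1/4, not 1/8.
print(p(e & f & g) == p(e) * p(f) * p(g))  # False
```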

Bayes’ Theorem and Probability Distributions

Partition of a Sample Space: A set of events E1, E2, …, En is said to represent a partition
of the sample space S if it satisfies the following conditions:
(i) Ei ∩ Ej = Φ; i ≠ j; i, j = 1, 2, …, n
(ii) E1 ∪ E2 ∪ … ∪ En = S
(iii) P(Ei) > 0, ∀ i = 1, 2, …, n
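
Theorem of Total Probability: If E1, E2, …, En form a partition of the sample space S and A is any event associated with S, then
P(A) = P(E1) . P(A/E1) + P(E2) . P(A/E2) + … + P(En) . P(A/En)

Bayes’ Theorem: With the same partition and any event A such that P(A) ≠ 0, for each i = 1, 2, …, n,
P(Ei/A) = [P(Ei) . P(A/Ei)] / [P(E1) . P(A/E1) + P(E2) . P(A/E2) + … + P(En) . P(A/En)]

A minimal Python sketch of both results on an illustrative two-bag problem (all numbers invented for demonstration):

```python
from fractions import Fraction

# Partition: E1 = bag I is chosen, E2 = bag II is chosen (equally likely).
# Event A: a red ball is drawn. Bag I has 3 red of 7 balls, bag II 5 red of 11.
p_e = [Fraction(1, 2), Fraction(1, 2)]           # P(Ei)
p_a_given_e = [Fraction(3, 7), Fraction(5, 11)]  # P(A/Ei)

# Theorem of total probability: P(A) = sum of P(Ei) . P(A/Ei)
p_a = sum(pe * pa for pe, pa in zip(p_e, p_a_given_e))

# Bayes' theorem: P(E1/A) = P(E1) . P(A/E1) / P(A)
p_e1_given_a = p_e[0] * p_a_given_e[0] / p_a
print(p_a)           # 34/77
print(p_e1_given_a)  # 33/68
```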

Probability Distributions:
A probability distribution is a framework that outlines the possible values of a
random variable and their associated probabilities. It serves as a comprehensive
representation of all outcomes of a random variable and the likelihood of each
outcome occurring.

Random Variable (X): A random variable is a numerical description of the outcome
of a random experiment. It can assume multiple values, represented as x1, x2, …,
xn, each corresponding to a specific outcome of the experiment.
Associated Probabilities: Alongside each possible value of the random variable,
there is a probability, denoted as p1, p2, …, pn. These probabilities quantify the
chance of each value occurring and satisfy two key conditions:
Each individual probability is between 0 and 1, inclusive (0 ≤ pi ≤ 1 for all i).
The sum of all probabilities equals 1 (∑pi = 1).
Probability Distribution Table:
A probability distribution table is a structured way of listing all possible values of
the random variable along with their corresponding probabilities. It is typically set
up as follows:

Random Variable (X)    Probability (P)
x1                     p1
x2                     p2
...                    ...
xn                     pn
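
A short sketch that encodes such a table and checks the two conditions above; the distribution itself is an illustrative choice:

```python
from fractions import Fraction

# Illustrative table: X = number of heads in three fair coin tosses.
distribution = {0: Fraction(1, 8), 1: Fraction(3, 8),
                2: Fraction(3, 8), 3: Fraction(1, 8)}

# Condition 1: each probability lies in [0, 1].
assert all(0 <= p <= 1 for p in distribution.values())
# Condition 2: the probabilities sum to 1.
assert sum(distribution.values()) == 1
print("valid probability distribution")
```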

Mean and Variance of a Probability Distribution

The mean (or expected value) and variance of a probability distribution are
two fundamental statistical measures that provide significant insights into
the distribution's characteristics. Here’s an overview of how each is defined
and calculated:

Mean (μ) = (x1 * p1) + (x2 * p2) + ... + (xn * pn) = Σ xi pi

where the sum is taken over all possible values of X.

Variance of a Probability Distribution

The variance of a probability distribution, denoted by σ^2, measures the
spread or dispersion of the distribution around the mean. It is a quantitative
expression of how much the values of the random variable vary from the
mean (expected value).

The variance is calculated as the sum of the squared differences between
each possible value of the random variable and the mean, each multiplied
by its corresponding probability:

Variance (σ^2) = [(x1 – μ)^2 * p1] + [(x2 – μ)^2 * p2] + ... + [(xn – μ)^2 * pn]

where μ is the mean of the distribution, and the sum is taken over all
possible values of X.
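
A minimal sketch computing both measures for the same illustrative table used above:

```python
from fractions import Fraction

# X = number of heads in three fair coin tosses (illustrative).
distribution = {0: Fraction(1, 8), 1: Fraction(3, 8),
                2: Fraction(3, 8), 3: Fraction(1, 8)}

# Mean: μ = Σ xi * pi
mean = sum(x * p for x, p in distribution.items())

# Variance: σ^2 = Σ (xi - μ)^2 * pi
variance = sum((x - mean) ** 2 * p for x, p in distribution.items())

print(mean)      # 3/2
print(variance)  # 3/4
```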

Simplified Explanation

Mean: The weighted average of all possible outcomes, where each outcome
is weighted by its probability. It gives an idea of the central tendency or
expected result of the probability distribution.

Variance: An indicator of how spread out or concentrated the values are
around the mean. A higher variance means the values are spread out over a
wider range, while a lower variance indicates they are more closely bunched
together around the mean.

Together, the mean and variance offer a comprehensive snapshot of a
probability distribution's behavior, highlighting both the central tendency
and the variability of the random variable's outcomes.

Mean of a Probability Distribution

The mean of a probability distribution, often denoted by μ, represents the
average or expected value of the random variable. It is calculated as the sum
of each possible value of the random variable multiplied by its
corresponding probability.
If X is a random variable with possible values x1, x2, …, xn, and these
values have probabilities p1, p2, …, pn respectively, then the mean (μ) is
given by:

μ = (x1 * p1) + (x2 * p2) + ... + (xn * pn)

Bernoulli Trials: Trials of a random experiment are called Bernoulli trials if
they satisfy the following conditions:
(i) There should be a finite number of trials.
(ii) The trials should be independent.
(iii) Each trial has exactly two outcomes: success or failure.
(iv) The probability of success remains the same in each trial.
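
A minimal simulation sketch that makes the definition concrete; the parameters n = 10 and p = 0.5 are illustrative:

```python
import random

def bernoulli_trials(n: int, p: float) -> list[int]:
    # Each trial is independent, has exactly two outcomes
    # (1 = success, 0 = failure), and succeeds with the same probability p.
    return [1 if random.random() < p else 0 for _ in range(n)]

trials = bernoulli_trials(10, 0.5)
print(trials, "successes:", sum(trials))
```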

Binomial Distribution: The probability distribution of the number of
successes in an experiment consisting of n Bernoulli trials, obtained from the
binomial expansion of (p + q)^n, is called the binomial distribution.
Let X be the random variable denoting the number of successes, so X can take
the values 0, 1, 2, …, n. Then, by the binomial distribution, we have
P(X = r) = nCr p^r q^(n-r)
where
n = total number of trials in the experiment
p = probability of success in one trial
q = probability of failure in one trial
r = number of successes in the experiment
Also, p + q = 1.
The binomial distribution of the number of successes X can be represented as

X      P(X)
0      nC0 p^0 q^n
1      nC1 p^1 q^(n-1)
...    ...
n      nCn p^n q^0

Mean and Variance of the Binomial Distribution
(i) Mean (μ) = Σ xi pi = np
(ii) Variance (σ^2) = Σ xi^2 pi – μ^2 = npq
(iii) Standard deviation (σ) = √Variance = √(npq)
Note: Mean > Variance, since npq = np . q and q < 1.
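
A short sketch that builds the binomial table for illustrative parameters and confirms the mean and variance formulas exactly:

```python
from fractions import Fraction
from math import comb

# Illustrative parameters: n = 4 trials, success probability p = 1/3.
n, p = 4, Fraction(1, 3)
q = 1 - p

# P(X = r) = nCr * p^r * q^(n - r), for r = 0, 1, ..., n
pmf = {r: comb(n, r) * p**r * q**(n - r) for r in range(n + 1)}

mean = sum(r * pr for r, pr in pmf.items())                   # Σ xi pi
variance = sum(r**2 * pr for r, pr in pmf.items()) - mean**2  # Σ xi^2 pi - μ^2

print(mean == n * p)          # True: mean = np = 4/3
print(variance == n * p * q)  # True: variance = npq = 8/9
```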
