Maths Project For 12th A

OBJECTIVE

The objective of this project is to explore the fundamental concepts of probability and its real-world applications. It aims to provide a clear understanding of topics such as conditional probability, Bayes' theorem, and probability distributions. Through theoretical explanations, practical examples, and visual aids, this project seeks to enhance analytical skills and demonstrate how probability helps in making informed decisions and assessing risks in various fields.

INTRODUCTION

Probability means possibility. It is a branch of mathematics that deals with the occurrence of a random event, with values expressed on a scale from zero to one. Probability was introduced in mathematics to predict how likely events are to happen.

Probability is a measure of the likelihood that an event will occur. Many events cannot be predicted with total certainty; we can only predict the chance of an event occurring, i.e. how likely it is to happen. Probability ranges from 0 to 1, where 0 means the event is impossible and 1 means it is certain. The probabilities of all the events in a sample space add up to 1.

For example, when we toss a coin, there are only two possible outcomes: Head or Tail (H, T). But if we toss two coins, there are four equally likely ordered outcomes, (H, H), (H, T), (T, H) and (T, T), which give three distinct combinations: both heads, both tails, or one head and one tail.

Probability theory is widely used in areas of study such as statistics, finance, gambling, artificial intelligence, machine learning, computer science, game theory, and philosophy.

Historical Background of Probability Theory
Probability theory has a rich history that dates back centuries, evolving from simple
games of chance to a robust field of mathematics that influences a wide range of
disciplines today.
Early Beginnings
The earliest recorded ideas related to probability can be traced back to ancient
civilizations such as the Greeks and Romans, who engaged in games of chance using
dice and coins. However, these early societies did not develop a formal mathematical
approach to understanding probability; instead, they relied on superstitions and
empirical observations.
The Birth of Probability Theory
The formal study of probability began in the 17th century with a famous
correspondence between two French mathematicians, Blaise Pascal and Pierre de
Fermat. They were approached by a gambler named Chevalier de Méré, who sought
help in understanding and solving problems related to gambling. This collaboration,
known as the "Problem of Points," laid the groundwork for the development of
probability theory. The key problem was determining the fair division of stakes in an
interrupted game of chance.
Pascal and Fermat exchanged letters in 1654, discussing various problems related to
dice and games of chance. Their analysis led to the foundational concepts of expected
value and probability calculation. This work is often regarded as the starting point of
modern probability theory.
Development in the 18th and 19th Centuries
Following the work of Pascal and Fermat, the Swiss mathematician Jacob Bernoulli
made significant contributions with his book "Ars Conjectandi" (The Art of
Conjecture), published posthumously in 1713. In this work, Bernoulli introduced the
Law of Large Numbers, which states that as the number of trials increases, the
experimental probability converges to the theoretical probability.
The 18th and early 19th centuries also saw contributions from the French mathematician Pierre-Simon Laplace, who advanced the field with his work "Théorie Analytique des Probabilités" (Analytical Theory of Probabilities) in 1812. Laplace developed the concept behind Bayes' theorem and the principle of inverse probability, which laid the groundwork for statistical inference.
Modern Probability Theory
In the early 20th century, the Russian mathematician Andrey Kolmogorov formalized
probability theory using a rigorous mathematical framework. In 1933, he published
"Foundations of the Theory of Probability", where he introduced the axiomatic

approach to probability. Kolmogorov's work provided a solid foundation by defining
probability as a measure on a sample space and introducing key axioms that still serve
as the basis of probability theory today.

Kolmogorov's axioms are:


1. Non-negativity: The probability of any event is a non-negative number.
2. Normalization: The probability of the entire sample space is 1.
3. Additivity: For any mutually exclusive events, the probability of their union is the
sum of their probabilities.

This axiomatic approach unified various probabilistic concepts and paved the way for
further development in the fields of statistics, stochastic processes, and mathematical
analysis.

Impact on Other Fields


Today, probability theory is a cornerstone of many disciplines, including statistics,
finance, machine learning, artificial intelligence, and game theory. The pioneering
work of mathematicians like Pascal, Fermat, Bernoulli, Laplace, and Kolmogorov has
enabled us to understand and quantify uncertainty, making probability an essential tool
for decision-making in both academic research and real-world applications.

This historical journey of probability, from a gambler's curiosity to a rigorously formalized field of study, highlights its profound impact on the way we model and interpret random events in everyday life.

PROBABILITY OF AN EVENT
Assume an event 'E' can occur in 'r' ways out of a total of 'n' equally likely possible ways. Then the probability of the event happening, or its success, is expressed as

P(E) = r/n

The probability that the event will not occur, known as its failure, is expressed as

P(E') = (n − r)/n = 1 − (r/n)

Here E' represents the event not occurring. Therefore, we can say

P(E) + P(E') = 1

This means that the total of all the probabilities in any random test or experiment is equal to 1.
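The formulas P(E) = r/n and P(E') = 1 − P(E) can be checked with a minimal sketch, using exact fractions to avoid rounding; the die example and the helper name `probability` are illustrative choices, not from the text above:

```python
from fractions import Fraction

def probability(favourable, total):
    """P(E) = r/n for r favourable outcomes out of n equally likely ones."""
    return Fraction(favourable, total)

# Rolling a fair die: the event "roll an even number" has r = 3, n = 6.
p_even = probability(3, 6)
p_not_even = 1 - p_even          # P(E') = 1 - P(E)
print(p_even, p_not_even)        # 1/2 1/2
assert p_even + p_not_even == 1  # the probabilities add up to 1
```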

What are Equally Likely Events?


When events have the same theoretical probability of happening, they are called equally likely events. The outcomes of a sample space are called equally likely if all of them have the same probability of occurring. For example, if you throw a fair die, the probability of getting any particular number, such as 1, is 1/6.

Complementary Events
Complementary events arise when there are only two possibilities: an event will either occur or not occur, like a person either coming or not coming to your house. The complement of an event E, written E', is the event that E does not occur, and P(E') = 1 − P(E).

CONDITIONAL PROBABILITY

The conditional probability of ‘A’ given ‘B’ is the probability that event ‘A’ has
occurred in a trial of a random experiment for which it is known that event ‘B’ has
definitely occurred.
It may be computed by means of the following formula:

P(A|B) = P(A ∩ B)/P(B)

Suppose a fair die has been rolled and you are asked to give the probability that it was
a five. There are six equally likely outcomes, so your answer is 1/6. But suppose that
before you give your answer you are given the extra information that the number rolled
was odd. Since there are only three odd numbers that are possible, one of which is five,
you would certainly revise your estimate of the likelihood that a five was rolled from
1/6 to 1/3.
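The die example above can be checked by counting equally likely outcomes directly; this is a minimal sketch, and the set names `A` and `B` simply label the events described in the text:

```python
from fractions import Fraction

outcomes = range(1, 7)                   # sample space of a fair die
B = {o for o in outcomes if o % 2 == 1}  # event B: the roll is odd
A = {5}                                  # event A: the roll is a five

# P(A|B) = P(A ∩ B) / P(B), computed by counting equally likely outcomes
p_B = Fraction(len(B), 6)
p_A_and_B = Fraction(len(A & B), 6)
p_A_given_B = p_A_and_B / p_B
print(p_A_given_B)  # 1/3
```

The extra information "the roll was odd" shrinks the sample space from six outcomes to three, which is exactly why the answer moves from 1/6 to 1/3.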

INDEPENDENT EVENT
In probability, two events are independent if the incidence of one event does not affect the probability of the other event. If the incidence of one event does affect the probability of the other event, then the events are dependent.
There is a red 6-sided fair die and a blue 6-sided fair die. Both dice are rolled at the
same time. Let A be the event that the red die's result is even. Let B be the event that
the blue die's result is odd. The outcome of the red die has no impact on the outcome
of the blue die. Likewise, the outcome of the blue die does not affect the outcome of the
red die.
P(A)= 1/2 regardless of whether B happens or not. P(B)=1/2 regardless
of whether A happens or not. Therefore, the events are independent.
There are 3 green marbles and 5 blue marbles in a bag. Two marbles are drawn from the bag at random, without replacement. Let G be the event that the first marble drawn is green. Let B be the event that the second marble drawn is blue.

Case 1: G happens
When the first marble drawn is green, there are 7 marbles left in the bag, and 5 of
them are blue. In this case, P(B)=5/7

Case 2: G does not happen


When the first marble drawn is blue, there are 7 marbles left in the bag, and 4 of them
are blue. In this case, P(B)= 4/7
The incidence of G affects the probability of B. Therefore, these events are not
independent. In other words, they are dependent.
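The two cases above can be verified by enumerating every equally likely ordered draw of two marbles; this is a sketch, and the helper `prob` is an illustrative name, not part of the text:

```python
from fractions import Fraction
from itertools import permutations

bag = ["G"] * 3 + ["B"] * 5          # 3 green marbles, 5 blue marbles

# All equally likely ordered draws of two distinct marbles (no replacement)
draws = list(permutations(range(8), 2))

def prob(event):
    """Probability of an event as (favourable draws) / (total draws)."""
    favourable = sum(1 for d in draws if event(d))
    return Fraction(favourable, len(draws))

# Case 1: P(B | G) — second marble blue given the first was green
p_B2_given_G1 = prob(lambda d: bag[d[0]] == "G" and bag[d[1]] == "B") / prob(lambda d: bag[d[0]] == "G")
# Case 2: P(B | not G) — second marble blue given the first was blue
p_B2_given_not_G1 = prob(lambda d: bag[d[0]] == "B" and bag[d[1]] == "B") / prob(lambda d: bag[d[0]] == "B")
print(p_B2_given_G1, p_B2_given_not_G1)  # 5/7 4/7
```

Since the two conditional probabilities differ (5/7 versus 4/7), the enumeration confirms that G and B are dependent.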

BAYES' THEOREM

Bayes' Theorem Statement:

Let E1, E2, …, En be a set of events associated with a sample space S, where all the events E1, E2, …, En have nonzero probability of occurrence and they form a partition of S. Let A be any event associated with S. Then, according to Bayes' theorem, for any i = 1, 2, 3, …, n,

P(Ei|A) = P(Ei) P(A|Ei) / Σ P(Ek) P(A|Ek), where the sum in the denominator runs over k = 1, 2, …, n.

Bayes' Theorem Proof:

By the conditional probability formula,

P(Ei|A) = P(Ei ∩ A)/P(A) ⋯⋯ (1)

Using the multiplication rule of probability,

P(Ei ∩ A) = P(Ei) P(A|Ei) ⋯⋯ (2)

Using the total probability theorem,

P(A) = Σ P(Ek) P(A|Ek), summed over k = 1, 2, …, n ⋯⋯ (3)

Putting the values from equations (2) and (3) into equation (1), we get

P(Ei|A) = P(Ei) P(A|Ei) / Σ P(Ek) P(A|Ek)

This diagram illustrates the concept of conditional probability P(A|B), showing the relationship between two events A and B with overlapping areas.

This visual representation helps explain the Bayes' theorem formula, depicting how P(A|B) is calculated using P(B|A), P(A) and P(B).

RANDOM VARIABLES AND PROBABILITY DISTRIBUTION

A random variable is a real-valued function whose domain is the sample space of a random experiment.

The probability distribution of a random variable describes how probabilities are distributed over the values of the random variable; it provides the probability for each value. The probability distribution of a random variable X is the system of numbers

X:    x1   x2   …   xn
P(X): p1   p2   …   pn

where pi > 0 for i = 1, 2, …, n and p1 + p2 + ⋯ + pn = 1.

Let X be a random variable whose possible values x1, x2, …, xn occur with probabilities p1, p2, …, pn respectively. The mean (or expectation) of X, denoted by μ or E(X), is

μ = E(X) = x1p1 + x2p2 + ⋯ + xnpn

The variance of a discrete random variable X measures the spread, or variability, of the distribution, and is defined by

Var(X) = Σ (xi − μ)² pi = E(X²) − [E(X)]²
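The mean and variance of a discrete distribution can be computed directly from their definitions; the distribution below (number of heads in two fair coin tosses) is a hypothetical worked example:

```python
from fractions import Fraction

# Hypothetical distribution: X = number of heads in two fair coin tosses
values = [0, 1, 2]
probs = [Fraction(1, 4), Fraction(1, 2), Fraction(1, 4)]
assert sum(probs) == 1  # a valid probability distribution

mean = sum(x * p for x, p in zip(values, probs))                    # E(X) = Σ xi pi
variance = sum((x - mean) ** 2 * p for x, p in zip(values, probs))  # Var(X) = Σ (xi − μ)² pi
print(mean, variance)  # 1 1/2
```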

BERNOULLI TRIALS

A random experiment whose outcomes are only of two types, say success S and failure F, is a Bernoulli trial. The probability of success is taken as p while that of failure is q = 1 − p. A random variable X has a Bernoulli distribution with probability p if its probability distribution is

P(X = x) = p^x (1 − p)^(1 − x) for x = 0, 1, and P(X = x) = 0 for other values of x.

Here, 0 represents failure and 1 represents success.

Conditions for Bernoulli Trials:


1. A finite number of trials.
2. Each trial should have exactly two outcomes: success or failure.
3. Trials should be independent.
4. The probability of success or failure should be the same in each trial.
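The Bernoulli probability distribution above can be sketched as a short function; the value p = 3/10 is an arbitrary illustrative choice, and exact fractions are used to keep the arithmetic precise:

```python
from fractions import Fraction

def bernoulli_pmf(x, p):
    """P(X = x) = p^x (1 - p)^(1 - x) for x in {0, 1}, and 0 otherwise."""
    if x in (0, 1):
        return p ** x * (1 - p) ** (1 - x)
    return Fraction(0)

p = Fraction(3, 10)         # success probability of a single trial
print(bernoulli_pmf(1, p))  # 3/10 — success (x = 1)
print(bernoulli_pmf(0, p))  # 7/10 — failure (x = 0)
```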

BINOMIAL DISTRIBUTION

Suppose a random experiment with exactly two outcomes is repeated n times independently. The probability of success is p and that of failure is q = 1 − p. Assume that out of these n trials, we get success x times and failure the remaining n − x times. The number of ways in which the x successes can occur is nCx. A random variable X has a binomial distribution if

P(X = x) = nCx p^x q^(n−x)

for x = 0, 1, …, n, and P(X = x) = 0 otherwise. Any such random variable X is a binomial variate. A binomial experiment is a set of n independent Bernoulli trials.

Conditions for Binomial Distribution:


1. Each trial results in only two outcomes, i.e., success and failure.
2. The number of trials 'n' is finite.
3. The trials are independent of each other.
4. The probability of success, p, or that of failure, q, is constant for each trial.
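The binomial formula P(X = x) = nCx p^x q^(n−x) can be sketched directly; the fair-coin example with n = 4 tosses is a hypothetical illustration:

```python
from fractions import Fraction
from math import comb

def binomial_pmf(x, n, p):
    """P(X = x) = nCx * p^x * q^(n - x), with q = 1 - p."""
    if 0 <= x <= n:
        return comb(n, x) * p ** x * (1 - p) ** (n - x)
    return Fraction(0)

# Hypothetical example: exactly 2 heads in 4 tosses of a fair coin
p = Fraction(1, 2)
print(binomial_pmf(2, 4, p))                         # 3/8
print(sum(binomial_pmf(x, 4, p) for x in range(5)))  # 1 — the pmf sums to 1
```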

Applications of Probability

Some applications of probability involve predicting the outcome when you:

 Flip a coin.
 Choose a card from a deck.
 Throw a die.
 Pull a green candy from a bag of mixed candies.
 Win a lottery, with odds of roughly 1 in many millions.

Examples of Real-Life probability

Weather Planning:
A probability forecast is an assessment of how likely an event is to occur, expressed as a percentage, and it records the risks associated with the weather. Meteorologists around the world use different instruments and tools to predict weather changes. They collect weather data from around the world to estimate temperature changes and probable weather conditions for a particular hour, day, week, and month.

If there is a 40% chance of rain, it means that under similar conditions it has rained on about 40 out of 100 such days.

Sports Strategies:
In sports, analyses are conducted with the help of probability to understand the strengths and weaknesses of a particular team or player. Analysts use probability and odds to predict outcomes regarding the team's performance and its members.
Coaches use probability as a tool to determine in which areas their team is strong and in which areas they have to work to attain victory. Trainers even use probability to gauge the capacity of a particular player and decide when to let him play and against whom.
For example, a cricket coach evaluates a player's batting and bowling capability by taking his average performance in previous matches before placing him in the line-up.
Insurance:
Insurance companies use the theory of probability, or theoretical probability, for framing a policy and computing premium rates. The theory of probability is a statistical method used to predict the likelihood of future outcomes.
Issuing health insurance to a person who drinks heavily is likely to be more expensive than insurance issued to a healthy person. Statistical analysis shows high health risks for a regular heavy drinker, so insuring them is a greater financial risk given the higher probability of serious illness and hence of a claim being filed.
In Games:
Blackjack, poker, gambling, sports, board games, and video games all use probability to estimate how likely a team or person is to win.

CONCLUSION
Probability plays a pivotal role in understanding and managing uncertainty in
our everyday lives. It equips us with the tools to analyze random events, make
predictions, and make data-driven decisions. The insights gained from studying
probability enhance our ability to evaluate risks, optimize strategies, and
improve outcomes in a wide range of real-world applications. By mastering
these concepts, we can navigate complex situations with greater confidence and
accuracy.

This comprehensive exploration of probability has not only enhanced our mathematical skills but also provided valuable perspectives on its real-world implications, proving its importance as a fundamental tool in modern problem-solving and decision-making.

The application of probability transcends the boundaries of mathematics and statistics, reaching into various domains like engineering, economics, computer science, and the natural sciences. This interdisciplinary nature highlights the versatility of probability as a tool for solving complex problems.
