
Probability

Probability is the branch of mathematics concerning numerical descriptions of how
likely an event is to occur, or how likely it is
that a proposition is true. The probability
of an event is a number between 0 and 1,
where, roughly speaking, 0 indicates
impossibility of the event and 1 indicates
certainty.[note 1][1][2] The higher the
probability of an event, the more likely it is
that the event will occur. A simple example
is the tossing of a fair (unbiased) coin.
Since the coin is fair, the two outcomes
("heads" and "tails") are both equally
probable; the probability of "heads" equals
the probability of "tails"; and since no other
outcomes are possible, the probability of
either "heads" or "tails" is 1/2 (which could
also be written as 0.5 or 50%).

[Figure: The probabilities of rolling several numbers using two dice.]
These concepts have been given an
axiomatic mathematical formalization in
probability theory, which is used widely in
areas of study such as statistics,
mathematics, science, finance, gambling,
artificial intelligence, machine learning,
computer science, game theory, and
philosophy to, for example, draw
inferences about the expected frequency
of events. Probability theory is also used
to describe the underlying mechanics and
regularities of complex systems.[3]

Interpretations
When dealing with experiments that are
random and well-defined in a purely
theoretical setting (like tossing a fair coin),
probabilities can be numerically described
by the number of desired outcomes,
divided by the total number of all
outcomes. For example, tossing a fair coin
twice will yield "head-head", "head-tail",
"tail-head", and "tail-tail" outcomes. The
probability of getting an outcome of "head-
head" is 1 out of 4 outcomes, or, in
numerical terms, 1/4, 0.25 or 25%.
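To make the arithmetic concrete, here is a minimal Python sketch (standard library only; the names are illustrative, not drawn from any source) that enumerates the four equally likely outcomes of two tosses and, anticipating the frequentist view discussed below, also estimates the same probability as a long-run relative frequency:

```python
import itertools
import random

# Classical view: enumerate all equally likely outcomes of two coin tosses.
outcomes = list(itertools.product(["H", "T"], repeat=2))  # HH, HT, TH, TT
p_classical = sum(1 for o in outcomes if o == ("H", "H")) / len(outcomes)
print(p_classical)  # 0.25

# Frequentist view: the relative frequency of "head-head" over many repetitions.
trials = 100_000
hits = sum(
    1 for _ in range(trials)
    if random.choice("HT") == "H" and random.choice("HT") == "H"
)
print(hits / trials)  # approaches 0.25 as trials grows
```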
However, when it comes to practical
application, there are two major competing
categories of probability interpretations,
whose adherents hold different views
about the fundamental nature of
probability:

1. Objectivists assign numbers to describe some objective or physical
state of affairs. The most popular
version of objective probability is
frequentist probability, which claims
that the probability of a random event
denotes the relative frequency of
occurrence of an experiment's
outcome, when the experiment is
repeated indefinitely. This
interpretation considers probability to
be the relative frequency "in the long
run" of outcomes.[4] A modification of
this is propensity probability, which
interprets probability as the tendency
of some experiment to yield a certain
outcome, even if it is performed only
once.
2. Subjectivists assign numbers per
subjective probability, that is, as a
degree of belief.[5] The degree of
belief has been interpreted as "the
price at which you would buy or sell a
bet that pays 1 unit of utility if E, 0 if
not E."[6] The most popular version of
subjective probability is Bayesian
probability, which includes expert
knowledge as well as experimental
data to produce probabilities. The
expert knowledge is represented by
some (subjective) prior probability
distribution. These data are
incorporated in a likelihood function.
The product of the prior and the
likelihood, when normalized, results
in a posterior probability distribution
that incorporates all the information
known to date.[7] By Aumann's
agreement theorem, Bayesian agents
whose prior beliefs are similar will
end up with similar posterior beliefs.
However, sufficiently different priors
can lead to different conclusions,
regardless of how much information
the agents share.[8]
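The prior-times-likelihood update described in point 2 can be sketched in a few lines of Python; the two hypotheses, their biases, and the data below are invented for illustration, and this is only one simple discrete case, not a general recipe:

```python
# Two hypothetical beliefs about a coin's bias: fair (0.5) or heads-weighted (0.8).
priors = {"fair": 0.5, "biased": 0.5}   # subjective prior beliefs
biases = {"fair": 0.5, "biased": 0.8}

data = ["H", "H", "T", "H"]             # observed tosses (made-up data)

def likelihood(bias, tosses):
    """Probability of the observed tosses given a particular bias."""
    p = 1.0
    for toss in tosses:
        p *= bias if toss == "H" else 1 - bias
    return p

# Posterior is proportional to prior times likelihood; normalize at the end.
unnormalized = {h: priors[h] * likelihood(biases[h], data) for h in priors}
total = sum(unnormalized.values())
posterior = {h: v / total for h, v in unnormalized.items()}
print(posterior)  # roughly {'fair': 0.38, 'biased': 0.62}
```

Normalizing the products turns them into a posterior distribution, which is exactly the "product of the prior and the likelihood, when normalized" described above.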
Etymology
The word probability derives from the Latin
probabilitas, which can also mean
"probity", a measure of the authority of a
witness in a legal case in Europe, and
often correlated with the witness's nobility.
In a sense, this differs greatly from the modern meaning of probability, which in
contrast is a measure of the weight of
empirical evidence, and is arrived at from
inductive reasoning and statistical
inference.[9]

History
The scientific study of probability is a
modern development of mathematics.
Gambling shows that there has been an
interest in quantifying the ideas of
probability for millennia, but exact
mathematical descriptions arose much
later. There are reasons for the slow
development of the mathematics of
probability. Whereas games of chance
provided the impetus for the mathematical
study of probability, fundamental issues
are still obscured by the superstitions of
gamblers.[10]

According to Richard Jeffrey, "Before the


middle of the seventeenth century, the
term 'probable' (Latin probabilis) meant
approvable, and was applied in that sense,
univocally, to opinion and to action. A
probable action or opinion was one such
as sensible people would undertake or
hold, in the circumstances."[11] However, in
legal contexts especially, 'probable' could
also apply to propositions for which there
was good evidence.[12]

[Figure: Al-Kindi's Book of Cryptographic Messages contains the earliest known use of statistical inference (9th century).]
The earliest known forms of probability
and statistics were developed by Middle
Eastern mathematicians studying
cryptography between the 8th and 13th
centuries. Al-Khalil (717–786) wrote the
Book of Cryptographic Messages which
contains the first use of permutations and
combinations to list all possible Arabic
words with and without vowels. Al-Kindi
(801–873) made the earliest known use of
statistical inference in his work on
cryptanalysis and frequency analysis. An
important contribution of Ibn Adlan
(1187–1268) was on sample size for use
of frequency analysis.[13]
[Figure: Gerolamo Cardano (16th century).]
[Figure: Christiaan Huygens published one of the first books on probability (17th century).]

The sixteenth-century Italian polymath Gerolamo Cardano demonstrated the
efficacy of defining odds as the ratio of
favourable to unfavourable outcomes
(which implies that the probability of an
event is given by the ratio of favourable
outcomes to the total number of possible
outcomes[14]). Aside from the elementary
work by Cardano, the doctrine of
probabilities dates to the correspondence
of Pierre de Fermat and Blaise Pascal
(1654). Christiaan Huygens (1657) gave
the earliest known scientific treatment of
the subject.[15] Jakob Bernoulli's Ars
Conjectandi (posthumous, 1713) and
Abraham de Moivre's Doctrine of Chances
(1718) treated the subject as a branch of
mathematics.[16] See Ian Hacking's The
Emergence of Probability[9] and James
Franklin's The Science of Conjecture[17] for
histories of the early development of the
very concept of mathematical probability.

The theory of errors may be traced back to Roger Cotes's Opera Miscellanea
(posthumous, 1722), but a memoir
prepared by Thomas Simpson in 1755
(printed 1756) first applied the theory to
the discussion of errors of observation.[18]
The reprint (1757) of this memoir lays
down the axioms that positive and
negative errors are equally probable, and
that certain assignable limits define the
range of all errors. Simpson also
discusses continuous errors and
describes a probability curve.

The first two laws of error that were proposed both originated with Pierre-
Simon Laplace. The first law was
published in 1774, and stated that the
frequency of an error could be expressed
as an exponential function of the
numerical magnitude of the error—
disregarding sign. The second law of error
was proposed in 1778 by Laplace, and
stated that the frequency of the error is an
exponential function of the square of the
error.[19] The second law of error is called
the normal distribution or the Gauss law.
"It is difficult historically to attribute that
law to Gauss, who in spite of his well-
known precocity had probably not made
this discovery before he was two years
old."[19]

Daniel Bernoulli (1778) introduced the principle of the maximum product of the
probabilities of a system of concurrent
errors.

[Figure: Carl Friedrich Gauss.]

Adrien-Marie Legendre (1805) developed
the method of least squares, and
introduced it in his Nouvelles méthodes
pour la détermination des orbites des
comètes (New Methods for Determining the
Orbits of Comets).[20] In ignorance of
Legendre's contribution, an Irish-American
writer, Robert Adrain, editor of "The
Analyst" (1808), first deduced the law of
facility of error,

$$\phi(x) = c e^{-h^2 x^2},$$

where $h$ is a constant depending on precision of observation, and $c$ is a scale factor ensuring that the area under the curve equals 1. He gave two proofs, the
second being essentially the same as
John Herschel's (1850). Gauss gave the
first proof that seems to have been known
in Europe (the third after Adrain's) in 1809.
Further proofs were given by Laplace
(1810, 1812), Gauss (1823), James Ivory
(1825, 1826), Hagen (1837), Friedrich
Bessel (1838), W.F. Donkin (1844, 1856),
and Morgan Crofton (1870). Other
contributors were Ellis (1844), De Morgan
(1864), Glaisher (1872), and Giovanni
Schiaparelli (1875). Peters's (1856)
formula for r, the probable error of a single
observation, is well known.
In the nineteenth century, authors on the
general theory included Laplace, Sylvestre
Lacroix (1816), Littrow (1833), Adolphe
Quetelet (1853), Richard Dedekind (1860),
Helmert (1872), Hermann Laurent (1873),
Liagre, Didion and Karl Pearson. Augustus
De Morgan and George Boole improved
the exposition of the theory.

In 1906, Andrey Markov introduced[21] the notion of Markov chains, which played an
important role in stochastic processes
theory and its applications. The modern
theory of probability based on the measure
theory was developed by Andrey
Kolmogorov in 1931.[22]
On the geometric side, contributors to The
Educational Times were influential (Miller,
Crofton, McColl, Wolstenholme, Watson,
and Artemas Martin).[23] See integral
geometry for more information.

Theory
Like other theories, the theory of
probability is a representation of its
concepts in formal terms—that is, in terms
that can be considered separately from
their meaning. These formal terms are
manipulated by the rules of mathematics
and logic, and any results are interpreted
or translated back into the problem
domain.

There have been at least two successful attempts to formalize probability, namely
the Kolmogorov formulation and the Cox
formulation. In Kolmogorov's formulation
(see also probability space), sets are
interpreted as events and probability as a
measure on a class of sets. In Cox's
theorem, probability is taken as a primitive
(i.e., not further analyzed), and the
emphasis is on constructing a consistent
assignment of probability values to
propositions. In both cases, the laws of
probability are the same, except for
technical details.
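For readers who want the Kolmogorov formulation spelled out, the three standard axioms for a probability measure $P$ on a sample space $\Omega$ are, in the usual statement (supplied here for completeness; the text above does not list them explicitly):

$$P(E) \ge 0 \text{ for every event } E, \qquad P(\Omega) = 1,$$

$$P\!\left(\bigcup_{i=1}^{\infty} E_i\right) = \sum_{i=1}^{\infty} P(E_i) \text{ for pairwise disjoint events } E_1, E_2, \ldots$$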

There are other methods for quantifying uncertainty, such as the Dempster–Shafer
theory or possibility theory, but those are
essentially different and not compatible
with the usually-understood laws of
probability.

Applications
Probability theory is applied in everyday
life in risk assessment and modeling. The
insurance industry and markets use
actuarial science to determine pricing and
make trading decisions. Governments
apply probabilistic methods in
environmental regulation, entitlement
analysis (reliability theory of aging and
longevity), and financial regulation.

A good example of the use of probability theory in equity trading is the effect of the
perceived probability of any widespread
Middle East conflict on oil prices, which
have ripple effects in the economy as a
whole. An assessment by a commodity
trader that a war is more likely can send
that commodity's prices up or down, and
signals other traders of that opinion.
Accordingly, the probabilities are neither
assessed independently nor necessarily
rationally. The theory of behavioral finance
emerged to describe the effect of such
groupthink on pricing, on policy, and on
peace and conflict.[24]

In addition to financial assessment, probability can be used to analyze trends
in biology (e.g., disease spread) as well as
ecology (e.g., biological Punnett squares).
As with finance, risk assessment can be
used as a statistical tool to calculate the
likelihood of undesirable events occurring,
and can assist with implementing
protocols to avoid encountering such
circumstances. Probability is used to
design games of chance so that casinos
can make a guaranteed profit, yet provide
payouts to players that are frequent
enough to encourage continued play.[25]

The discovery of rigorous methods to assess and combine probability
assessments has changed society.[26]

Another significant application of probability theory in everyday life is
reliability. Many consumer products, such
as automobiles and consumer electronics,
use reliability theory in product design to
reduce the probability of failure. Failure
probability may influence a manufacturer's
decisions on a product's warranty.[27]
The cache language model and other
statistical language models that are used
in natural language processing are also
examples of applications of probability
theory.

Mathematical treatment

[Figure: Calculation of probability (risk) vs odds.]

Consider an experiment that can produce a number of results. The collection of all
possible results is called the sample
space of the experiment, sometimes
denoted as $\Omega$.[28] The power set of the
sample space is formed by considering all
different collections of possible results.
For example, rolling a die can produce six
possible results. One collection of
possible results gives an odd number on
the die. Thus, the subset {1,3,5} is an
element of the power set of the sample
space of dice rolls. These collections are
called "events". In this case, {1,3,5} is the
event that the die falls on some odd
number. If the results that actually occur
fall in a given event, the event is said to
have occurred.
A probability is a way of assigning every
event a value between zero and one, with
the requirement that the event made up of
all possible results (in our example, the
event {1,2,3,4,5,6}) is assigned a value of
one. To qualify as a probability, the
assignment of values must satisfy the
requirement that for any collection of
mutually exclusive events (events with no
common results, such as the events {1,6},
{3}, and {2,4}), the probability that at least
one of the events will occur is given by the
sum of the probabilities of all the
individual events.[29]
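A minimal Python sketch of these definitions (standard library only; the helper name `prob` is invented for illustration) models the die's sample space, treats events as subsets, and checks the additivity requirement on the mutually exclusive events {1,6}, {3}, and {2,4} mentioned above:

```python
from fractions import Fraction

# Sample space of one die roll; each outcome is equally likely.
sample_space = {1, 2, 3, 4, 5, 6}

def prob(event):
    """Probability of an event (a subset of the sample space)."""
    return Fraction(len(event & sample_space), len(sample_space))

odd = {1, 3, 5}              # the event "the die falls on an odd number"
print(prob(sample_space))    # 1 -- the certain event gets value one
print(prob(odd))             # 1/2

# Additivity over mutually exclusive events: {1, 6}, {3}, {2, 4} share no outcomes.
parts = [{1, 6}, {3}, {2, 4}]
union = set().union(*parts)
assert prob(union) == sum(prob(p) for p in parts)  # 5/6 on both sides
```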
The probability of an event A is written as $P(A)$,[28][30] $p(A)$, or $\Pr(A)$.[31] This
mathematical definition of probability can
extend to infinite sample spaces, and even
uncountable sample spaces, using the
concept of a measure.

The opposite or complement of an event A is the event [not A] (that is, the event of A
not occurring), often denoted as $A'$, $A^c$, or $\overline{A}$;[28] its probability is given by $P(\text{not } A) = 1 - P(A)$.[32] As an example, the chance of not rolling a six on a six-sided die is $1 - \tfrac{1}{6} = \tfrac{5}{6}$. For a more
comprehensive treatment, see
Complementary event.

If two events A and B occur on a single performance of an experiment, this is called the intersection or joint probability of A and B, denoted as $P(A \cap B)$.[28]

Independent events

If two events, A and B, are independent then the joint probability is[30]

$$P(A \text{ and } B) = P(A \cap B) = P(A)\,P(B).$$

For example, if two coins are flipped, then the chance of both being heads is $\tfrac{1}{2} \times \tfrac{1}{2} = \tfrac{1}{4}$.[33]

Mutually exclusive events

If either event A or event B can occur but never both simultaneously, then they are
called mutually exclusive events.

If two events are mutually exclusive, then the probability of both occurring is denoted as $P(A \cap B)$ and

$$P(A \text{ and } B) = P(A \cap B) = 0.$$

If two events are mutually exclusive, then the probability of either occurring is denoted as $P(A \cup B)$ and

$$P(A \text{ or } B) = P(A \cup B) = P(A) + P(B).$$

For example, the chance of rolling a 1 or 2 on a six-sided die is $P(1 \text{ or } 2) = P(1) + P(2) = \tfrac{1}{6} + \tfrac{1}{6} = \tfrac{1}{3}$.
Not mutually exclusive events

If the events are not mutually exclusive, then

$$P(A \text{ or } B) = P(A \cup B) = P(A) + P(B) - P(A \text{ and } B).$$

For example, when drawing a single card at random from a regular deck of cards,
the chance of getting a heart or a face
card (J, Q, K) (or one that is both) is $\tfrac{13}{52} + \tfrac{12}{52} - \tfrac{3}{52} = \tfrac{11}{26}$, since among the
52 cards of a deck, 13 are hearts, 12 are
face cards, and 3 are both: here the
possibilities included in the "3 that are
both" are included in each of the "13
hearts" and the "12 face cards", but should
only be counted once.
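The same count can be checked by brute-force enumeration of a 52-card deck; in this sketch the card encoding is invented for illustration, but the inclusion–exclusion arithmetic matches the text:

```python
from fractions import Fraction
from itertools import product

ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["hearts", "diamonds", "clubs", "spades"]
deck = list(product(ranks, suits))   # 52 (rank, suit) pairs

hearts = {c for c in deck if c[1] == "hearts"}          # 13 cards
faces = {c for c in deck if c[0] in {"J", "Q", "K"}}    # 12 cards, 3 of them hearts

# P(heart or face) counted directly, versus inclusion-exclusion.
direct = Fraction(len(hearts | faces), len(deck))
incl_excl = Fraction(len(hearts) + len(faces) - len(hearts & faces), len(deck))
assert direct == incl_excl == Fraction(11, 26)
print(direct)  # 11/26
```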

Conditional probability

Conditional probability is the probability of some event A, given the occurrence of
some other event B. Conditional
probability is written $P(A \mid B)$,[28] and is read "the probability of A, given B". It is defined by[34]

$$P(A \mid B) = \frac{P(A \cap B)}{P(B)}.$$

If $P(B) = 0$, then $P(A \mid B)$ is formally
undefined by this expression. However, it
is possible to define a conditional
probability for some zero-probability
events using a σ-algebra of such events
(such as those arising from a continuous
random variable).

For example, in a bag of 2 red balls and 2 blue balls (4 balls in total), the probability of taking a red ball is $\tfrac{1}{2}$; however, when
taking a second ball, the probability of it
being either a red ball or a blue ball
depends on the ball previously taken. For
example, if a red ball was taken, then the
probability of picking a red ball again
would be $\tfrac{1}{3}$, since only 1 red and 2 blue
balls would have been remaining.
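This conditional probability can be verified exactly by enumerating all ordered draws of two balls; a small sketch (illustrative names only):

```python
from fractions import Fraction
from itertools import permutations

balls = ["red", "red", "blue", "blue"]

# All ordered draws of two balls without replacement (positions distinct).
draws = list(permutations(balls, 2))

both_red = sum(1 for first, second in draws if first == "red" and second == "red")
first_red = sum(1 for first, _ in draws if first == "red")

# P(second red | first red) = P(both red) / P(first red)
print(Fraction(both_red, first_red))  # 1/3
```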

Inverse probability

In probability theory and applications, Bayes' rule relates the odds of event $A_1$ to event $A_2$, before (prior to) and after (posterior to) conditioning on another event $B$. The odds on $A_1$ to event $A_2$ is simply the ratio of the probabilities of the two events. When arbitrarily many events $A$ are of interest, not just two, the rule can be rephrased as: posterior is proportional to prior times likelihood,

$$P(A \mid B) \propto P(A)\,P(B \mid A),$$

where the proportionality symbol means that the left hand side is proportional to (i.e., equals a constant times) the right hand side as $A$ varies, for fixed or given $B$ (Lee, 2012;
Bertsch McGrayne, 2012). In this form it
goes back to Laplace (1774) and to
Cournot (1843); see Fienberg (2005). See
Inverse probability and Bayes' rule.
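In the two-event case the rule can also be written in odds form; the following is the standard statement, supplied for clarity rather than quoted from the text above:

$$\frac{P(A_1 \mid B)}{P(A_2 \mid B)} = \frac{P(A_1)}{P(A_2)} \cdot \frac{P(B \mid A_1)}{P(B \mid A_2)},$$

that is, the posterior odds equal the prior odds multiplied by the likelihood ratio (the Bayes factor).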

Summary of probabilities

Event      Probability
A          $P(A) \in [0, 1]$
not A      $P(A') = 1 - P(A)$
A or B     $P(A \cup B) = P(A) + P(B) - P(A \cap B)$
A and B    $P(A \cap B) = P(A \mid B)\,P(B) = P(B \mid A)\,P(A)$
A given B  $P(A \mid B) = P(A \cap B) / P(B)$

Relation to randomness and probability in quantum mechanics
In a deterministic universe, based on Newtonian concepts, there would be no probability if all conditions were known (Laplace's demon); but there are situations in which sensitivity to initial conditions exceeds our ability to measure them, i.e., to know them. In the case of a
roulette wheel, if the force of the hand and
the period of that force are known, the
number on which the ball will stop would
be a certainty (though as a practical
matter, this would likely be true only of a
roulette wheel that had not been exactly
levelled – as Thomas A. Bass' Newtonian
Casino revealed). This also assumes
knowledge of inertia and friction of the
wheel, weight, smoothness and roundness
of the ball, variations in hand speed during
the turning and so forth. A probabilistic
description can thus be more useful than
Newtonian mechanics for analyzing the
pattern of outcomes of repeated rolls of a
roulette wheel. Physicists face the same
situation in kinetic theory of gases, where
the system, while deterministic in principle,
is so complex (with the number of
molecules typically the order of magnitude
of the Avogadro constant $6.02 \times 10^{23}$) that
only a statistical description of its
properties is feasible.

Probability theory is required to describe quantum phenomena.[35] A revolutionary
discovery of early 20th century physics
was the random character of all physical
processes that occur at sub-atomic scales
and are governed by the laws of quantum
mechanics. The objective wave function
evolves deterministically but, according to
the Copenhagen interpretation, it deals only with probabilities of observations, each outcome being explained by a wave function collapse when an observation is made. However, the loss of determinism
for the sake of instrumentalism did not
meet with universal approval. Albert
Einstein famously remarked in a letter to
Max Born: "I am convinced that God does
not play dice".[36] Like Einstein, Erwin
Schrödinger, who discovered the wave
function, believed quantum mechanics is a
statistical approximation of an underlying
deterministic reality.[37] In some modern
interpretations of the statistical
mechanics of measurement, quantum
decoherence is invoked to account for the
appearance of subjectively probabilistic
experimental outcomes.

See also
Chance (disambiguation)
Class membership probabilities
Contingency
Equiprobability
Heuristics in judgment and decision-
making
Probability theory
Randomness
Statistics
Estimators
Estimation Theory
Probability density function
Pairwise independence
In Law
Balance of probabilities

Notes
1. Strictly speaking, a probability of 0
indicates that an event almost never
takes place, whereas a probability of 1
indicates that an event almost
certainly takes place. This is an
important distinction when the sample
space is infinite. For example, for the
continuous uniform distribution on the
real interval [5, 10], there are an infinite
number of possible outcomes, and the
probability of any given outcome being
observed — for instance, exactly 7 — is
0. This means that when we make an
observation, it will almost surely not
be exactly 7. However, it does not
mean that exactly 7 is impossible.
Ultimately some specific outcome
(with probability 0) will be observed,
and one possibility for that specific
outcome is exactly 7.

References
1. "Kendall's Advanced Theory of
Statistics, Volume 1: Distribution
Theory", Alan Stuart and Keith Ord, 6th
Ed, (2009), ISBN 978-0-534-24312-8.
2. William Feller, An Introduction to
Probability Theory and Its
Applications, (Vol 1), 3rd Ed, (1968),
Wiley, ISBN 0-471-25708-7.
3. Probability Theory The Britannica
website
4. Hacking, Ian (1965). The Logic of
Statistical Inference. Cambridge
University Press. ISBN 978-0-521-
05165-1.
5. Finetti, Bruno de (1970). "Logical
foundations and measurement of
subjective probability". Acta
Psychologica. 34: 129–145.
doi:10.1016/0001-6918(70)90012-0 .
6. Hájek, Alan (21 October 2002). Edward
N. Zalta (ed.). "Interpretations of
Probability" . The Stanford
Encyclopedia of Philosophy (Winter
2012 ed.). Retrieved 22 April 2013.
7. Hogg, Robert V.; Craig, Allen; McKean,
Joseph W. (2004). Introduction to
Mathematical Statistics (6th ed.).
Upper Saddle River: Pearson.
ISBN 978-0-13-008507-8.
. Jaynes, E.T. (2003). "Section 5.3
Converging and diverging views". In
Bretthorst, G. Larry (ed.). Probability
Theory: The Logic of Science (1 ed.).
Cambridge University Press. ISBN 978-
0-521-59271-0.
9. Hacking, I. (2006) The Emergence of
Probability: A Philosophical Study of
Early Ideas about Probability, Induction
and Statistical Inference, Cambridge
University Press, ISBN 978-0-521-
68557-3
10. Freund, John. (1973) Introduction to
Probability. Dickenson ISBN 978-0-
8221-0078-2 (p. 1)
11. Jeffrey, R.C., Probability and the Art of
Judgment, Cambridge University
Press. (1992). pp. 54–55 . ISBN 0-521-
39459-7
12. Franklin, J. (2001) The Science of
Conjecture: Evidence and Probability
Before Pascal, Johns Hopkins
University Press. (pp. 22, 113, 127)
13. Broemeling, Lyle D. (1 November
2011). "An Account of Early Statistical
Inference in Arab Cryptology". The
American Statistician. 65 (4): 255–
257. doi:10.1198/tas.2011.10191 .
S2CID 123537702 .
14. Some laws and problems in classical
probability and how Cardano
anticipated them Gorrochum, P.
Chance magazine 2012
15. Abrams, William, A Brief History of
Probability , Second Moment, retrieved
23 May 2008
16. Ivancevic, Vladimir G.; Ivancevic, Tijana
T. (2008). Quantum leap : from Dirac
and Feynman, across the universe, to
human body and mind. Singapore ;
Hackensack, NJ: World Scientific.
p. 16. ISBN 978-981-281-927-7.
17. Franklin, James (2001). The Science
of Conjecture: Evidence and
Probability Before Pascal. Johns
Hopkins University Press. ISBN 978-0-
8018-6569-5.
18. Shoesmith, Eddie (November 1985).
"Thomas Simpson and the arithmetic
mean" . Historia Mathematica. 12 (4):
352–355. doi:10.1016/0315-
0860(85)90044-8 .
19. Wilson EB (1923) "First and second
laws of error". Journal of the American
Statistical Association, 18, 143
20. Seneta, Eugene William. " "Adrien-
Marie Legendre" (version 9)" .
StatProb: The Encyclopedia Sponsored
by Statistics and Probability Societies.
Archived from the original on 3
February 2016. Retrieved 27 January
2016.
21. Weber, Richard. "Markov Chains"
(PDF). Statistical Laboratory. University
of Cambridge.
22. Vitanyi, Paul M.B. (1988). "Andrei
Nikolaevich Kolmogorov" . CWI
Quarterly (1): 3–18. Retrieved
27 January 2016.
23. Wilcox, Rand R. (10 May 2016).
Understanding and applying basic
statistical methods using R. Hoboken,
New Jersey. ISBN 978-1-119-06140-3.
OCLC 949759319 .
24. Singh, Laurie (2010) "Whither Efficient
Markets? Efficient Market Theory and
Behavioral Finance". The Finance
Professionals' Post, 2010.
25. Gao, J.Z.; Fong, D.; Liu, X. (April 2011).
"Mathematical analyses of casino
rebate systems for VIP gambling".
International Gambling Studies. 11 (1):
93–106.
doi:10.1080/14459795.2011.552575 .
S2CID 144540412 .
2 . "Data: Data Analysis, Probability and
Statistics, and Graphing" .
archon.educ.kent.edu. Retrieved
28 May 2017.
27. Gorman, Michael F. (2010).
"Management Insights" . Management
Science. 56: iv–vii.
doi:10.1287/mnsc.1090.1132 .
2 . "List of Probability and Statistics
Symbols" . Math Vault. 26 April 2020.
Retrieved 10 September 2020.
29. Ross, Sheldon M. (2010). A First
course in Probability (8th ed.). Pearson
Prentice Hall. pp. 26–27.
ISBN 9780136033134.
30. Weisstein, Eric W. "Probability" .
mathworld.wolfram.com. Retrieved
10 September 2020.
31. Olofsson (2005) p. 8.
32. Olofsson (2005), p. 9
33. Olofsson (2005) p. 35.
34. Olofsson (2005) p. 29.
35. Burgin, Mark (2010). "Interpretations
of Negative Probabilities": 1.
arXiv:1008.1287v1 .
36. "Jedenfalls bin ich überzeugt, daß der Alte nicht würfelt." ("At any rate, I am convinced that He does not play dice.") Letter to Max Born, 4 December 1926, in: Einstein/Born Briefwechsel 1916–1955.
37. Moore, W.J. (1992). Schrödinger: Life
and Thought. Cambridge University
Press. p. 479. ISBN 978-0-521-43767-
7.

Bibliography
Kallenberg, O. (2005) Probabilistic
Symmetries and Invariance Principles.
Springer-Verlag, New York. 510
pp. ISBN 0-387-25115-4
Kallenberg, O. (2002) Foundations of
Modern Probability, 2nd ed. Springer
Series in Statistics. 650 pp. ISBN 0-387-
95313-2
Olofsson, Peter (2005) Probability,
Statistics, and Stochastic Processes,
Wiley-Interscience. 504 pp ISBN 0-471-
67969-0.

External links

Wikiquote has quotations related to: Probability
Wikibooks has more on the topic of: Probability
Wikimedia Commons has media related to: Probability

Virtual Laboratories in Probability and Statistics (Univ. of Ala.-Huntsville)
Probability on In Our Time at the BBC
Probability and Statistics EBook
Edwin Thompson Jaynes. Probability
Theory: The Logic of Science. Preprint:
Washington University, (1996). — HTML
index with links to PostScript files and
PDF (first three chapters)
People from the History of Probability
and Statistics (Univ. of Southampton)
Probability and Statistics on the Earliest
Uses Pages (Univ. of Southampton)
Earliest Uses of Symbols in Probability
and Statistics on Earliest Uses of
Various Mathematical Symbols
A tutorial on probability and Bayes'
theorem devised for first-year Oxford
University students
[1] pdf file of An Anthology of Chance
Operations (1963) at UbuWeb
Introduction to Probability – eBook , by
Charles Grinstead, Laurie Snell Source
(GNU Free Documentation License)
(in English and Italian) Bruno de Finetti,
Probabilità e induzione , Bologna, CLUEB,
1993. ISBN 88-8091-176-7 (digital
version)
Richard P. Feynman's Lecture on
probability.
