BPHYS102 Module V, Ch 2: Statistical Physics for Computation

APPLIED PHYSICS FOR CSE

Module 5 : APPLICATIONS OF PHYSICS IN COMPUTING

Ch2 Statistical Physics for Computing

Descriptive and Inferential Statistics

Descriptive Statistics: Descriptive statistics is the analysis of data that helps describe, show, and summarize data in a meaningful way. It is a simple way to describe our data: descriptive statistics presents raw data in an effective, meaningful way using numerical calculations, graphs, or tables. This type of statistics is applied to data that is already known.
Inferential Statistics: In inferential statistics, predictions are made from a group of data in which you are interested. A random sample of data is taken from a population and used to describe and make inferences about that population. The group that includes all the data you are interested in is known as the population. Inferential statistics allows you to make predictions from a small sample instead of working on the whole population.
Difference between Descriptive and Inferential Statistics:

Sl. No. | Descriptive Statistics | Inferential Statistics
1 | It gives information about raw data, describing the data in some manner. | It makes inferences about the population using data drawn from the population.
2 | It helps in organizing, analyzing, and presenting data in a meaningful manner. | It allows us to compare data and make hypotheses and predictions.
3 | It is used to describe a situation. | It is used to explain the chance of occurrence of an event.
4 | It explains already known data and is limited to a sample or population of small size. | It attempts to reach conclusions about the population.
5 | It can be achieved with the help of charts, graphs, tables, etc. | It can be achieved by probability.

Subject: Physics for CSE stream Faculty: Dr Anita R Shettar, Dr. Soumya & Prof. Pallavi B Page 1 of 6
Poisson Distribution
If the probability p of an individual event is so small that the distribution has significant value only for very small k, then the distribution of events can be approximated by the Poisson distribution.
Probability mass function
A discrete random variable X is said to have a Poisson distribution with parameter λ if it has a probability mass function given by

f(k; λ) = P(X = k) = λ^k e^(−λ) / k!

Here k is the number of occurrences, e is Euler's number, and ! denotes the factorial function. The positive real number λ is equal to the expected value of X and also to its variance. The Poisson distribution may be used in the design of experiments, such as scattering experiments, where a small number of events are seen.
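The probability mass function above can be sketched directly in Python; this is a minimal illustration (the function name poisson_pmf is our own), checking that the probabilities sum to 1 and that the mean equals λ:

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """Probability of exactly k events when the expected count is lam."""
    return (lam ** k) * exp(-lam) / factorial(k)

# The probabilities over all k sum to 1, and the expected value equals lam:
lam = 2.5
total = sum(poisson_pmf(k, lam) for k in range(100))
mean = sum(k * poisson_pmf(k, lam) for k in range(100))
print(round(total, 6), round(mean, 6))  # → 1.0 2.5
```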

Example of probability for Poisson distributions


On a particular river, overflow floods occur once every 100 years on average. Calculate the probability of k = 0, 1, 2, 3, 4, 5, or 6 overflow floods in a 100-year interval, assuming the Poisson model is appropriate. Because the average event rate is one overflow flood per 100 years, λ = 1.

f(k; λ) = P(X = k) = λ^k e^(−λ) / k!

P(k overflow floods in 100 years) = λ^k e^(−λ) / k! = 1^k e^(−1) / k!

P(k = 0 overflow floods in 100 years) = 1^0 e^(−1) / 0! = e^(−1) ≈ 0.368

P(k = 1 overflow flood in 100 years) = 1^1 e^(−1) / 1! = e^(−1) ≈ 0.368

P(k = 2 overflow floods in 100 years) = 1^2 e^(−1) / 2! = e^(−1) / 2 ≈ 0.184
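The worked values above, and the remaining cases up to k = 6, can be reproduced with a short Python sketch (illustrative only):

```python
from math import exp, factorial

lam = 1.0  # one overflow flood per 100-year interval on average
for k in range(7):
    p = (lam ** k) * exp(-lam) / factorial(k)
    print(f"P({k} floods in 100 years) = {p:.3f}")
# The first three lines agree with the worked values: 0.368, 0.368, 0.184
```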

Modeling the Probability for Proton Decay


The experimental search for proton decay was undertaken because of the implications of grand unification theories. The lower bound for the proton lifetime is now projected to be on the order of τ = 10^33 years. The probability of observing a proton decay can be estimated from the nature of particle decay and the application of Poisson statistics.
The number of protons N can be modelled by the decay equation

N = N0 e^(−λt)

Here λ = 1/τ = 10^−33 per year is the probability that any given proton will decay in a year. Since the decay constant λ is so small, the exponential can be represented by the first two terms of the exponential series:

e^(−λt) ≈ 1 − λt, thus N ≈ N0(1 − λt)

For a small sample, the probability of observing a proton decay is infinitesimal, but suppose we consider the volume of protons represented by the Super-Kamiokande neutrino detector in Japan. The number of protons in the detector volume is reported by Ed Kearns of Boston University to be 7.5 × 10^33 protons. For one year of observation, the expected number of proton decays is then

N0 − N = N0 λ t = (7.5 × 10^33 protons)(10^−33 / year)(1 year) = 7.5

About 40% of the area around the detector tank is covered by photo-detector tubes, and if we take that to be the nominal efficiency of detection, we expect about three observed proton-decay events per year based on a 10^33-year lifetime.

So far, no convincing proton decay events have been seen. Poisson statistics provides a convenient means of assessing the implications of this absence of observations. If we presume that λ = 3 observed decays per year is the mean, then the Poisson distribution function tells us that the probability of zero observed decays is

P(k = 0) = λ^0 e^(−λ) / 0! = e^(−3) ≈ 0.05

This low probability for a null result suggests that the proposed lifetime of 10^33 years is too short. While this is not a realistic assessment of the probability of observation, because there are a number of possible decay pathways, it serves to illustrate in principle how even a non-observation can be used to refine a proposed lifetime.
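The arithmetic in this section can be checked with a few lines of Python. The numbers are taken from the text above; this is a sketch of the estimate, not a full physics calculation:

```python
from math import exp

n_protons = 7.5e33        # protons in the detector volume (from the text)
lam = 1e-33               # decay probability per proton per year, 1/tau
t = 1.0                   # one year of observation
efficiency = 0.4          # nominal fraction of decays actually detected

expected_decays = n_protons * lam * t         # N0 - N, about 7.5 per year
observed_mean = efficiency * expected_decays  # about 3 detected events per year
p_zero = exp(-observed_mean)                  # Poisson P(k = 0) = e^(-mean)

print(round(expected_decays, 1), round(observed_mean, 1), round(p_zero, 3))
# → 7.5 3.0 0.05
```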

Normal Distribution and Bell Curves


A bell curve is a common type of distribution for a variable, also known as the normal distribution. The term "bell curve" originates from the fact that the graph used to depict a normal distribution is a symmetrical bell-shaped curve.
The highest point on the curve, the top of the bell, represents the most probable value in a series of data (its mean, mode, and median coincide here), while all other possible occurrences are symmetrically distributed around the mean, creating a downward-sloping curve on each side of the peak. The width of the bell curve is described by its standard deviation.
A standard deviation is a measure of the dispersion of data values around the mean. The mean, in turn, is the average of all data points in the data set and is found at the highest point of the bell curve.

Standard Deviations
The standard deviation is a measure of how spread out values are. For a normal distribution, about 68% of values lie within 1 standard deviation of the mean, about 95% within 2 standard deviations, and about 99.7% within 3 standard deviations.
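These percentages follow from the normal cumulative distribution. A quick check in Python using the error function (a sketch; the identity below is a standard property of the normal distribution):

```python
from math import erf, sqrt

# For a normal distribution, P(|X - mu| < k*sigma) = erf(k / sqrt(2)),
# independent of the mean mu and the standard deviation sigma.
for k in (1, 2, 3):
    frac = erf(k / sqrt(2))
    print(f"within {k} standard deviation(s): {frac:.1%}")
# → roughly 68.3%, 95.4%, 99.7%
```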

Monte-Carlo Method
Monte Carlo methods vary, but tend to follow a particular pattern:
1. Define a domain of possible inputs
2. Generate inputs randomly from a probability distribution over the domain
3. Perform a deterministic computation on the inputs
4. Aggregate the results
As an example, consider the Monte Carlo method applied to approximating the value of π. Take a quadrant (quarter circle) inscribed in a unit square. Given that the ratio of their areas is π/4, the value of π can be approximated using a Monte Carlo method:

1. Draw a square, then inscribe a quadrant within it.
2. Uniformly scatter a given number of points over the square.
3. Count the number of points inside the quadrant, i.e., those at a distance of less than 1 from the origin.
4. The ratio of the inside count to the total sample count is an estimate of the ratio of the two areas, π/4. Multiply the result by 4 to estimate π.
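The four steps can be sketched in Python (the helper name estimate_pi is our own; seeding the generator makes the run reproducible):

```python
import random

def estimate_pi(n_points, seed=0):
    """Estimate pi by scattering points uniformly over the unit square
    and counting those that land inside the inscribed quadrant."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_points):
        x, y = rng.random(), rng.random()   # step 2: uniform scatter
        if x * x + y * y < 1.0:             # step 3: distance from origin < 1
            inside += 1
    return 4.0 * inside / n_points          # step 4: (inside / total) estimates pi/4

print(estimate_pi(100_000))  # close to 3.14159 for a large n_points
```

As the considerations below note, accuracy improves with the number of points: the statistical error shrinks roughly as 1/√n.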
In this procedure, the domain of inputs is the square that circumscribes the quadrant. We generate random inputs by scattering points over the square, then perform a computation on each input (testing whether it falls within the quadrant). Aggregating the results yields the approximation of π.
There are two important considerations:
1. If the points are not uniformly distributed, then the approximation will be poor.
2. Many points are needed. The approximation is generally poor if only a few points are randomly placed in the whole square; on average, the approximation improves as more points are placed.
Monte Carlo methods require large quantities of random numbers, and their use benefited greatly from pseudorandom number generators, which were far quicker to use than the tables of random numbers previously used for statistical sampling.
