
Problem-set-I: Probability and Statistics

CHM322: Dr. Arnab Ghosh

January 13, 2025

1. Using Stirling's formula, estimate the magnitude of x: 10^23 ! ≈ 10^x.
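A quick numerical sketch of problem 1, using the rough form of Stirling's formula, ln n! ≈ n ln n − n, and converting to a power of 10:

```python
import math

# Stirling: ln n! ~ n ln n - n, so log10(n!) ~ (n ln n - n) / ln 10
n = 1e23
x = (n * math.log(n) - n) / math.log(10)
# x comes out of order 10^24, i.e. 10^23! is roughly 10^(2.3e24)
```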


2. Let the random variable x take values 0, 1, and 2 with probabilities 1/2, 1/4,
and 1/4 respectively. Calculate ⟨x⟩, ⟨x^2⟩, and σ_x^2.
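A minimal check of problem 2, computing the moments directly from the definition of an expectation value over a discrete distribution:

```python
vals, probs = [0, 1, 2], [0.5, 0.25, 0.25]
mean = sum(v * p for v, p in zip(vals, probs))       # <x>   = 0.75
second = sum(v**2 * p for v, p in zip(vals, probs))  # <x^2> = 1.25
var = second - mean**2                               # sigma_x^2 = 0.6875
```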
3. Suppose that there are n independent random variables {X_i}, i = 1, 2, ..., n,
each with the same mean ⟨X⟩ and variance σ_X^2. Let Y = X_1 + X_2 + ... + X_n
be the sum of all the random variables. Find the mean and variance of Y.
4. Assuming that experimental errors are random in origin, show that if the
error in making a single measurement of a quantity X is ∆, the error
obtained after making n measurements is ∆/√n. This is the reason
why you are advised to repeat experiments multiple times!
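The ∆/√n scaling of problem 4 can be seen in a small Monte Carlo sketch (the Gaussian error model and the numbers n = 25, 20000 trials are illustrative assumptions, not part of the problem):

```python
import random
import statistics

random.seed(0)
Delta, n, trials = 1.0, 25, 20000

# each experiment: average n measurements, each carrying a random error of size Delta
means = [statistics.fmean(random.gauss(0, Delta) for _ in range(n))
         for _ in range(trials)]

# spread of the averaged results: should approach Delta / sqrt(n) = 0.2
err = statistics.stdev(means)
```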
5. Consider the Gaussian distribution P(x) = A e^{−x^2/a^2}. Find ⟨x⟩, ⟨x^2⟩, and σ_x^2.
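A numerical check of problem 5 by direct quadrature on a grid (the value a = 2 is an arbitrary choice; the expected results are A = 1/(a√π), ⟨x⟩ = 0, ⟨x^2⟩ = a^2/2):

```python
import math

a, h = 2.0, 0.001
xs = [i * h for i in range(-10000, 10001)]      # grid from -10 to 10
w = [math.exp(-(x / a) ** 2) for x in xs]

Z = sum(w) * h                                   # normalization: ~ a*sqrt(pi)
mean = sum(x * wi for x, wi in zip(xs, w)) * h / Z     # <x>   = 0 by symmetry
x2 = sum(x * x * wi for x, wi in zip(xs, w)) * h / Z   # <x^2> = a^2/2 = 2
```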
6. Let x be a random variable which takes the value 1 for success and 0 for
failure. Assuming p to be the probability of success, calculate ⟨x⟩, ⟨x^2⟩,
and σ_x.
7. Estimate the standard deviation for fair coin tossing with n = 16 and
n = 10^20. Explain the significance of the results.
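For problem 7, using the binomial result σ = √(npq) with p = q = 1/2, i.e. σ = √n/2, the two cases give:

```python
import math

# fair coin: sigma = sqrt(n * p * q) = sqrt(n) / 2
sigma_16 = math.sqrt(16 * 0.25)        # 2.0
sigma_big = math.sqrt(1e20 * 0.25)     # 5e9

# the *relative* fluctuation sigma/n is what matters physically
rel_16 = sigma_16 / 16                 # 0.125 -- fluctuations are large
rel_big = sigma_big / 1e20             # 5e-11 -- fluctuations are negligible
```

For macroscopic n the absolute spread is huge but the relative spread vanishes, which is why thermodynamic quantities look sharp.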
8. A throw of a fair die yields the numbers 1, 2, ..., 6, each with probability
1/6. Find the mean, variance, and standard deviation of the numbers
obtained.
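Problem 8 can be checked directly from the uniform distribution over the six faces:

```python
import math

faces = range(1, 7)
mean = sum(faces) / 6                             # 3.5
var = sum((f - mean) ** 2 for f in faces) / 6     # 35/12 ~ 2.917
std = math.sqrt(var)                              # ~ 1.708
```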
9. The Poisson distribution of a discrete random variable n = 0, 1, 2, ... is
given by

   P(n) = λ^n e^{−λ} / n!

Show that ⟨n⟩ = Σ_{n=0}^∞ n P(n) = λ.
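A numerical check of problem 9 (λ = 3.7 is an arbitrary test value; the terms are built recursively as P(n) = P(n−1)·λ/n to avoid huge factorials):

```python
import math

lam = 3.7
P = [math.exp(-lam)]                 # P(0) = e^{-lam}
for n in range(1, 100):
    P.append(P[-1] * lam / n)        # P(n) = lam^n e^{-lam} / n!

total = sum(P)                       # ~ 1 (tail beyond n = 99 is negligible)
mean = sum(n * p for n, p in enumerate(P))   # ~ lam
```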
10. The exponential distribution of a continuous random variable x ≥ 0 dis-
tributed between x and x + dx is given by

    P(x) dx = A e^{−x/λ} dx

(a) Find the normalization constant A. (b) Show that ⟨x⟩ = ∫_0^∞ x P(x) dx = λ.
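A quadrature sketch for problem 10 (λ = 1.5 is an arbitrary test value; the expected answers are A = 1/λ and ⟨x⟩ = λ):

```python
import math

lam, h = 1.5, 0.001
xs = [i * h for i in range(40000)]           # grid from 0 to 40; tail negligible
w = [math.exp(-x / lam) for x in xs]

Z = sum(w) * h                               # ~ lam, so A = 1/Z ~ 1/lam
mean = sum(x * wi for x, wi in zip(xs, w)) * h / Z   # ~ lam
```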
11. Consider a uniform distribution P(θ) of a continuous random variable θ
distributed over 0 ≤ θ ≤ π. Find the values of (i) ⟨θ^n⟩ for n ≥ 0, (ii)
⟨cos θ⟩, (iii) ⟨sin θ⟩, (iv) ⟨cos^2 θ⟩, (v) ⟨sin^2 θ⟩.
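A midpoint-rule check of problem 11 (expected: ⟨θ^n⟩ = π^n/(n+1), ⟨cos θ⟩ = 0, ⟨sin θ⟩ = 2/π, ⟨cos^2 θ⟩ = ⟨sin^2 θ⟩ = 1/2):

```python
import math

N = 20000
h = math.pi / N
ts = [(i + 0.5) * h for i in range(N)]       # midpoints on [0, pi]

def avg(f):
    """Average of f over the uniform distribution P(theta) = 1/pi."""
    return sum(f(t) for t in ts) * h / math.pi

cos_mean = avg(math.cos)                     # 0
sin_mean = avg(math.sin)                     # 2/pi ~ 0.6366
cos2 = avg(lambda t: math.cos(t) ** 2)       # 1/2
theta2 = avg(lambda t: t ** 2)               # pi^2/3, the n = 2 case
```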

12. Show that the Binomial distribution reduces to a Poisson distribu-
tion with mean Np when N ≫ 1 but the mean Np remains small.
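The limit in problem 12 can be seen numerically by comparing the two distributions term by term (N = 10000 and p = 3×10^{-4} are illustrative choices satisfying N ≫ 1 with Np = 3 small):

```python
import math

def binom_pmf(N, p, n):
    return math.comb(N, n) * p**n * (1 - p) ** (N - n)

def poisson_pmf(lam, n):
    return lam**n * math.exp(-lam) / math.factorial(n)

N, p = 10_000, 3e-4
lam = N * p     # 3.0

# largest pointwise discrepancy over the region carrying essentially all the mass
err = max(abs(binom_pmf(N, p, n) - poisson_pmf(lam, n)) for n in range(20))
```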
13. * Show that the Binomial distribution reduces to a Gaussian distri-
bution with mean Np and variance Np(1 − p) when N ≫ 1 as well as
Np(1 − p) ≫ 1.

14. A one-dimensional random walk is a succession of n Bernoulli trials in
which each choice is either a step forwards (+L) or a step backwards (−L), each
occurring with probability p = 1/2. Find the mean distance ⟨x⟩ travelled
and the mean squared distance ⟨x^2⟩ travelled.
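A Monte Carlo sketch of problem 14 (the expected answers are ⟨x⟩ = 0 and ⟨x^2⟩ = nL^2; n = 100 and 5000 walks are illustrative choices):

```python
import random

random.seed(1)
n, L, walks = 100, 1.0, 5000

finals = []
for _ in range(walks):
    # n steps of +L or -L, each with probability 1/2
    finals.append(sum(random.choice((L, -L)) for _ in range(n)))

mean = sum(finals) / walks                   # ~ 0
msq = sum(x * x for x in finals) / walks     # ~ n * L^2 = 100
```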
15. A pair of (distinguishable) dice is tossed once. Each die can give a score
of 1, 2, 3, 4, 5 or 6. Let s denote the total score of the pair of dice. It is
evident that the possible values of s (the sample space of s) are the integers
from 2 to 12. (a) Write down the set of probabilities {P_s}. What is the
most probable value of s? (b) Find the mean, standard deviation, and
relative fluctuation of s.

16. * Diffusion problem: In the above random walk problem, assume that
the walker takes a step at time t = nτ, where n is an integer.
Writing D = L^2/2τ, show that when t ≫ τ, the probability of finding a
diffusing particle between x and x + dx is

    P(x) dx = (1/√(4πDt)) e^{−x^2/4Dt} dx

Show that the standard deviation for the diffusion process σ_x ∝ t^{1/2}.
Using this, estimate the time needed for a molecule to diffuse a distance of
about (i) 1 µm and (ii) 1 cm, in water. Given: the diffusion coefficient of
the molecule in water is D = 10^{−9} m^2 s^{−1}.
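Taking σ_x^2 = 2Dt from the Gaussian above and inverting it as t = x^2/(2D), the two distances in problem 16 give:

```python
D = 1e-9        # m^2 s^-1, diffusion coefficient in water (given)

t_um = (1e-6) ** 2 / (2 * D)    # 5e-4 s   -- diffusion across 1 um is fast
t_cm = (1e-2) ** 2 / (2 * D)    # 5e4 s    -- about 14 hours for 1 cm
```

The t ∝ x^2 scaling is why diffusion is efficient over cellular distances but useless over macroscopic ones.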
17. * Moment generating function M(t): an efficient method for calcu-
lating the mean (first moment) and variance (second moment) of a probability
distribution. Defining M(t) = ⟨e^{tx}⟩, show that ⟨x^n⟩ = M^{(n)}(0), where
M^{(n)}(t) = d^n M/dt^n. In particular, show that σ_x^2 = M^{(2)}(0) − [M^{(1)}(0)]^2.
Finally, show that (a) for a single Bernoulli trial M(t) = p e^t + 1 − p, (b) for the
binomial distribution M(t) = (p e^t + 1 − p)^n, (c) for the Poisson distribution
M(t) = e^{λ(e^t − 1)}, (d) for the exponential distribution M(t) = λ/(λ − t). Check that the
mean and variance derived by this method agree with the results obtained
earlier.
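The derivative identities of problem 17 can be checked numerically by finite differences on the binomial MGF (n = 10, p = 0.3, and the step h are illustrative choices; expected: ⟨x⟩ = np = 3, σ_x^2 = np(1−p) = 2.1):

```python
import math

n, p, h = 10, 0.3, 1e-4
M = lambda t: (p * math.exp(t) + 1 - p) ** n    # binomial MGF

m1 = (M(h) - M(-h)) / (2 * h)                   # M'(0)  = <x>   = np  = 3
m2 = (M(h) - 2 * M(0) + M(-h)) / h**2           # M''(0) = <x^2> = 11.1
var = m2 - m1**2                                # np(1-p) = 2.1
```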

18. ∗ A colony of 5,000 “red” and 5,000 “green” E. coli is allowed to eat and
reproduce faithfully into red → red+red or green → green+green, with a
reproduction time of 1 hour. Assume that other than the markers “red”
and “green”, there are no differences between them. In order to keep the
colony size down, a predator is introduced which keeps the colony size at
10,000 by eating both bacteria at random. (a) After a very long time,
what is the probability distribution of the number of red bacteria? (b)
What would be the effect of a preference of the predator for eating red
bacteria on (a)?
19. * Consider that in every τ seconds, an electron jumps from an atom site to a
nearest-neighbour site (left or right) of a one-dimensional lattice having
lattice constant a. The probabilities of a jump to the left and to the right are p and
q = 1 − p respectively. (a) What is the average position x̄ of the electron
at time t = Nτ, for N ≫ 1? (b) Calculate the mean-square position
⟨(x − x̄)^2⟩ at time t.
20. Consider a large number N of spin-1/2 particles in the presence of an external
magnetic field. Find the number of states accessible to the system as a
function of M_s, the z-component of the total spin of the system. Determine
the value of M_s for which the number of accessible states is maximum.
21. ∗ Consider a system of N non-interacting distinguishable particles. Each
particle may exist in one of the two energy states E = 0 and E = ε. (a)
Write down a formula for S(n), where n is the number of particles in the
upper state. Sketch the function S(n). (b) Derive Stirling's approximation
for large n: ln n! ≈ n ln n − n. (c) Rewrite the result of (a) using the result
of (b). Find the value of n for which S(n) is maximum. (d) Treating
energy as continuous, show that the system can have negative absolute
temperature.
22. What is the Shannon entropy for a Bernoulli trial with probabilities P and
1 − P of the two outcomes?
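Problem 22 is the binary entropy function H(P) = −P log_2 P − (1−P) log_2(1−P); a couple of sample values (P = 0.9 is an arbitrary illustration):

```python
import math

def H(P):
    """Shannon entropy of a Bernoulli trial, in bits."""
    return -(P * math.log2(P) + (1 - P) * math.log2(1 - P))

h_half = H(0.5)     # 1.0 bit -- the maximum, at P = 1/2
h_biased = H(0.9)   # ~ 0.469 bits -- a biased coin carries less information
```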
23. ∗ In a typical microchip, a bit is stored by a 5 fF capacitor using a voltage
of 3V. Calculate the energy stored in eV per bit and compare this with the
minimum heat dissipation by erasure, which is kB T ln 2 per bit, at room
temperature. What is the significance of the above result?
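The arithmetic of problem 23, using E = CV^2/2 for the stored energy and k_B T ln 2 for the Landauer limit at T = 300 K:

```python
import math

C, V = 5e-15, 3.0               # 5 fF capacitor at 3 V (given)
e = 1.602176634e-19             # elementary charge, J per eV
kB, T = 1.380649e-23, 300.0     # Boltzmann constant; room temperature

E_bit = 0.5 * C * V**2 / e              # ~ 1.4e5 eV stored per bit
E_landauer = kB * T * math.log(2) / e   # ~ 0.018 eV minimum erasure heat
ratio = E_bit / E_landauer              # ~ 1e7: far above the Landauer limit
```

Present-day bits cost millions of times the thermodynamic minimum, so the Landauer bound is nowhere near limiting current technology.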
24. * The relative entropy

    S(P||Q) = Σ_i P_i log (P_i/Q_i) = −S_P − Σ_i P_i log Q_i

measures the closeness of two probability distributions P and Q, where S_P =
−Σ_i P_i log P_i. Show that (a) S(P||Q) ≥ 0, with equality iff P_i = Q_i ∀ i.
(b) If i takes N values with probabilities P_i, then show that S(P||Q) =
−S_P + log N, where Q_i = 1/N ∀ i. Hence show that S_P ≤ log N, with
equality iff P is uniformly distributed over all N outcomes.
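A numerical illustration of part (b) (the distribution P below is an arbitrary example over N = 4 outcomes; logs are taken base 2):

```python
import math

P = [0.5, 0.25, 0.125, 0.125]   # example distribution
Q = [0.25] * 4                  # uniform, Q_i = 1/N with N = 4

S_P = -sum(p * math.log2(p) for p in P)               # 1.75 bits
KL = sum(p * math.log2(p / q) for p, q in zip(P, Q))  # 0.25 bits

# check the identity S(P||Q) = -S_P + log N, and S_P <= log N
identity_gap = abs(KL - (math.log2(4) - S_P))
```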
