
HW 1 QIQC Fall 2023

This document contains code and explanations for plotting functions involving logarithms and probability density functions. It explores relative entropy and mutual information by plotting -Log[x] and 1-x, plots Normal distributions with different standard deviations, contrasts overlapping and disjoint distributions via tables of probability densities, and finally plots logarithms of several bases to show they share the asymptotic form of the natural logarithm.


In[ ]:= ClearAll["Global`*"]

Problem 1.1: Relative entropy and mutual information.


Part a) Make a plot of -Log[x] and 1-x.
This plot shows the two functions of interest near x = 1, the point at which they are equal.
In[ ]:= Plot[{-Log[x], 1 - x}, {x, 0, 1}, Frame → True,
 FrameStyle → Large, FrameLabel → {"x", "F[x]"},
 ImageSize → Large, PlotLegends → {"-Log[x] (surprisal)", "1-x"},
 PlotPoints → 400, PlotStyle → {{Blue}, {Purple}}]
Out[ ]= [Plot of -Log[x] and 1-x on 0 ≤ x ≤ 1, axes "x" and "F[x]"; the two curves touch only at x = 1]
At a glance we can say the following:
-Log[x] ≥ 1 - x is equivalent to Log[1/x] ≥ 1 - x, i.e. 1/x ≥ e^(1-x).
For x = 1 both sides give 1 = 1, as expected, and this is the unique point of equality. Below is a sanity check:
In[ ]:= FindRoot[-Log[x] - (1 - x), {x, 0.3}]
Out[ ]= {x → 1.}
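As an extra check (an addition, not part of the original notebook), a series expansion about x = 1 makes the inequality explicit: the difference -Log[x] - (1 - x) vanishes to first order, and its leading term is the non-negative (x - 1)^2/2.

In[ ]:= (* expand the difference about the equality point x = 1 *)
Series[-Log[x] - (1 - x), {x, 1, 3}]
Out[ ]= (x - 1)^2/2 - (x - 1)^3/3 + O[(x - 1)^4]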


In[ ]:= Plot[Table[PDF[NormalDistribution[0, σ], x], {σ, {.75, 1, 2}}] // Evaluate,
 {x, -6, 6}, Filling → Axis]
Out[ ]= [Filled plot of the three overlapping Normal PDFs (σ = 0.75, 1, 2) on -6 ≤ x ≤ 6]
In[ ]:= (* same distribution, tabulated over different x ranges; plotted against
 list index below, the two bumps land at different positions *)
TableofPs = Table[PDF[NormalDistribution[0, .25], x], {x, -2, 2, .01}];
TableofQs = Table[PDF[NormalDistribution[0, .25], x], {x, -6, 6, .01}];

The idea of this plot is to show the contrast between the overlapping distributions above and the disjoint distributions below. We define ε = q/p as the parameter that describes how disjoint p and q are.
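As a small numerical illustration of ε (with hypothetical parameters, not the tables above): for two well-separated Normals, the ratio q/p evaluated where p is concentrated is essentially zero.

In[ ]:= (* hypothetical example: p centered at 0, q centered at 3, both with σ = 1/4 *)
PDF[NormalDistribution[3, 1/4], 0]/PDF[NormalDistribution[0, 1/4], 0] // N
Out[ ]= 5.38 × 10^-32 (* ε ≈ 0 for effectively disjoint supports *)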

In[ ]:= ListPlot[{TableofPs, TableofQs}, Filling → Axis,
 Joined → True, PlotRange → All, PlotLegends → Automatic]

Out[ ]= [Joined list plot of TableofPs and TableofQs; legend: p, q]
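As a further sketch (again with hypothetical equal-variance Normals, not the tables above), the relative entropy D(p||q) can be computed numerically and compared against the known closed form (μp - μq)^2/(2σ^2):

In[ ]:= (* numeric relative entropy between N(0,1) and N(3,1) *)
pf[x_] := PDF[NormalDistribution[0, 1], x];
qf[x_] := PDF[NormalDistribution[3, 1], x];
NIntegrate[pf[x] Log[pf[x]/qf[x]], {x, -Infinity, Infinity}]
Out[ ]= 4.5 (* = 3^2/2, as expected *)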

The plot below shows that for any choice of log base the asymptotic form is the same as that of the natural log. Thus the relative entropy is non-negative for any of these log bases.
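A quick symbolic check (an addition to the notebook): Mathematica defines Log[b, x] as Log[x]/Log[b], so changing base only rescales the natural log by the positive constant 1/Log[b] (for b > 1), leaving the sign of the relative entropy unchanged.

In[ ]:= (* the difference reduces to zero identically *)
Simplify[Log[b, x] - Log[x]/Log[b]]
Out[ ]= 0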


In[ ]:= Plot[{Log[2, x], Log[4, x], Log[6, x], Log[8, x], Log[10, x]}, {x, 0, 1},
 Frame → True, FrameStyle → Large, FrameLabel → {"x", "Log_b[x]"},
 ImageSize → Large, PlotLegends → "Expressions", PlotRange → {-15, 0}]
Out[ ]= [Plot of Log_b[x] for b = 2, 4, 6, 8, 10 on 0 ≤ x ≤ 1; every curve diverges to -∞ as x → 0]
