
Information & Entropy

Comp 595 DM
Professor Wang
Information & Entropy
• Information Equation

  I(p) = -log_b(p)

  p = probability of the event happening
  b = base
  (base 2 is mostly used in information theory)

  *unit of information is determined by the base
  base 2 = bits        base 3 = trits
  base 10 = Hartleys   base e = nats
Information & Entropy
• Example of Calculating Information
Coin Toss
A fair coin has two equally likely outcomes, head (p = .5)
and tail (p = .5).
So whether you get a head or a tail, you gain 1 bit of
information, by the following formula:
I(head) = -log2(.5) = 1 bit
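
The formula above can be tried out in a few lines of Python (a minimal sketch; the function name "information" and the use of math.log are my own illustration, not part of the slides):

import math

def information(p, base=2):
    # Self-information of an outcome with probability p,
    # in units determined by the base
    return -math.log(p, base)

print(information(0.5))               # 1.0 bit (fair coin)
print(information(0.5, base=3))       # ~0.63 trits
print(information(0.5, base=10))      # ~0.30 Hartleys
print(information(0.5, base=math.e))  # ~0.69 nats

The same event carries the same information; only the unit changes with the base, as in the table on the previous slide.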
Information & Entropy
• Another Example
Balls in the bin
The bin holds 4 red, 2 yellow, and 3 green balls (9 in total),
so the draw probabilities are 4/9, 2/9, and 3/9. The information
you get by choosing a ball from the bin is calculated as follows:
I(red ball) = -log2(4/9) = 1.1699 bits
I(yellow ball) = -log2(2/9) = 2.1699 bits
I(green ball) = -log2(3/9) = 1.58496 bits
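
These values can be checked directly (a quick sketch using the probabilities above):

import math

for color, p in [("red", 4/9), ("yellow", 2/9), ("green", 3/9)]:
    print(color, -math.log2(p))
# red 1.1699..., yellow 2.1699..., green 1.5849...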
Information & Entropy
• Then, what is Entropy?
- Entropy is simply the average (expected) amount
of information from the event.
• Entropy Equation

  Entropy = - Σ (i = 1 to n) p_i · log_b(p_i)

  n = number of different outcomes
  p_i = probability of the i-th outcome
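
This equation can be written as a small Python function (a sketch; the name "entropy" and the p > 0 guard for impossible outcomes are my own choices):

import math

def entropy(probs, base=2):
    # Expected information: -sum of p_i * log_b(p_i) over all outcomes
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # 1.0 bit for a fair coin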


Information & Entropy
• How is the entropy equation derived?
I = total information from N occurrences
N = number of occurrences
(N · p_i) = approximate number of times the i-th
outcome comes out in N occurrences

Each of those occurrences carries -log_b(p_i) information,
so the total information from N occurrences is

  I = Σ (i = 1 to n) (N · p_i) · (-log_b(p_i)) = -N Σ p_i log_b(p_i)

Comparing this total information from N occurrences with
the entropy equation, the only thing that changes is the
factor N: moving N to the other side gives
I/N = -Σ p_i log_b(p_i), which is exactly the entropy.
Therefore, entropy is the average (expected) amount of
information in a certain event.
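
A quick numeric check of this argument, using the ball-bin probabilities from the earlier example (a sketch; N = 900 is an arbitrary choice):

import math

probs = [4/9, 2/9, 3/9]
N = 900

# Total information if each outcome appears about N * p_i times
total = sum(N * p * -math.log2(p) for p in probs)
avg_per_occurrence = total / N

entropy = -sum(p * math.log2(p) for p in probs)
print(avg_per_occurrence)  # ~1.5305
print(entropy)             # ~1.5305, the same value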
Information & Entropy
• Let’s look at this example again…
Calculating the entropy…
In this example there are three possible outcomes
when you choose a ball: it can be either red, yellow,
or green (n = 3).
So the equation becomes the following:
Entropy = -(4/9) log2(4/9) - (2/9) log2(2/9)
          - (3/9) log2(3/9)
        ≈ 1.5305 bits
Therefore, you expect to get about 1.5305 bits of
information each time you choose a ball from the bin.
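
This figure can be reproduced in a couple of lines (a sketch; math.log2 gives the base-2 logarithm used throughout these slides):

import math

probs = [4/9, 2/9, 3/9]
print(-sum(p * math.log2(p) for p in probs))  # ~1.5305 bits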
Clear things up.
• Does entropy have a range from 0 to 1?
– No. However, the range is set by the
number of outcomes.
– The range of entropy is:
0 ≤ Entropy ≤ log(n), where n is the number of
outcomes
– Entropy 0 (minimum entropy) occurs when one of
the probabilities is 1 and the rest are 0.
– Entropy log(n) (maximum entropy) occurs when
all the probabilities have the equal value 1/n.
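
Both extremes are easy to verify (a sketch; the p > 0 guard reflects the usual convention that 0 · log(0) = 0):

import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([1, 0, 0]))        # 0 bits (minimum entropy; may print as -0.0)
print(entropy([1/3, 1/3, 1/3]))  # ~1.585 bits (maximum entropy)
print(math.log2(3))              # 1.5849..., i.e. log(n) for n = 3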
If you want more information…
• http://csustan.csustan.edu/~tom/sfi-csss/info-theory/info-lec.pdf
– Look at pages 15 to 34. This is the material I read and
used to prepare all the information on the current
PowerPoint slides. Very simple and easy for students
to understand.

• http://ee.stanford.edu/~gray/it.pdf
– Look at chapter two of this PDF file; it has a very
good, detailed explanation of entropy and information
theory.
