Information theory

From Simple English Wikipedia, the free encyclopedia

Information theory is a branch of applied mathematics and electrical engineering. It measures the amount of information in data that could have more than one value. In its most common use, information theory finds physical and mathematical limits on data compression and data communication. Data compression and data communication are statistical, because they must deal with values that are not known in advance. The amount of information in data measures how hard it is to guess for a person who does not know its value: the harder it is to guess, the more information it carries.

A key measure in information theory is "entropy". Entropy measures the amount of uncertainty in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (two equally likely outcomes) gives less information (lower entropy) than identifying the outcome of the roll of a fair die (six equally likely outcomes). Some other important measures in information theory are mutual information, channel capacity, error exponents, and relative entropy.
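The coin and die example can be made precise with a short calculation. This is a sketch, assuming the standard Shannon entropy formula with base-2 logarithms (so entropy is measured in bits), which the article does not spell out:

\[
H(X) = -\sum_i p_i \log_2 p_i
\]

Fair coin (two outcomes, each with probability $\tfrac{1}{2}$): $H = -2 \cdot \tfrac{1}{2}\log_2 \tfrac{1}{2} = 1$ bit.

Fair die (six outcomes, each with probability $\tfrac{1}{6}$): $H = -6 \cdot \tfrac{1}{6}\log_2 \tfrac{1}{6} = \log_2 6 \approx 2.58$ bits.

So, under these assumptions, learning the result of the die roll gives more information (about 2.58 bits) than learning the result of the coin flip (1 bit).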