Wednesday, February 2, 2011

Entropy (Information Theory)

http://en.wikipedia.org/wiki/Entropy_(information_theory)

In information theory, entropy is a measure of the uncertainty associated with a random variable. In this context, the term usually refers to the Shannon entropy, which quantifies the expected value of the information contained in a message, usually in units such as bits.
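
For reference, the Shannon entropy mentioned above is usually written as follows (this is the standard definition; the symbols p_i denote the probabilities of the possible outcomes of X and are not quoted from the excerpt):

H(X) = -Σᵢ pᵢ log₂(pᵢ)

With the base-2 logarithm, the result is measured in bits.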

Entropy is a measure of disorder or, more precisely, of unpredictability.

The extreme case is that of a double-headed coin that never comes up tails. There is no uncertainty at all, so the entropy is zero: each toss of the coin delivers no information.
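
As a quick sketch of how this plays out in practice (the function shannon_entropy below is just an illustrative name, not something from the article), the formula above can be applied to a fair coin, a biased coin, and the double-headed coin:

import math

def shannon_entropy(probs):
    # Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0.
    # Outcomes with zero probability contribute nothing to the sum.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Fair coin: maximum uncertainty for two outcomes -> 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# Biased coin: more predictable -> less than 1 bit per toss.
print(shannon_entropy([0.9, 0.1]))   # about 0.469

# Double-headed coin: the outcome is certain -> 0 bits per toss.
print(shannon_entropy([1.0, 0.0]))   # 0.0

The double-headed coin is exactly the zero-entropy extreme described above, while the fair coin is the opposite extreme: no two-outcome distribution is less predictable, so none has higher entropy.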
