Shannon entropy
Noun

Shannon entropy

  1. information entropy
    Shannon entropy H is given by the formula H = -\sum_i p_i \log_b p_i, where p_i is the probability of character number i appearing in the stream of characters of the message.
         Consider a simple digital circuit which has a two-bit input (X, Y) and a two-bit output (X and Y, X or Y). Assuming that the two input bits X and Y have mutually independent chances of 50% of being HIGH, the input combinations (0,0), (0,1), (1,0), and (1,1) each have a 1/4 chance of occurring, so the circuit's Shannon entropy on the input side is H(X, Y) = 4\left(-\tfrac{1}{4} \log_2 \tfrac{1}{4}\right) = 2. The possible output combinations are (0,0), (0,1), and (1,1), with respective chances of 1/4, 1/2, and 1/4 of occurring, so the circuit's Shannon entropy on the output side is H(X \text{ and } Y, X \text{ or } Y) = 2\left(-\tfrac{1}{4} \log_2 \tfrac{1}{4}\right) - \tfrac{1}{2} \log_2 \tfrac{1}{2} = 1 + \tfrac{1}{2} = 1\tfrac{1}{2}, so the circuit reduces (or "orders") the information passing through it by half a bit of Shannon entropy due to its logical irreversibility (a numerical check of these values is sketched below).
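    The following is a minimal Python sketch, not part of the Wiktionary entry, of the formula above; the function name shannon_entropy and the probability lists are illustrative assumptions used to check the input-side and output-side entropies of the circuit example.

      import math

      def shannon_entropy(probs, base=2):
          # H = -sum_i p_i * log_b(p_i); outcomes with zero probability contribute nothing
          return -sum(p * math.log(p, base) for p in probs if p > 0)

      # Input side: (0,0), (0,1), (1,0), (1,1), each with probability 1/4
      print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits

      # Output side: (0,0), (0,1), (1,1) with probabilities 1/4, 1/2, 1/4
      print(shannon_entropy([0.25, 0.5, 0.25]))          # 1.5 bits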
This text is extracted from Wiktionary and is available under the CC BY-SA 3.0 license.