MEASURING INFORMATION: THE BIT

In any communication system the message produced by the source is one of several possible messages. The receiver will know what these possibilities are but will not know which one has been selected. Shannon observed that any such message can be represented by a sequence of fundamental units called bits, each of which is either a 0 or a 1. The number of bits required to specify a message depends on the number of possible messages: the more possible messages (and hence the more initial uncertainty at the receiver), the more bits are required.
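
This relationship can be sketched in a few lines of Python. The function name bits_required below is purely illustrative, and the calculation assumes the possible messages are all equally likely: it simply finds the smallest whole number of bits whose distinct patterns can cover every possibility.

    import math

    def bits_required(num_messages: int) -> int:
        # Smallest whole number of bits such that 2 raised to that number
        # is at least the number of possible messages.
        return math.ceil(math.log2(num_messages))

    print(bits_required(2))   # 1 bit: heads or tails
    print(bits_required(4))   # 2 bits: one of four suits
    print(bits_required(52))  # 6 bits: one specific card from a 52-card deck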

As a simple example, suppose a coin is flipped and the outcome (heads or tails) is to be communicated to a person in the next room. This outcome can be represented using one bit of information: 0 for heads and 1 for tails. Similarly, the outcome of a football game might also be represented with one bit: 0 if the home team loses and 1 if the home team wins. These examples emphasize one of the limitations of information theory: it cannot measure (and does not attempt to measure) the meaning or the importance of a message. It requires the same amount of information to distinguish heads from tails as it does to distinguish a win from a loss: one bit.
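
The point can be made concrete with a small Python sketch; the dictionary names here are illustrative, not standard notation. Both messages, however different in significance, are transmitted as the same single 0 or 1.

    # One bit suffices for any two-way distinction, regardless of what it means.
    coin_encoding = {"heads": 0, "tails": 1}
    game_encoding = {"home team loses": 0, "home team wins": 1}

    print(coin_encoding["tails"])            # 1
    print(game_encoding["home team wins"])   # 1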

When there are more than two possible outcomes, more bits are required. Suppose a playing card is chosen at random from a 52-card deck, and the suit of the chosen card (hearts, spades, clubs, or diamonds) is to be communicated. Communicating the chosen suit (one of four possible messages) requires two bits of information, using the following simple scheme: