Understanding Information

We live in a revolutionary age: the age of information. Never before in history have we been able to acquire, record, communicate, and use so many different forms of information. Never before have we had access to such vast quantities of data of every kind. Never before have we been able to translate information from one form to another so easily and quickly. And, of course, never before have we had such a need to comprehend the concepts and principles of information.

Information is a paradox. On the one hand, it is physical. Whenever we communicate or record information, we make use of a physical medium, such as sound, light, radio waves, electrical signals, or magnetic patterns. On the other hand, information is abstract. The messages carried by our physical signals are not identical to the signals themselves.

That dichotomy is the key to the amazing mutability of information: a message can pass from letter to laser pulse, from radio signal to sound wave, and yet somehow remain the same.

Perhaps the greatest example of information mutability in all of human history is the invention that gave us human history in the first place: writing.

Human speech is a pattern of sound. Spoken words travel a short distance, then fade away. But in writing, these spoken words are rendered as patterns inscribed on a surface. In this form, they become durable rather than ephemeral.

Writing represents speech, and in alphabetic writing, each symbol represents a sound. But the rule of association between the symbol and the message—the code—is entirely arbitrary. There is no necessary connection between one set of scratches and the sound /p/. Writing works because both writer and reader know the code. Without the code, there is no way to understand what the symbols mean.

The essential concepts of information—the concepts of message, symbol, and code—are not new in human experience. What is new is the jaw-dropping multiplicity of uses to which we have put those concepts in the age of information.

 

INFORMATION THEORY

In 1948, the American engineer and mathematician Claude Shannon published a paper that set forth the elements of a new science called information theory.

Shannon first realized that the mathematical idea of information is not the same thing as meaning. The essential communication process is the same no matter what the message means.

In the same way, the idea of information is not about the value or significance of a message. Meaning, value, and significance are obviously important qualities, but they are not the keys to understanding what information is.

What’s left, according to Shannon, is the distinction between different messages. Every message that is communicated is one particular message out of a set of many possible messages. Thus, Shannon’s definition of information is as follows: “Information is the ability to distinguish reliably among possible alternatives.”

If information is all about the distinction between possible messages, then there is a fundamental “atom” of information—the binary distinction between just two possible messages. The messages might be yes and no for a simple question, on and off for an electrical signal, or the binary digits 1 and 0. It doesn’t matter what the messages mean; it only matters that there are two possible alternatives. Such a simple two-value message is called a bit, short for “binary digit.”
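
To make this concrete, here is a minimal sketch showing how the same single bit can stand for any pair of alternatives. The pairings below are illustrative choices, not any standard assignment.

```python
# A single bit distinguishes exactly two alternatives.
# Which two things 0 and 1 stand for is a matter of convention;
# the pairings below are illustrative, not any standard assignment.
MEANINGS = {
    "answer to a question": {0: "no", 1: "yes"},
    "electrical signal":    {0: "off", 1: "on"},
    "binary digit":         {0: "0", 1: "1"},
}

bit = 1
for context, meaning in MEANINGS.items():
    print(f"{context}: the bit {bit} means {meaning[bit]!r}")
```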

Shannon pointed out that bits form a kind of universal currency for information. Our basic “alphabet” needs only two distinct symbols. And because we can transform information—freely switch from one code to another—any sort of information, such as numbers, words, or pictures, can be represented by bits: arrangements of binary digits, 0s and 1s.
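
As a rough illustration of that idea, the sketch below turns a short word into a string of 0s and 1s and back again. It uses the 8-bit ASCII character code, which is an assumption made here for illustration only; the principle is the same for any code that writer and reader share.

```python
# Sketch: any text can be written as bits and recovered, because
# sender and receiver share the same code (here, 8-bit ASCII, chosen
# only for illustration).

def text_to_bits(text: str) -> str:
    """Encode each character as 8 binary digits."""
    return "".join(format(byte, "08b") for byte in text.encode("ascii"))

def bits_to_text(bits: str) -> str:
    """Group the bits back into bytes and decode them."""
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("ascii")

bits = text_to_bits("INFO")
print(bits)                 # 01001001010011100100011001001111
print(bits_to_text(bits))   # INFO
```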

A concrete example is the code that was established for teletype machines. In this code, each letter of the alphabet is represented by 5 bits. The letter A is 11000, the letter B is 10011, a space is 00100, and so on.
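
Here is a brief sketch of that code in action, using only the three 5-bit patterns quoted above; the full teletype alphabet assigns a pattern to every letter, but those entries are omitted here.

```python
# The three 5-bit code words quoted above, from the teletype code.
# The full code covers the whole alphabet; these entries suffice to
# illustrate the principle.  Five bits allow 2**5 = 32 distinct patterns.
TELETYPE_CODE = {
    "A": "11000",
    "B": "10011",
    " ": "00100",
}
DECODE = {bits: char for char, bits in TELETYPE_CODE.items()}

def encode(message: str) -> str:
    """Replace each character with its 5-bit pattern."""
    return "".join(TELETYPE_CODE[ch] for ch in message)

def decode(bits: str) -> str:
    """Read the bit string back, 5 bits at a time."""
    return "".join(DECODE[bits[i:i + 5]] for i in range(0, len(bits), 5))

coded = encode("AB BA")
print(coded)          # 1100010011001001001111000
print(decode(coded))  # AB BA
```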

(To be continued…)