Richard Dawkins: Information Theory Podcast
Week Seven

Yes, we really are up to week seven in our series of podcasts from Richard Dawkins! Don’t forget to listen to the previous podcasts. Richard Dawkins is the bestselling author of The Selfish Gene and The God Delusion. He’s also a pre-eminent scientist, the first holder of the Charles Simonyi Chair of the Public Understanding of Science at Oxford, and a fellow of New College, Oxford. His most recent book is The Oxford Book of Modern Science Writing, a collection of the best science writing of the last century.

This week, Dawkins tells us about Claude Shannon and Warren Weaver, scientists who were greatly influential in the study of communication. Shannon is best known for inventing information theory.

[audio:claudeshannon.mp3] Transcript after the jump.

DORIAN DEVINS: Claude Shannon, another interesting person, of course, mathematician and computer giant.

RICHARD DAWKINS: Yes, Claude Shannon. I think he didn’t work for a university; I think he worked for the Bell Telephone labs, and he invented information theory. Information theory is really very important in all sorts of different fields. For Shannon himself, as a telephone engineer, what he was trying to get at was a sort of metric of the economics of communication. Clearly, it costs money to send messages via telephone wire, or through a satellite, or wherever messages are sent, and Shannon worked out how to measure the information content of messages. And he realized that a lot of a message is actually unnecessary. He called it redundant. If you’re being very economical with your message, you don’t want to send redundancy. Redundancy is something which is perhaps unnecessary because it’s already there in other parts of the message. Telegrams—well, they don’t exist anymore, do they—when one used to send telegrams, one left out an awful lot of words because they were redundant: you left out the word “the” and the word “a” and the word “an,” for example. So Shannon mathematicized that, if that’s the right word, and worked out how to measure the true information content of a message.

His unit of information was the bit, the binary digit, and the bit (one bit) is a measure of the reduction in uncertainty of the receiver of a message. So if you know that a baby’s been born, and you want to know what sex it is, is it a boy or a girl, and I tell you it’s a girl, then you’ve received one bit of information, because one bit is the amount of information needed to halve the prior uncertainty of the receiver. The prior uncertainty of the receiver, in the case of a baby being born, is that it could be a boy or it could be a girl; it’s 50/50, approximately 50/50. So there are two possibilities in the receiver’s mind before the message comes. When the message comes, that’s been halved and there’s only one. That’s one bit of information. That’s a very, very simple example, and what Shannon worked out was how to do the same kind of trick for any message, not just simple messages like it’s a boy or it’s a girl, but any message you like.

And information, as measured in bits (it’s a logarithmic measure), is used all over the place. Not just in communication theory anymore, but in ecology, in the study of animal communication, in the study of animal diversity; in all sorts of fields, information theory is used. Shannon was a very seminal figure.
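To put a number on Dawkins’s example: Shannon’s measure assigns an event of probability p an information content of log2(1/p) bits, so learning the answer to a roughly 50/50 question delivers exactly one bit. Here is a minimal Python sketch of that arithmetic; the function names are our own, for illustration, and nothing here comes from Shannon’s paper itself:

import math

def information_bits(p):
    # Shannon information content of an event with probability p,
    # in bits: I(p) = log2(1/p). A p = 0.5 event carries exactly 1 bit.
    return math.log2(1 / p)

def entropy_bits(probs):
    # Shannon entropy of a distribution, H = -sum(p * log2(p)):
    # the average information per symbol or message, in bits.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Dawkins's example: a newborn is a boy or a girl, roughly 50/50.
# The answer halves the receiver's prior uncertainty: one bit.
print(information_bits(0.5))      # 1.0

# Eight equally likely outcomes need log2(8) = 3 bits each.
print(entropy_bits([1/8] * 8))    # 3.0

# 26 equiprobable letters would carry log2(26), about 4.7 bits apiece.
print(entropy_bits([1/26] * 26))  # about 4.70

The last line also hints at redundancy in Shannon’s sense: because the letters of real English are far from equally likely, actual text carries fewer bits per letter than that uniform maximum, and that slack is exactly what telegram writers exploited when they dropped “the,” “a,” and “an.”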

DEVINS: You also have Warren Weaver, who wrote with Claude Shannon.

DAWKINS: Yes. Weaver, I think, is best thought of as a person who collaborated with Shannon and wrote in perhaps a more comprehensible style than Shannon did. The most famous paper is by Shannon and Weaver. And in fact, in my Oxford Book of Modern Science Writing, the chapter is actually by Weaver.
