Claude Shannon and Information Theory

Claude Shannon, “the father of information theory”, was born on April 30, 1916. An American mathematician, electrical engineer, and cryptographer, Shannon created information theory, which is now widely used in data compression, cryptography, and quantum computing.

You may use the word “bit” a lot and wonder where this strange word comes from. Well, it comes from Shannon! In 1948, Shannon published his famous paper, A Mathematical Theory of Communication, in which he used the “bit”, short for “binary digit” (a term he credited to John Tukey), to measure the amount of information. After that, this unit of information was adopted all over the world.

Thanks to Shannon, we can now use math to measure information. The word “entropy” describes the uncertainty of a source: the higher the entropy, the more possible messages there are, and the harder it is to guess the actual one. This concept has also been applied to natural languages. English has an average entropy of about 4.03 bits per letter, while Chinese has about 9.65 bits per character. That might be one reason why Chinese is harder to learn.
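To make the idea concrete, here is a minimal sketch of Shannon's entropy formula, H = −Σ p·log₂(p), applied to the symbol frequencies of a string (the function name and sample strings are illustrative, not from the original paper):

```python
import math
from collections import Counter

def entropy(text):
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(text)
    total = len(text)
    # Each symbol contributes -p * log2(p), where p is its relative frequency.
    return sum(-(c / total) * math.log2(c / total) for c in counts.values())

print(entropy("aaaa"))  # → 0.0 (one symbol, no uncertainty)
print(entropy("abcd"))  # → 2.0 (four equally likely symbols)
```

A string of one repeated symbol carries no uncertainty (0 bits), while four equally likely symbols need exactly 2 bits each, matching the intuition that higher entropy means more possible combinations to guess among.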

Published by:


Two things fill the mind with ever new and increasing admiration and awe, the more often and more steadily reflection occupies itself with them: the starry heavens above me, and the moral law within me.
