
# Claude Shannon - Information Theory And More
Written by Historian
Wednesday, 13 March 2013

That is:

$$H = -\sum_{i} P_i \log_2 P_i$$

where the sum is taken over all messages $i$ and $P_i$ is the probability of message $i$, giving the information in bits.
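
To see the formula in action, here is a minimal sketch in Python; the function name and the example probabilities are mine, invented purely for illustration:

```python
import math

def entropy_bits(probabilities):
    """Shannon information in bits: -sum over all messages of P*log2(P)."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Four equally likely messages carry exactly 2 bits per message...
print(entropy_bits([0.25, 0.25, 0.25, 0.25]))  # 2.0
# ...while a biased source carries less information per message.
print(entropy_bits([0.7, 0.1, 0.1, 0.1]))      # ~1.36
```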

This is the measure of information that Shannon invented in 1948. He then went on to publish a series of papers - two in 1948 and one in 1949 - that presented the subject of information theory at a surprising level of completeness, an amazing achievement. Other people had been trying to find a measure of information for some time; Shannon not only found one but wrote down its complete theory in only a couple of years.

You can go back to the original papers and they read like an authoritative textbook on information theory, not research documents feeling their way towards an idea. He started the study of coding theory, covering the effect of noise, bandwidth and power, optimal codes, error correcting codes, data compression and all things binary.

Shannon also solved the radio transmission problem described above by stating what is now known as the Shannon-Hartley law, which says that given a bandwidth W and a signal-to-noise ratio R (as a power ratio, not in dB), the fastest you can reliably transmit is

$$C = W \log_2(1 + R)$$

bits per second. In other words, if you try to send data faster than this limit - because the bandwidth is too small or the noise too great - you get an increasing number of errors, which is what anyone who has used a high speed modem over a telephone line or a digital phone over a weak connection will tell you! As the noise on the line gets worse the modem drops back to a slower speed and so reduces the number of errors.
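
As a quick illustration of the law, here is a small Python sketch. The bandwidth and SNR figures are typical textbook values for an analogue telephone line, not taken from the article:

```python
import math

def shannon_capacity(bandwidth_hz, snr_db):
    """Shannon-Hartley limit C = W*log2(1 + R) in bits per second."""
    r = 10 ** (snr_db / 10)           # convert SNR from dB to a power ratio
    return bandwidth_hz * math.log2(1 + r)

# Assumed example values: roughly 3100 Hz of usable bandwidth and
# about a 30 dB signal-to-noise ratio on an analogue phone line.
print(shannon_capacity(3100, 30))     # ~30900 bits/s
```

The answer, around 31 kbit/s, is close to the speeds real dial-up modems eventually reached over ordinary phone lines.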

## The bit, the nit, the dit and the Hartley

Of course the unit of information that Shannon invented is universally called the "bit" standing for binary digit. However, there are other units of information.

If you take the log in Shannon's equation to the base e, i.e. a natural logarithm, then you get a measure of information called the nat or nit - I am completely serious here!

If you take the log to the base 10 then the unit is called the Hartley, after R. V. L. Hartley, who tried to work it all out before Shannon but didn't have the benefit of thinking in terms of binary numbers. The Hartley is also known as the dit.

You might also like to know that 1 bit is 0.69 of a nat and 0.3 of a dit - fancy describing something as storing 0.69 MegaNats or 0.3 MegaDits.
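
Those conversion factors are just the logarithm of 2 taken in the other bases, as a quick sanity check in Python confirms:

```python
import math

# One bit expressed in the other units - log(2) in the other base.
print(math.log(2))        # ~0.693 nats per bit
print(math.log10(2))      # ~0.301 dits (Hartleys) per bit

# And the other way round:
print(1 / math.log(2))    # ~1.443 bits per nat
print(1 / math.log10(2))  # ~3.322 bits per dit
```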

All I can say is that I'm glad Shannon got there first!

Without Shannon it is doubtful that we would still all be using decimal computers or be ignorant of the basic laws of data transmission and compression. Practical engineering has a way of dealing with issues as they arise, so someone would have come up with all the right ideas and theories, if only in retrospect. In this sense it is too strong to call Shannon the father of the bit, but he told us all about it before we knew just how important it would prove to be.

A Symbolic Analysis of Relay and Switching Circuits

#### Related Articles

The Logician and the Engineer

Coding Theory

Introduction to Boolean Logic

How Error Correcting Codes Work

Information Theory


