Author: Paul J. Nahin Publisher: Princeton University Press Pages: 244 ISBN: 9780691151007 Audience: Electronics enthusiasts interested in the origins of computing Rating: 4 Reviewer: Harry Fairhead
George Boole and Claude Shannon may be from different centuries and different countries, but they worked on the same sorts of problems. Arguably they built the foundations for modern computing.
This is not a biography, even if its title and subject matter might lead you to believe it is. Instead it is a slightly odd journey through the ideas of logic and information that underpin modern computing.
What is odd about the journey is that the route is via the original ways that Boole and Shannon thought about and realized their ideas. So instead of having logic described in terms of modern logic gates, we have a description of how relays work. If the intention is to teach the reader computing fundamentals, this doesn't seem like the best approach. In fact, it is difficult to know exactly what the intention of the book is, apart from describing some of what Boole and Shannon did.
The book opens with a chapter that outlines what you need to know to read the book. This is completely misleading and probably unnecessarily off-putting. It seems to suggest that you don't need to be an electronics genius, which is true, but you do need to know about matrix multiplication and a bit of basic electricity. Next we have a sort of mini test, which explains that you can implement an upward convex function of shaft angle using potentiometers. Yes, the term "upward convex" is introduced on page 2, and this is about as off-putting to the beginner as it possibly could be. The example is subtle and was first proved by Shannon. It really isn't the sort of thing a reader knowing only a little electricity should be expected to tackle, and my guess is most won't. This is a shame because it is almost the toughest topic in the entire book!
Chapter 2 gives an outline of Boole's approach to logic, which isn't quite what we use today. Boole thought in terms that were appropriate for his time: he tried to introduce a logic that was like arithmetic but based on set theory. This probably isn't a good way to learn about logic, but it is an interesting way of thinking about things.
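To give a flavor of what "arithmetic-like" means here, the following is a sketch in modern code, not the book's notation: Boole modelled classes with values 0 and 1, where multiplication is intersection (AND), 1 - x is the complement (NOT), and his "law of thought" is that x squared equals x.

```python
# A sketch (not the book's notation) of Boole's arithmetic-like logic.
# Classes are modelled as 0/1 values.

def AND(x, y):
    return x * y          # Boole: the class of things in both x and y

def NOT(x):
    return 1 - x          # Boole: everything not in x

def OR(x, y):
    return x + y - x * y  # modern inclusive OR; Boole's own x + y
                          # was only defined for disjoint classes

# Boole's idempotent law x*x = x holds for the only admissible
# values, 0 and 1 -- this is what singles out "logical" quantities:
for x in (0, 1):
    assert x * x == x
```

The point of the sketch is the idempotent law: it is the algebraic property that separates Boole's logic from ordinary arithmetic.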
The next chapter is a potted biography of Boole and Shannon. This is probably the only chapter that will be of interest to the general reader and it is a shame it is in the middle of material that would put them off from progressing this far, or any further, into the book.
Chapter 4 continues the description of logic as thought of by Boole, and this throws in extracts that even a reader with basic algebra would find difficult to read, let alone understand, e.g.
x(y(U)) = y(x(U))
If you stick with it then you do eventually get to the usual logical calculus, but again the pages are littered with lots of equations. This is quite deep stuff and even gets to mention Karnaugh and explain his maps and how they are used to simplify logical expressions.
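A Karnaugh map simplifies an expression by grouping adjacent 1s in a grid of minterms. A quick way to see what that buys you, using a hypothetical example rather than one from the book: the three-variable function with minterms 1, 3, 5 and 7 reduces, as the map's single four-cell group would show, to just c.

```python
from itertools import product

# Hypothetical example: f = a'b'c + a'bc + ab'c + abc (minterms 1,3,5,7).
# A Karnaugh map groups all four 1-cells into one block, giving simply c.

def f(a, b, c):
    return (not a and not b and c) or (not a and b and c) \
        or (a and not b and c) or (a and b and c)

def simplified(a, b, c):
    return c

# Verify the simplification over the whole truth table:
for a, b, c in product([False, True], repeat=3):
    assert f(a, b, c) == simplified(a, b, c)
```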
Chapter 5 moves into a completely different area: relay logic. This explains how relays work and how they can be used to build logic gates. The chapter works its way up from basic gates to bistable latches, i.e. memory circuits.
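The memory element the chapter builds up to can be sketched in code: two cross-coupled NOR gates form a set-reset latch. The book does this with relays; the gate-level version below is the modern equivalent, iterated until the feedback loop settles.

```python
# A rough gate-level sketch of a set-reset latch: two cross-coupled
# NOR gates whose feedback gives the circuit one bit of memory.

def nor(a, b):
    return not (a or b)

def sr_latch(s, r, q):
    """Iterate the cross-coupled NORs until the outputs settle."""
    qbar = not q
    for _ in range(4):  # a few passes is enough for this small circuit
        q = nor(r, qbar)
        qbar = nor(s, q)
    return q

q = False
q = sr_latch(s=True, r=False, q=q)   # set: q becomes True
assert q is True
q = sr_latch(s=False, r=False, q=q)  # hold: q remembers True
assert q is True
q = sr_latch(s=False, r=True, q=q)   # reset: q becomes False
assert q is False
```

The "hold" step is the whole point: with both inputs low, the output depends on the circuit's previous state, which is what makes it a memory circuit rather than a gate.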
Next we have a look at probability, a subject both Boole and Shannon used. The most important of Shannon's achievements, his use of probability to create information theory, is more or less sidelined in favor of a closer look at how probability works, how it is like logic, and how to work with it.
The theme continues in the following chapter, where we do get to meet one of Shannon's greatest results, the channel capacity theorem, but without really having the ideas of information theory explained in any great detail. If you already know about information theory this is interesting; if not, you will wonder what it is all about. The book really treats Shannon as a designer of old-fashioned computing hardware rather than as the father of information theory.
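For readers who do know some information theory, the flavor of the result is easy to state for the standard binary symmetric channel: capacity is C = 1 - H(p), where H is the binary entropy of the crossover probability p. This is textbook Shannon, not a worked example taken from the book.

```python
from math import log2

# Capacity of the binary symmetric channel: C = 1 - H(p),
# where H(p) is the binary entropy of the error probability p.

def binary_entropy(p):
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    return 1 - binary_entropy(p)

assert bsc_capacity(0.0) == 1.0        # noiseless: 1 bit per channel use
assert abs(bsc_capacity(0.5)) < 1e-12  # pure noise: nothing gets through
```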
From here we move into the area of sequential logic and deep into engineering. Chapter 9 is about Turing machines, and I'm not at all clear why. Shannon might have been interested in AI, but Turing machines don't really have anything to do with any sort of practical computation. Even so, they are explained in great detail with state diagrams, and we even get to the universal Turing machine, but without really doing justice to its contribution to mathematical logic. The final chapter considers computation beyond Boole and Shannon: quantum computers, reversibility and so on. All topics that arguably could be in a general book on computers, but not really relevant to Boole or Shannon or the subjects they helped to create.
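The state diagrams the chapter dwells on amount to a transition table, and a tiny machine shows the idea. This is a hypothetical example, not one from the book: a one-state-plus-halt machine that flips every bit on the tape and stops at the blank.

```python
# A minimal Turing machine (hypothetical example): flip every bit,
# halt on the blank symbol "_".
# Transition table: (state, symbol) -> (write, move, next_state)
rules = {
    ("flip", "0"): ("1", +1, "flip"),
    ("flip", "1"): ("0", +1, "flip"),
    ("flip", "_"): ("_", 0, "halt"),
}

def run(tape, state="flip", head=0):
    tape = list(tape)
    while state != "halt":
        write, move, state = rules[(state, tape[head])]
        tape[head] = write
        head += move
    return "".join(tape)

assert run("1011_") == "0100_"
```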
Overall this isn't a particularly well organized book, and it isn't aimed at the beginner. To get anything out of it you probably have to be an electronics enthusiast with an interest in computing and where the ideas came from. Having said this, I enjoyed it, but then I fit the description of the ideal reader. In my opinion this book is going to disappoint a lot of readers expecting a biography and a lot of readers expecting a light general treatment of logic and information theory.
If you are the, I suspect, rare reader for whom the book is ideal, then you will enjoy it; otherwise you are simply going to wonder what it is all about.
Rapid Android Development
Author: Daniel Sauter Publisher: Pragmatic Bookshelf Pages: 392 ISBN: 9781937785062 Audience: Programmers Rating: 3.5 Reviewer: Mike James
What could "rapid" Android development be all about?

jQuery Mobile: Develop and Design
Author: Kris Hadlock Publisher: Peachpit Press Pages: 304 ISBN: 9780321820419 Audience: JavaScript Programmers Rating: 2 Reviewer: Ian Elliot
jQuery Mobile is an important open source library for creating mobile web sites and apps. Can this slim book provide everything you need to know?
