Was the ABC the 1st Computer?
If you are at all interested in the history of the computer you can't help debating the question of who actually invented the first machine. It isn't a particularly meaningful question, because reality usually cannot be summarised by a neat list of who invented what. There is no doubt that Babbage conceived of the idea of a computing machine long before the technology was ready - but after that there are many competing claims to have built the first computer.
When the technology finally made it possible, a number of people seem to have reinvented Babbage's idea. The real problem is that it is difficult to know what to accept as the first real stored program computer. However, if you really must have a clear-cut answer, why not just accept the rule of law - John V Atanasoff is the inventor of the computer, and that's a legal ruling.
John Vincent Atanasoff (1903-1995)
Atanasoff's father was a Bulgarian immigrant to the USA. He worked as a mining engineer in Florida, and he and his wife, a school teacher, encouraged their son's early interest in mathematics. At nine years old John discovered the slide rule his father had just bought and played with it as his most fascinating toy. Of course, the slide rule was a simple analogue computer.
A slide rule
John studied physics and chemistry and read everything he could lay his hands on. By the time he reached high school he had decided to become a theoretical physicist. In the end he went to the University of Florida and studied electrical engineering. A graduate studentship at Iowa State College followed, and he gained a master's degree in maths. Then on to the University of Wisconsin to complete a PhD. His thesis was on the dielectric constant of helium - about as far from computing as you can get - but he studied electronics in his spare time for fun. Eventually he became associate professor of maths and physics at Iowa State College.
A machine to solve equations
Like many scientists of the time, Atanasoff had big problems doing calculations. He organised his graduate students into teams working with mechanical Monroe calculators, but solving equations was still too slow a task. He thought of using mechanical analogue computers of the sort that were becoming important due to the work of Vannevar Bush, but these could only solve ordinary differential equations - and Atanasoff wanted to solve systems of partial differential equations. In short, he needed a powerful digital computer. He considered using mechanical calculators and even started to modify an IBM calculator, but IBM complained about the damage to their leased machine! His wildest scheme was to drive thirty Monroe calculators from a single shaft rather than hand cranking each in turn!
Atanasoff knew what he needed was a digital computer but he didn't know how to go about building one. He was in the habit of going for a drive to help solve difficult problems and that's exactly what he did one winter evening in 1937.
"I went to assuage my internal torment on computing" he recalled many years later. The eighty mile journey didn't bother him. "I was very frustrated... By driving fast I could force myself to give attention to driving and not thinking."
At last he stopped at a roadhouse and ordered a bourbon and water.
He sat for three hours over his drink, and came to four conclusions. The first was that any digital computer should be built using electronic components. This seems obvious from our vantage point, but at the time electronic components were unreliable and all computers were mechanical or at best electro-mechanical. He reasoned that the unreliability of electronic components was more than made up for by their speed.
Secondly he concluded that binary was the base to work in. Again this seems obvious but all the computers of the day were decimal machines. Atanasoff's mother had taught him about number bases back in the days when he played with a slide rule. Now he realised that both storage and computation were easier in base 2.
His third idea was to make the machine serial in the sense that calculations could be performed a bit at a time. This made it possible to build the machine using limited hardware without limiting the number of bits used to represent the numbers.
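The serial idea can be sketched in a few lines of code. This is only an illustration of the principle - not Atanasoff's actual circuit design - assuming a single one-bit full adder reused for every bit position, least significant bit first, with one carry stored between steps:

```python
# Bit-serial addition: one full adder plus a carry bit handles numbers
# of any width, processed one bit at a time (least significant first).

def serial_add(a_bits, b_bits):
    """Add two equal-length little-endian bit lists with one full adder."""
    carry = 0
    result = []
    for a, b in zip(a_bits, b_bits):
        s = a ^ b ^ carry                             # sum bit
        carry = (a & b) | (a & carry) | (b & carry)   # majority = carry out
        result.append(s)
    result.append(carry)                              # final carry extends the word
    return result

def to_bits(n, width):
    return [(n >> i) & 1 for i in range(width)]

def from_bits(bits):
    return sum(b << i for i, b in enumerate(bits))

print(from_bits(serial_add(to_bits(13, 8), to_bits(29, 8))))  # 42
```

Note how the hardware cost (one adder, one carry store) stays fixed however many bits the numbers have - exactly the trade Atanasoff wanted.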
The final idea was much more specific and technical. He realised that it would be possible to store a bit as a high or low charge on a capacitor. A simple idea but the problem is that the charge leaks away. Atanasoff solved the problem using a regeneration process he called "jogging". This was the forerunner of the dynamic memory that all of our present day machines rely on.
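A toy simulation shows why regeneration is needed. The decay rate and read threshold here are invented values for illustration, not measurements of Atanasoff's capacitors:

```python
# Toy model of capacitor memory: stored charge leaks away each time step,
# so a 1 eventually reads as 0 unless a periodic "jog" rewrites it.

DECAY = 0.7        # fraction of charge surviving each step (assumed value)
THRESHOLD = 0.5    # read a cell as 1 if its charge is above this

def step(cells):
    """One time step: every cell loses some of its charge."""
    return [c * DECAY for c in cells]

def jog(cells):
    """Regenerate: read each cell and rewrite it at full strength."""
    return [1.0 if c > THRESHOLD else 0.0 for c in cells]

cells = [1.0, 0.0, 1.0]
for t in range(4):
    cells = step(cells)
print([c > THRESHOLD for c in cells])  # [False, False, False] - the 1s leaked away

cells = [1.0, 0.0, 1.0]
for t in range(4):
    cells = jog(step(cells))           # regenerate after every step
print([c > THRESHOLD for c in cells])  # [True, False, True] - data preserved
```

Modern DRAM controllers do essentially the same thing, refreshing every row on a fixed schedule before the charge falls below the read threshold.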