|The Memory Principle - Computer Memory and Pigeonholes|
|Written by Harry Fairhead|
|Friday, 22 January 2016|
We discover why computer memory can be likened to pigeonholes and even include instructions for you to build your own memory device.
It is difficult to recapture the amazing moment when you finally understand how a computer works.
I don’t mean the understanding of a particular computer, its specific peculiarities etc, but the general understanding of how this wonderful trick works and what a simple trick it is.
Now the worrying part here is that some readers will have passed this moment of truth many years ago and will have forgotten the pleasure it gave them.
Some readers will think they have grasped it when really they haven’t. And finally of course there are the readers who know they don’t know what I am talking about at all!
This all makes for a big communication problem made worse by the fact that, to give you the underlying theory, I have no choice but to talk about particular realisations of it.
Let me give you a short example of this mismatch of world views.
The problem is very similar to thinking about the way a human might work. You can think of a human as a sort of machine in which a tiny human lives - presumably up in the head. The tiny human looks out through the eyes and probably pulls levers to make the real human do things. This is the homunculus theory of how humans work and it is clearly nonsense because you simply end up with an infinite regress of ever smaller humans pulling ever smaller levers.
However humans work it isn't by having another smaller human living inside.
"Infinite regress of homunculus" by Jennifer Garcia
The same sort of explanation tends to be used when it comes to computers, only in this case the error is far less obvious. People often describe how a computer works by imagining a small human inside the circuits looking to see what has to be done and doing it. Of course this isn't how computers work and it isn't even close. In fact the use of this sort of argument can completely hide the wonderful way that computers actually work and mystify the beginner.
Let me give you an example.
A long, long time ago, in the days when computers had valves, kept people warm and lived in huge buildings (one computer per building), I went to a “Young Person’s Lecture”. It was all about the new (then) art and science of the computer. I knew as little about this subject as it is possible to know, despite having written my first six-line Fortran program only a few weeks earlier.
The lecture was interesting and it had lots of pleasing “lantern slides” (a sort of early LCD projection unit) and the man pointed to a blackboard (like a whiteboard but black) with a stick (like a laser pointer but made of wood).
I found the whole thing really interesting, but I sat up most in the section where he promised to tell me how computer memory worked. I knew a tiny amount about memory – it was where my six-line Fortran program lived before the computer obeyed it – so this was my chance to find out how it worked.
The lecturer showed a slide of a large wooden construction something like a bookcase but divided up into compartments which he called “pigeonholes” and, yes, you could see that a family of pigeons might want to take shelter there.
He then went on to describe how each pigeonhole had an “address” and this enabled someone to store a pigeon at a particular “location” and then “retrieve” it. “This is how a computer’s memory works” is the final sentence I remember.
I left the lecture in a state of shock and stumbled home. I don’t think I recovered from the idea for many months and certainly Fortran programming was never the same again – possibly to this very day.
Now my guess is that you are probably thinking that my shock was due to the idea that they kept pigeons in computers?
No, that wasn’t it at all.
The real source of my trauma was that I could see that a set of pigeonholes could be described as a memory that things could be stored in and retrieved from. I could also see that my school bag fulfilled the same role. I could also see that if I left a light on in a room it “remembered” the fact that I had done it.
All of these esoteric ways of looking at the obvious did nothing to help me understand how computers, and computer memory in particular, worked. Yet the man who knew clearly thought it did, it should and it would.
To him the picture of a set of pigeonholes clearly represented the functional principle of a computer memory - but to me it was just a set of pigeonholes and how it could ever be part of something larger was, and still is, a mystery.
There really is a bigger idea here than storing pigeons.
After spending the intervening years thinking about why the pigeonhole model of computer memory just didn't work, the answer was presented to me courtesy of Babbage and his wonderful machine.
When Babbage moved on from his Difference Engine, which is best described as a special purpose calculator, to his real computer, the Analytical Engine, one of the things he invented was the “memory principle”.
People had used pigeonholes before to store and retrieve things but this didn't mean they had invented computer memory. Babbage’s machine needed to store decimal numbers and the first part of this problem is easy to solve by anyone familiar with the pigeonhole.
All he did was to take a stack of wheels: the number stored in a memory location was given simply by the position of each wheel – the units wheel, the tens wheel and so on.
By having more than one stack of wheels his machine could store more than one value. If you call the first stack A, the second B and so on then storing and retrieving numbers is just a matter of saying “store 10 in A”, “retrieve the value in C” and so on. You can see that just like the pigeonholes each memory location has an address, A, B, C and so on and a stored value.
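The two ideas above - a stack of decimal wheels representing a number, and named locations you can store to and retrieve from - can be sketched in a few lines of modern code. This is only an illustration of the principle, of course, not a model of Babbage's actual mechanism; the function names and the four-wheel stack size are my own choices:

```python
# A toy sketch of the memory principle: a "stack of wheels" is a
# list of decimal digits, and the named locations A, B, C form an
# addressable store. (Illustrative only - not Babbage's mechanism.)

def wheels_to_number(wheels):
    """Read a stack of wheels: wheels[0] is the units wheel,
    wheels[1] the tens wheel, and so on."""
    return sum(digit * 10 ** i for i, digit in enumerate(wheels))

def number_to_wheels(n, size=4):
    """Set a stack of `size` wheels to represent the number n."""
    return [(n // 10 ** i) % 10 for i in range(size)]

# The store: each address (A, B, C) names one stack of wheels.
store = {name: number_to_wheels(0) for name in "ABC"}

# "store 10 in A"
store["A"] = number_to_wheels(10)

# "retrieve the value in A"
print(wheels_to_number(store["A"]))  # prints 10
```

Notice that the address ("A") and the stored value (the wheel positions) are entirely separate things - exactly the distinction the pigeonhole picture gestures at.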
The Analytical Engine has no pigeons - anywhere.
An early punched card
This is where the account of computer memory usually stops and hence why pigeonholes and other such simple analogies are often used to explain it.
However, it leaves out perhaps the most important feature of a computer memory that makes it different and so much more than a set of pigeonholes.