|Principles Of Execution - The CPU|
|Written by Harry Fairhead|
|Thursday, 13 October 2022|
The real complexity of any computer system resides in the processor, but do you know how it works? I mean how it really works? How does the code that you write turn into something that does something? When you know how, it's not magic - just a matter of "fetch" and "execute".
What Programmers Know
So far we have looked at the overall workings of a computer and specifically the memory principle. Memories are devices that, when given one input, automatically produce an associated output. The association is set up when the data is written to the memory. The memory principle and memory mechanisms are fascinating, but we need more - we need something that can execute instructions making use of the memory - we need a processor.
The processor is quite another level of difficulty.
The processor is the computer
There really is no question of the validity of this assertion.
If you don’t believe me, try running a program written for a PC with an x86 processor on a Raspberry Pi or an Android device with an ARM processor. There is no simple fix that can make a program designed for one run on the other - no matter how simple the program is.
The point is that computers with different processors are different – computers with the same processor are just faster or slower.
The details of memory management and caching may be impressive, but the real complexity of any computer system resides in the processor and it is time to look more closely at how it does what it does.
Even if you think you already know you still might find the explanation interesting. The reason is that many books and courses don’t really tell you the whole story. They stop short and leave you with a sense that the processor is somehow magic even though you know the rough outline of how it all should work.
Elsewhere we have discovered that what makes a computer is the intimate connection between processor and memory. When the processor places an address on the address bus, a particular memory location is selected and it either stores the data currently on the data bus or places its own contents onto the data bus.
Notice that this isn't magic. There isn't a little humanoid that goes and finds a particular memory location by address and then retrieves the contents for the CPU. The action is as automatic as a key in a lock. The CPU puts the address on the address bus and this selects and activates a particular memory location. The read/write line sets the memory location's behavior and it either places its content on the data bus or it "latches" or stores the contents of the data bus.
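The automatic, key-in-a-lock behavior described above can be sketched in a few lines of code. This is only an illustrative model - the class and method names are invented for this sketch, not taken from any real hardware interface - but it captures the idea that the address selects a location and the read/write line decides whether that location drives the data bus or latches it.

```python
# A minimal model of the memory principle: an address automatically
# selects one location; the read/write line decides whether the location
# places its contents on the data bus or latches the data bus.

class Memory:
    def __init__(self, size):
        self.cells = [0] * size            # one cell per address

    def access(self, address, rw, data_bus=None):
        """Simulate one bus cycle at the given address."""
        if rw == "read":
            return self.cells[address]     # cell drives the data bus
        else:
            self.cells[address] = data_bus # cell latches the data bus
            return data_bus

mem = Memory(256)
mem.access(0x10, "write", 42)    # store 42 at address 0x10
print(mem.access(0x10, "read"))  # prints 42
```

Notice there is no searching anywhere in the model: the address directly indexes the cell, just as the address bus directly activates one location.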
This might well be the major operating principle of a computer but it leaves out what the processor actually “does” with the data.
After all it is called a “processor” so presumably it doesn’t just store and retrieve bit patterns. We already know how binary patterns can be used to represent numbers and we know how Boolean logic can be used to manipulate them – with addition and subtraction.
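To see how Boolean logic alone can do arithmetic, here is a sketch of a ripple-carry adder built from nothing but AND, OR and XOR - one full adder per bit, each passing its carry to the next. The function names are illustrative, but the logic is the standard full-adder construction.

```python
# Addition from pure Boolean logic: a full adder combined one bit at a
# time, with the carry rippling from the low bit to the high bit.

def full_adder(a, b, carry_in):
    s = a ^ b ^ carry_in                              # sum bit: XOR of all three inputs
    carry_out = (a & b) | (a & carry_in) | (b & carry_in)  # carry when two or more inputs are 1
    return s, carry_out

def add(x, y, bits=8):
    carry = 0
    result = 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= s << i                              # assemble the result bit by bit
    return result

print(add(25, 17))  # prints 42
```

Subtraction works the same way: negate one operand in two's complement (invert the bits and add one) and feed it through the same adder - no new hardware needed.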
But this is only part of what goes on. When you first start to consider the workings of the processor it is usually arithmetic that the focus falls on. The reason is that we often, mistakenly, think of computers as “computers” but for the vast majority of the time a computer is actually doing something other than arithmetic.
Once you start looking a little more closely the magic seems to be more to do with how this lump of silicon, or whatever it is made from, can obey the commands in a program. How on earth does it look at the next instruction in a program, work out what it means and then arrange the immutable hardware to do it? Software may be soft but hardware is hard and it doesn't change depending on what the instruction wants it to do.
Once again there is a tendency to think of a little humanoid sitting where the processor is, waiting for the next instruction to appear and then doing whatever it commands. This is, of course, not how it happens and it is all just as automatic as the memory storage and retrieval.
The “trick” that the processor performs seems very complex but it is all based on building the complex from the simple and the very regular – but isn’t this always the principle when it comes to computers?
The first thing a processor needs is some way of keeping track of where it has reached in the program. This is done using a single internal memory location, usually called the “Program Counter” or PC – and it doesn't count programs! All internal memory locations within the processor are called “registers” for historical reasons and to indicate that they are generally just a little more than simple memory locations. For example, the PC register has two operations that it can perform. It can be initialized to a set value and it can be incremented, i.e. it can add one to the value stored in it.
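The two operations just described - initialize and increment - are all the PC needs, and they are easy to model. This is a hypothetical sketch (the class and method names are made up for illustration); the one detail worth noting is that a real register has a fixed width, so the count wraps round rather than growing without limit.

```python
# A sketch of the Program Counter: a register supporting exactly two
# operations - load a set value, and add one to the stored value.

class ProgramCounter:
    def __init__(self, width=16):
        self.mask = (1 << width) - 1       # registers have a fixed width
        self.value = 0

    def load(self, address):
        self.value = address & self.mask   # initialize to a set value

    def increment(self):
        # add one, wrapping round at the register's width
        self.value = (self.value + 1) & self.mask

pc = ProgramCounter()
pc.load(0x0100)
pc.increment()
print(hex(pc.value))  # prints 0x101
```

With just these two operations the processor can step through a program in memory: the PC supplies the address of the next instruction, and incrementing it moves execution on to the one after.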
|Last Updated ( Thursday, 13 October 2022 )|