The Computer - What's The Big Idea?
Written by Mike James   
Thursday, 08 October 2020

The Essential Idea

So far we haven’t really defined what the essential computer idea is.

We’ve talked around it, said what it isn’t, but it still remains slightly elusive, slightly shadowy.

The reason is that it really is quite a subtle idea and it is difficult to explain it without reference to the particular technology used to realise it.

To take the example of George Stephenson and the “moving machine”, you can see that it is easier to give examples - carts, chariots, steam engines, cars and so on - than it is to capture the essence of what a “moving machine” is.

A computer clearly has to have a program - i.e. a list of instructions that controls what it does - and those instructions have to be able to directly influence what the system does next. That is, you can't imagine a little human, a homunculus, sitting inside the computer reading the program and pulling levers. The program has to be the lever and it has to do the pulling directly. In a computer there has to be a direct connection between the program and what happens, i.e. the program is part of the mechanism. This is the most difficult part of the idea to grasp.

For example, if the computer has an instruction which retrieves a value from one memory location and stores it in another, then the instruction has to actually cause the computer to do the move. That is, when the instruction is executed its representation, its instruction code, has to cause the memory location to be accessed and the value moved to the new location. It isn't just an instruction telling the machine what to do - it actually makes the machine do it.
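
To make the idea concrete, here is a minimal sketch, in Python, of a fetch-execute loop in which decoding an instruction directly causes the action it describes. The two-instruction machine, its instruction format and its memory layout are invented purely for illustration - they don't correspond to any real processor.

```python
# A toy machine: the program is just data, and executing an instruction
# directly drives the action it describes.

memory = [0] * 16          # data memory: sixteen numbered pigeonholes
memory[3] = 42             # put a value somewhere so there is something to move

# Each instruction is a tuple: (opcode, source location, destination location).
program = [
    ("MOVE", 3, 7),        # copy the contents of location 3 into location 7
    ("HALT", 0, 0),
]

pc = 0                                 # the program counter selects the next instruction
while True:
    opcode, src, dst = program[pc]     # fetch and decode...
    pc += 1
    if opcode == "MOVE":               # ...and the instruction code causes the access
        memory[dst] = memory[src]
    elif opcode == "HALT":
        break

print(memory[7])                       # prints 42 - the move really happened
```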

However, this isn't enough. There are machines that seem to be capable of being programmed, but they don't have enough flexibility and power to compute everything. That is, they are not as powerful as a Turing machine. There are algorithms that such a machine cannot execute even though a Turing machine can.

So to be a computer, a universal computer, a programmable device has to be equivalent to a Turing machine. In the jargon it has to be Turing complete. 

Showing that a computational device is Turing complete can be difficult, but the most common way of doing the job is to demonstrate that the device can simulate, or behave like, a universal Turing machine, or that it can do the same computations as a system already known to be Turing complete.
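
As a sketch of what "simulating a Turing machine" involves, here is a small one-tape Turing machine runner in Python. The particular machine shown - its states, symbols and transition table - is a toy example invented for illustration: it simply flips every bit of a binary string and halts.

```python
def run_turing_machine(transitions, tape, state="scan", blank="_"):
    """Run a one-tape Turing machine until it enters the HALT state."""
    cells = dict(enumerate(tape))       # sparse tape: position -> symbol
    head = 0
    while state != "HALT":
        symbol = cells.get(head, blank)
        # The transition table is the program: it determines what is written,
        # which way the head moves and which state the machine enters next.
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# A toy machine that scans left to right, flipping each bit until it hits a blank.
flip_bits = {
    ("scan", "0"): ("scan", "1", "R"),
    ("scan", "1"): ("scan", "0", "R"),
    ("scan", "_"): ("HALT", "_", "R"),
}

print(run_turing_machine(flip_bits, "10011"))   # prints 01100
```

A real demonstration of Turing completeness would show that the device can run any such transition table, not just this one, but the mechanics of the simulation are exactly those of the loop above.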

The Principle Of Computational Equivalence

So how common are Turing complete systems?

The answer is that they seem to be very common. Even very simple computing systems have been proved to be Turing complete. Stephen Wolfram has even postulated the principle of computational equivalence, which, while difficult to pin down precisely:

"Almost all processes that are not obviously simple can be viewed as computations of equivalent sophistication"

seems to say that just about everything that gets over a basic stupidity test is capable of universal computation. The differences are only in how efficient the system is at the computation you are interested in. 

So, for example, the human brain and the weather system of the planet Earth are (probably) computationally equivalent and both are universal.

This sounds odd, but what it means is that you could perform a computation by setting the human brain up in a particular state and then letting it develop. You could do the same thing with the weather system - set up a weather state and allow it to develop to run the program - but you might have to wait a little longer for the result and you might get wet along the way.

What differs between the two systems is merely the way the program is encoded and the mapping of inputs and outputs. They are both universal for computation and equivalent to a universal Turing machine.

What Wolfram is saying is that this situation is the rule rather than the exception: as long as a system isn't so simple that its behaviour is clearly restricted to the trivial, it probably is universal.
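
A small illustration of "set up a state and let it develop" is the Rule 110 cellular automaton, which Matthew Cook proved to be Turing complete. The sketch below just evolves an arbitrary starting pattern in Python; encoding a real program into the initial cells is possible in principle but far more involved, and no attempt at it is made here.

```python
RULE = 110   # each 3-cell neighbourhood, read as a 3-bit number, indexes a bit of 110

def step(cells):
    """Compute the next generation of the one-dimensional automaton."""
    n = len(cells)
    return [
        (RULE >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# The "program" is nothing more than the initial state of the cells.
cells = [0] * 63 + [1]              # a single live cell at the right-hand end
for _ in range(30):                 # let the system develop for 30 generations
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```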

So now we have a wonderful abstract definition of what is and what is not a computer, but in the real world the engineering matters. It took many years and many clever ideas to turn the crude internal combustion engine into something that cruises the motorways at the speed limit.

The same is true only more so of the computer.

Turing may have had an imaginary computer, but you couldn't use it to play Space Invaders or run Linux.

To make the modern computer work, you need lots of clever ideas to determine how to implement Babbage’s mill, store and input-output devices.

Beyond this, you also need to know how to make the symbols do what you want them to, and to do it reliably. Both tasks involve great ideas that are essential to understanding what your computer can and cannot do, today or ever.


Related Articles

What is a Turing Machine?

The Universe as a Computer

A New Computational Universe - Fredkin's SALT CA

A Computable Universe - Roger Penrose On Nature As Computation

A New Kind of Science Is Ten

It's life but not ...

Brain-like computing?

How Memory Works

 

