History of Computer Languages - The Classical Decade, 1950s
Written by Harry Fairhead   

In the first of a series of articles about the development of computing languages, we look at the struggle to create the first high level languages.

If you are interested in the development of computer languages, see the other parts of this series covering the 1960s, 70s and 80s.

The Pioneer Spirit

The ten years from 1950 saw the development of the first computer languages. Moving from machine code to assembler was a natural step, but going beyond that took five years of work and the production of Fortran I.

At the close of the 50s the programming world had the trinity of Fortran, Cobol and Algol and the history of computing languages had completed its most critical phase. 

The history of computing is usually told in terms of the hardware and as a result we tend to think of progress as how much smaller, faster and cheaper computers are. Computing isn't just applied electronics and there is another side to the coin. Computing is also about programming and the history of programming languages contains much of the real story of computing.

Although computer hardware has changed dramatically in a very short time, its basic principles have remained the same. So much so that even Charles Babbage (1792-1871), the father of the computer, wouldn't have too much difficulty understanding an IBM PC. He might not understand transistors, chips, magnetic recording or TV monitors but he would recognise the same operational principle of a CPU working with a memory.

You could say that the only real difference between Babbage's computer and today's computer is the technology used to realise the design, and you would only be stretching the truth a little. However the same statement wouldn't be true of software. Babbage probably didn't even have a clear idea of programming as something separate from the design of his machine, a failing repeated by hardware designers ever since!

The first person you can pin the accolade of "programmer" onto is generally agreed to be Augusta Ada, Countess of Lovelace (1815-52). This unlikely sounding character helped Babbage perfect his design for a mechanical computer and in the process invented programming - but there are those who would argue that her role was much less.

Her contribution was recognised by the naming of a computer language, Ada, after her. By all accounts she was the archetypal programmer in more ways than one - she lost a great deal of money after dreaming up a crazy gambling algorithm!

The most important observation is that Ada did seem to grasp the idea that it was the software and the abstract expression of algorithms that made the machine powerful and able to do almost anything.

The point of all of this is that programming and programming languages have a life of their own separate from the hardware used to implement them. Without hardware they would be nothing more than abstract mind games but once you have a computer, no matter how primitive the implementation, most of the difficult and interesting ideas are in the development of the software.

Put as simply as possible, a program is a list of instructions and a computer is a machine that will obey that list of instructions. The nature of the machine that does the obeying isn't that complex, but the nature of the instructions that it obeys and the language used to write them is.

Seeing the problem

If you can program, even a little bit, the idea of a programming language seems blindingly obvious.

This makes it very difficult to understand that there was a time when the idea of a programming language was far less than obvious and even considered undesirable! Today programmers needn't know anything about the underlying design of the machine. For them a program is constructed in a programming language and it is that language that is reality. How the machine executes this program is more or less irrelevant.

We have become sophisticated by moving away from the simplistic computer hardware. However this wasn't the case in the early days. Early programmers worked in terms of "machine code" and this was virtually only one step away from programming with a soldering iron! It also has to be kept in mind that most programmers of the time were the people who had a hand in building the machine or were trained in electronics. This confusion between electronic engineering and programming persisted until comparatively recently.

A machine code programmer sees the machine as a set of numbered memory locations and the operations that can be carried out are given numeric codes. The program and the data have to be stored in memory and the programmer has to keep track of where everything is.

For example, a computer might have two operation codes

01 x y

meaning add the contents of memory location x to the contents of memory location y and

02 x y

meaning subtract the contents of memory location x from the contents of memory location y. A program would then be written as a long list of numbers such as 01 10 15 02 18 17 and so on. This program means add memory location 10 to 15 and then subtract memory location 18 from 17, but this is far from instantly obvious from a casual glance. To a programmer of the day, however, it would have been like reading standard English.
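
Just to make the execution model concrete, here is a rough sketch in Python of how such a machine might work through that list of numbers. Python itself, the size of the memory and the assumption that the result of an operation ends up in the second location are all illustrative choices - none of them describes any particular machine of the period.

# A toy interpreter for the hypothetical two-operand machine code above.
# Each instruction is "opcode x y"; the result is assumed to be stored
# back in location y.
def run(program, memory):
    for i in range(0, len(program), 3):
        op, x, y = program[i], program[i + 1], program[i + 2]
        if op == 1:        # 01 x y : add location x to location y
            memory[y] = memory[y] + memory[x]
        elif op == 2:      # 02 x y : subtract location x from location y
            memory[y] = memory[y] - memory[x]
        else:
            raise ValueError("unknown operation code %d" % op)

memory = [0] * 32
memory[10], memory[15], memory[17], memory[18] = 3, 4, 10, 6
run([1, 10, 15, 2, 18, 17], memory)   # the example program 01 10 15 02 18 17
print(memory[15], memory[17])         # prints 7 4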

Because they produced programs every day this list of operation codes was fixed in their mind and it was second nature both to read and write such programs. 

At this early stage programming was the art of putting the machine instructions together to get the result you desired. 

Machine Code Problems

Machine code had, and still has, one huge advantage - because it is something that the machine understands directly, it is efficient.

However its disadvantages are very serious and easily outweigh considerations of efficiency. It was difficult to train machine code programmers, it was far too easy to make mistakes and, once made, they were difficult to spot.

To understand machine code you have to know how the machine works, and this requires a certain level of sophistication and hardware knowledge. At first this didn't matter because the programmers were the people who built the machine and hence they found it all perfectly natural. However as soon as more programmers were needed the difficulty of explaining machine code to potential programmers who knew nothing about hardware became apparent and a real problem.

The problem with finding errors in machine code was simply down to the lack of immediate meaning that the codes had. After you had been programming for a while your brain started to read 01, say, as Add, but even then reading 01 10 24 as Add memory location 10 to memory location 24 didn't give any clue as to what memory locations 10 and 24 were being used for. Machine code is easy for machines to read but always difficult for humans to understand.

Assembler

The solution was to make use of short but meaningful alphabetic codes - mnemonics - for each operation. So instead of writing 01 10 15 a programmer would write ADD 10 15. Of course before the computer could make any sense of this it had to be translated back to machine code.

At first this was done by hand, but some unknown pioneer had the idea of getting the computer to do this tedious job - and the first assembler was born. An assembler is a program that reads mnemonics and converts them to their equivalent machine code.
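
To give a feel for how little the first assemblers had to do, here is a rough sketch in Python of a translator for the two operations introduced earlier. The SUB mnemonic and the numeric opcode values are simply assumptions carried over from that illustration, not the conventions of any real assembler.

# A minimal mnemonic-to-machine-code translator in the spirit of the
# earliest assemblers: each source line is a mnemonic plus two numeric
# memory addresses.
OPCODES = {"ADD": 1, "SUB": 2}       # codes taken from the example above

def assemble(source):
    machine_code = []
    for line in source.splitlines():
        if not line.strip():
            continue                 # skip blank lines
        mnemonic, x, y = line.split()
        machine_code += [OPCODES[mnemonic], int(x), int(y)]
    return machine_code

print(assemble("ADD 10 15\nSUB 18 17"))   # [1, 10, 15, 2, 18, 17]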

It is difficult to say who invented the assembler, presumably because it was invented in a casual sort of way in a number of different places. An influential book of the time, "The Preparation of Programs for an Electronic Digital Computer" by Wilkes, Wheeler and Gill (1951), is generally thought to be responsible for spreading the idea, and the authors are also often credited with the first use of the term "assembler" to mean a program that assembles another program consisting of several sections into a single program. In time the term was restricted to cover only programs that translate readable symbols into much less readable numeric machine code.

The first assemblers did just this, but once the idea of getting the machine to translate a symbolic program into numeric operation codes had taken hold it didn't take long to think up other jobs it could do to make programming easier.

The best and most important idea was the introduction of symbols used to represent the addresses of memory locations. For example, using such symbolic addressing a programmer could write ADD sum total and leave the assembler to work out where in memory sum and total would be stored.

Believe it or not this is a very sophisticated idea and needs a lot of additional programming in an assembler to make it work. Essentially you have to set up a table, the symbol table, where the names are collected and assign memory addresses to each symbol as the program is translated.
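
As a rough sketch of what this involves, the fragment below extends the toy assembler so that operands can be names as well as numbers. The starting address for variables and the single-pass allocation scheme are purely illustrative assumptions.

# Symbolic addressing: the first time a name is seen it is entered in the
# symbol table and assigned the next free memory location; later uses of
# the same name get the same address.
OPCODES = {"ADD": 1, "SUB": 2}

def assemble(source, first_free_address=10):
    symbol_table = {}                # name -> assigned memory address
    next_address = first_free_address
    machine_code = []
    for line in source.splitlines():
        if not line.strip():
            continue
        mnemonic, *operands = line.split()
        machine_code.append(OPCODES[mnemonic])
        for name in operands:
            if name not in symbol_table:
                symbol_table[name] = next_address
                next_address += 1
            machine_code.append(symbol_table[name])
    return machine_code, symbol_table

code, symbols = assemble("ADD sum total\nSUB count total")
print(code)      # [1, 10, 11, 2, 12, 11]
print(symbols)   # {'sum': 10, 'total': 11, 'count': 12}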

This introduced three important ideas -

  • the concept of what later became known as a symbolic variable or variable for short,
  • the idea of a symbol table, i.e. a store where you could look up a symbol and find out what it represented - machine code, data or the address of data,

and perhaps most important of all

  • the notion that programmers could use their own art to make programming easier.

This was the start of the language explosion.
