This order-of-evaluation problem was what faced the IBM team assembled (no pun intended) to make the assault on computer science's equivalent of getting a man on the moon: the automatic translation of arithmetic expressions.
The team, under the leadership of John Backus, was to construct a compiler that could translate any arithmetic expression a programmer cared to throw at it.
To do this they had to invent a way of writing down, and making use of, the rules of grammar for artificial languages such as programming languages and mathematical expressions. The result was the production rules described above, together with a method of using them to parse the language and generate the correct assembly code.
The trick is to convert the expression into a syntax tree, which makes clear the relationships between the operators and their priorities. Once you have the syntax tree, you can generate the operations in the correct order simply by "walking" the tree. For a more detailed explanation see: Grammar and Torture.
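The idea can be sketched in a few lines of Python. This is not the original FORTRAN algorithm, just a minimal illustration: the tree for 2 + 3 * 4 is built by hand (the `Node` class and the stack-machine instruction names are inventions for this example), and a post-order walk emits the operations so that the higher-priority multiplication, which sits deeper in the tree, comes out first.

```python
class Node:
    """One node of a syntax tree: a number (leaf) or an operator."""
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

OPS = {"+": "ADD", "-": "SUB", "*": "MUL", "/": "DIV"}

def walk(node, code):
    """Post-order walk: emit both operands before their operator."""
    if node.left is None and node.right is None:
        code.append(f"PUSH {node.value}")      # a number: push it
    else:
        walk(node.left, code)
        walk(node.right, code)
        code.append(OPS[node.value])           # operator comes last
    return code

# Syntax tree for 2 + 3 * 4: the * is a subtree under the +,
# so the walk reaches it first.
tree = Node("+", Node("2"), Node("*", Node("3"), Node("4")))
print(walk(tree, []))
# -> ['PUSH 2', 'PUSH 3', 'PUSH 4', 'MUL', 'ADD']
```

Because the walk visits children before parents, the emitted code evaluates 3 * 4 before the addition, exactly the ordering the tree's shape encodes.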
The six-month project actually took two years: the first compiler was available in 1956, but it was April 1957 before working compilers were distributed to customers. The compiler consisted of 25,000 lines of machine code on a magnetic tape, and it was distributed, complete with bugs, to every IBM 704 installation.
The language was called FORTRAN, standing for FORmula TRANslation, and when they finally got it to work it revolutionized programming and made IBM the number one computer company for decades.
Notice that the name of the language was derived from FORmula TRANslation. Even though the language had many other new features, it was the fact that it could successfully convert an arbitrary arithmetic expression, or formula, into machine code that was its great claim to fame. Once this particular job was worked out, the rest of the compiler was a matter of comparatively simple bookkeeping.