|The Trick Of The Mind - Little Languages Arithmetic|
|Written by Mike James|
|Monday, 07 March 2022|
Arithmetic Expressions and Programming
Arithmetic expressions are an important part of every programming language. In the early days of computing, converting arithmetic expressions into actions was a big problem for the same reasons that children struggle with them. Clearly the problem is interpreting the expression so as to get the correct order of operations. The arithmetic expression may not be written in the order in which it has to be carried out, but to get the result you have to convert it into a set of operations carried out one after the other.
The way that the different operations have different priorities makes it difficult to convert an arithmetic expression into a simple sequence of instructions that perform the computation. Returning to our example, 1+8*2/4-3, the sequence of instructions needed to perform this computation is:
multiply 8 by 2
divide the result by 4
add 1 to the result
subtract 3 from the result
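The steps above can be written out as a sequence of statements, one operation at a time. A Python sketch (the variable name is illustrative):

```python
# Evaluate 1 + 8*2/4 - 3 as an explicit sequence of operations,
# in the order dictated by operator priority, not the order written.
result = 8 * 2        # multiply 8 by 2
result = result / 4   # divide the result by 4
result = result + 1   # add 1 to the result
result = result - 3   # subtract 3 from the result
print(result)         # 2.0, the same as evaluating 1+8*2/4-3 directly
```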
The whole process of working out an arithmetic expression is actually about converting the expression into a sequence of operations that are in the correct order and this order is not the order that the operations are written in the arithmetic expression.
A human has to do this job to get the right answer and so does a computer if it is to automate the process. We are so used to the idea that we can give a computer an expression like 1+8*2/4-3 and get an almost instant answer that we tend to not notice that this isn’t a trivial matter of doing the sums. The computer has to be able to convert the expression into a set of instructions that are not necessarily in the same order as the operations in the arithmetic expression – and in general this is a very hard problem.
In the early days of computers the problem was solved by getting humans to do the job. Early programmers were given arithmetic expressions and had to convert them to sequences of instructions that gave the correct answer. They had to take the expression and reduce it to the sort of list of instructions given above. This was time-consuming and error-prone and the obvious solution was to get the computer to do the job, but this proved to be difficult, very difficult.
Some computer languages, the best known being Cobol, avoided the problem by getting the programmer to write all arithmetic as a sequence of operations that was in the correct order in the first place, but this was just as inefficient as hand translation. It worked for most financial calculations, where the arithmetic expressions were simple, such as “multiply price by tax rate to get tax amount”, but for scientific work it was clearly a costly burden.
Eventually programmers worked out how to convert any arithmetic expression into the correct sequence of operations and FORTRAN, the first modern computer language, was born. The name FORTRAN stands for FORmula TRANslation, so you can see that this conversion of expressions, i.e. mathematical formulas, was really at its core. It was created by a team led by John W. Backus at IBM and was an immediate success, leading on to all of the computer languages we use today. FORTRAN was released to the world in 1957 and this seems a very short time ago for something so fundamental to the use of computers.
You may have noticed that we have slipped in an idea that deserves more comment. FORTRAN is a computer language that allows general arithmetic expressions as commands and somehow these are converted from their non-sequential order into a set of sequential instructions for computation. Notice also that it is the computer that does this conversion, which in turn means that it is a program that actually does it. This program is generally called a compiler because it compiles the arithmetic expressions, and everything else in the FORTRAN program, into a list of sequential instructions. The task that a human once performed, i.e. the conversion of arithmetic expressions into instructions, is now done by a computer – a very commonly repeated story.
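The reordering a compiler performs can be sketched with Dijkstra's shunting-yard algorithm. To be clear, this is not how the original FORTRAN compiler worked, but it is a standard modern way of solving the same problem: turning an expression into operations listed in execution order. A minimal Python sketch:

```python
# A minimal expression "compiler": reorder infix tokens into a
# sequence of operations (postfix) and then execute that sequence.
PRIORITY = {'+': 1, '-': 1, '*': 2, '/': 2}

def to_postfix(tokens):
    """Shunting-yard: output the operations in execution order."""
    output, ops = [], []
    for tok in tokens:
        if tok in PRIORITY:
            # An operator of equal or higher priority already waiting
            # must be performed first, so move it to the output.
            while ops and PRIORITY[ops[-1]] >= PRIORITY[tok]:
                output.append(ops.pop())
            ops.append(tok)
        else:
            output.append(tok)  # a number goes straight to the output
    while ops:
        output.append(ops.pop())
    return output

def run(postfix):
    """Execute the sequence with a stack, like a simple machine."""
    stack = []
    for tok in postfix:
        if tok in PRIORITY:
            b, a = stack.pop(), stack.pop()
            stack.append({'+': a + b, '-': a - b,
                          '*': a * b, '/': a / b}[tok])
        else:
            stack.append(float(tok))
    return stack[0]

expr = ['1', '+', '8', '*', '2', '/', '4', '-', '3']
print(to_postfix(expr))        # ['1', '8', '2', '*', '4', '/', '+', '3', '-']
print(run(to_postfix(expr)))   # 2.0
```

Notice that the postfix output is exactly the instruction list from earlier: multiply 8 by 2, divide by 4, add 1, subtract 3.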
Why Are Arithmetic Expressions Like They Are?
You don’t really need to know why arithmetic expressions use priorities to express calculations, but it is interesting. There is a sense in which arithmetic is written as it is because mathematicians just decided that it should be. If it is a human convention that makes things difficult for students and computers alike, why not just change it? The answer is that the reason for writing arithmetic in the way that we do is deeper than just doing calculations. It isn’t entirely arbitrary and it has huge benefits when things become a little more advanced.
To simplify this discussion, let’s get rid of subtraction and division – they don’t really exist. Subtraction is just the addition of a negative quantity: 2-1 is the same as 2+(-1). Division is just multiplication by an inverse: 2/3 is the same as 2*(1/3).
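You can check both identities mechanically. Using Python's exact rational arithmetic so that no floating-point rounding clouds the point:

```python
from fractions import Fraction

# Subtraction is addition of a negative quantity.
assert Fraction(2) - Fraction(1) == Fraction(2) + Fraction(-1)

# Division is multiplication by an inverse.
assert Fraction(2) / Fraction(3) == Fraction(2) * (Fraction(1) / Fraction(3))

print("both identities hold exactly")
```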
Now consider an expression like:
3*2+3*4 = 6 + 12 = 18
this gives the same result as:
3*(2+4) = 3*6 = 18
We have noticed that we are multiplying both the 2 and the 4 by 3 and we have opted to do the addition first and save ourselves a multiplication by “pulling the multiply out of the bracket”.
That a*(b+c) is the same as a*b+a*c is called the distributive law of multiplication and it is the reason we prefer to think of multiplication as having a higher priority than addition. If this wasn’t the case and we evaluated 3*2+3*4 strictly left-to-right:
3*2+3*4 = 6 + 3 * 4 = 9 * 4 = 36
then the distributive law wouldn’t be true.
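To see this concretely, here is a small Python sketch of an evaluator that ignores priorities and works strictly left to right, showing that it breaks the distributive law:

```python
# Evaluate a tokenized expression strictly left to right,
# ignoring operator priorities entirely.
def left_to_right(tokens):
    ops = {'+': lambda a, b: a + b, '*': lambda a, b: a * b}
    result = tokens[0]
    for i in range(1, len(tokens), 2):
        result = ops[tokens[i]](result, tokens[i + 1])
    return result

# With the usual priorities the distributive law holds:
# 3*2 + 3*4 = 6 + 12 = 18 = 3*(2+4)
assert 3 * 2 + 3 * 4 == 3 * (2 + 4) == 18

# Strictly left to right the same tokens give ((3*2)+3)*4 = 36,
# so a*b + a*c no longer equals a*(b+c).
print(left_to_right([3, '*', 2, '+', 3, '*', 4]))  # 36
```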
To make it true in this left-to-right world you would have to write:
a*(b+c) = a*b + (a*c)
which isn’t as pretty. More to the point, it isn’t a good way to think about it and if arithmetic expressions worked in this way mathematicians would find simplifying expressions and doing algebra much less natural. So millions of students have suffered BODMAS simply to make algebra easier for mathematicians.
In other words, if you want to think of:
a*b + a*c = a*(b+c)
as “pulling a out into a bracket” then you need to make multiplication a higher priority operation than addition.