The following account is based on personal experience, and you should feel free to disagree with it. I can only hope that doing so sharpens your own view of this strange and amazing thing we do with symbols that is called "programming".
I have taught a lot of people to program from scratch.
Before, they had no idea what a program was; after, they could at least create a simple one. The range of abilities, and the ease with which students acquired the skill, never failed to amaze me.
Some took to it so fast that it was like watching something being pushed over the edge of a steep slope. They tumbled so quickly from non-programmer to programmer that the learning experience verged on the chaotic, with them impatient to get ahead and sometimes missing something important in the process - so looking not quite as smart as they thought they were!
No problem, however - they would clearly master the whole subject even if left on their own.
At the other end of the spectrum there were students who struggled to keep up. They understood the ideas and asked intelligent questions, but when asked to write a program they seemed to stall.
It was almost as if some bit of their brain wasn't connected to the bit that did the programming. Over a much longer period the connection was usually made, but it was a far slower process.
So what is it that makes the difference?
Of course I can't be certain that I know because to be sure it would take quite a few psychology experiments and measurements and with all that fun programming to be done who has the time?
But I am fairly sure what the differences are, and there is some research to back up my scientifically unfounded ideas.
It has long been known that the most important trait for predicting how good someone will be at programming is mathematical ability. If you are good at math then you are probably going to be good at programming.
I've also long held the view that if you are good at programming you probably should be good at math, and if you aren't then you have almost certainly confused math with arithmetic - they aren't the same thing at all.
Notice that I'm not saying that you need math to be a programmer, or that programming is somehow a branch of math. What I am saying is that whatever part of thinking is used in programming is very similar to what is applied in math, and vice versa. The same is true of some other branches of science, but it seems to be related to how mathematical they are.
So what else makes a programmer a programmer?
First off, you need a good memory, because you have to keep track of variables and the current state of everything that influences the area of the code you are working in. Think about it for a moment. You pick up some code and look at, say, three or four lines that do something.
To make sense of it you not only have to follow the local text but have a clear idea how it relates to the rest of the code. In other words you may be focused on a few lines of contiguous code but to work on it you have to keep the rest of the code in your head in some form or other.
This isn't a simple memory task because it is about understanding the code in the context of the rest of the program - but a good memory helps with making connections between the local text and the rest.
So a good memory helps, but the memory is of a special sort - symbolic and sequential - and the reason that math is a good indicator of programming ability is exactly the same reason: it's about the ability to manipulate symbols.
To be good at math you have to have an ability to construct a serial argument often using abstract symbols. You have to be able to start with some facts and reason one clear step at a time to the result. There may be many steps but a good mathematician will not suddenly make a jump that hides some of the steps.
A matter of time
Programming is the same only in this case it is slightly more sophisticated.
To be good at programming you have to acquire an algorithmic thought process. This is like a mathematical proof in that it is a serial argument from start to finish but now the steps include conditional branches and repetition.
Yes, it's all down to the "ifs" and the "loops".
Algorithmic thought is arguably a specialisation of logical thought to processes in time.
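A small illustration of what this means in practice (the example is my own, not from any particular course): Euclid's gcd turns a mathematical fact into a process in time, built from nothing more than a loop and a condition.

```python
def gcd(a, b):
    """Euclid's algorithm: a serial argument unfolding one step at a time."""
    while b != 0:          # repetition - the "loop"
        a, b = b, a % b    # each step follows causally from the previous one
    return a

print(gcd(48, 18))  # 6
```

Every intermediate state here is determined by the state before it - exactly the kind of step-by-step causal chain the text is describing.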
Notice that this means that programming is about constructing causal sequences and this puts into question approaches such as pure functional programming. If you take the time out of programming then what you have is mathematical thought not algorithmic thought. Of course, this is exactly the advantage claimed for functional programming as opposed to procedural programming.
It is embodied in the two different meanings given to a=0 by the functional and procedural approaches respectively.
In the purest of functional approaches it is a statement that a equals zero - always has and always will.
In the procedural approach it is a sequential process - take a and store zero in it, so changing the value it had before.
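A rough sketch of the two readings, using Python - which is procedural, so the functional reading can only be shown as a convention, not enforced by the language:

```python
# Procedural reading: a names a storage cell whose contents change over time.
a = 5
a = 0      # the cell that held 5 now holds 0 - a sequential step

# Functional reading (by convention only, since Python cannot enforce it):
# treat b as a timeless definition - b equals zero, always has, always will,
# and it is never rebound anywhere else in the program.
b = 0
```

In a genuinely pure functional language, writing a second definition of `b` would be an error, not an update.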
The difference is profound.
If you are of the opinion, as I am, that algorithmic thought has a lot going for it then perhaps eliminating it isn't the best way to go! We have problems creating good procedural code, but that doesn't mean we should throw the baby out with the bath water.
Programming is about describing sequences of events and this is what makes it a powerful idea.
Algorithmic thought isn't the sole preserve of programming. It occurs in other subjects - engineering, chemistry, law even - but nowhere does it gain such a clear and sophisticated expression. After all programming is about the most general algorithmic thought that includes not only logical connections but explicit repetition. It is not only deductive, as in A implies B, but constructive, as in A produces B. You can think of it as logic but with time built into it.
We work with sequential steps in which the logic flows from one moment to the next. This is of course the reason why many of us find parallel processing, asynchronous code, recursion and continuation difficult to swallow - they all distort the role of time in the algorithm.
To a logician the statement:

a = NOT a

is clearly a paradox. To a programmer it is a process that produces a flip-flop behaviour, from true to false and back to true again, and so on...
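The two readings can be seen side by side in a few lines of Python (my sketch, not anything from the original):

```python
# As a timeless equation, "a = not a" is a paradox: no value equals its
# own negation. As a process in time, it is just a toggle.
a = True
history = []
for _ in range(4):
    a = not a           # each assignment happens at a moment in time
    history.append(a)

print(history)  # [False, True, False, True] - a flip-flop
```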
If you take a non-programmer and give them a task and ask them to describe how the task might be completed the result is often a mess. They leave steps out, make vague statements about how one step follows on from the next and generally the result is what you might call open to a lot of interpretation. Give the same task to a programmer and the connection between the start and the end should slot together like a child's building bricks or perhaps a model railroad track.
To program you not only need to be able to get to the solution but to describe in exact detail how you get there.
So a good memory and algorithmic thought - anything else needed to make a good programmer?
If you consider some of the basic tests used to pick candidates for programming jobs - bracket matching, for example: is a set of brackets such as (()(())) correctly nested? - then you have to start to think that an ability to focus or concentrate is a good thing.
Indeed it is our folklore that programmers are isolated types who focus on their programs to the exclusion of all else.
However, if you think about such tasks then they too seem to be related to an ability with algorithmic thought. If you can see what the "micro program" is to check the brackets, i.e. run a stack algorithm that pairs up the opening and closing brackets, then it can be applied without much concentration or focus compared to a naive method that attempts to understand the whole structure.
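One way the "micro program" might look - a minimal stack-based sketch in Python, with the generalisation to three bracket types as my own embellishment:

```python
def brackets_balanced(s):
    """Stack algorithm: push each opener, pop and match on each closer."""
    pairs = {')': '(', ']': '[', '}': '{'}
    stack = []
    for ch in s:
        if ch in '([{':
            stack.append(ch)                     # remember the opener
        elif ch in pairs:
            if not stack or stack.pop() != pairs[ch]:
                return False                     # closer with no matching opener
    return not stack                             # leftover openers also fail

print(brackets_balanced('(()(()))'))  # True
print(brackets_balanced('([)]'))      # False
```

Once you see the problem this way there is nothing to "understand" about the nesting - the stack does the remembering for you.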
Of course, as you practice the craft of programming you pick up algorithms that you have used and tend to see new problems in the light of the old. This is quite natural and it makes an experienced programmer faster at getting to an algorithmic solution than a beginner.
On the other hand, a lack of that accumulated experience occasionally allows the beginner to see a better way than the obvious and accepted solution. The point is that, while the two main skills are in common, there is a practice effect.
Algorithmic thought is special and if you have had the privilege to master it even just a little then value it.
Above all, don't think that programming isn't an intellectual activity - it is and not everyone has the abilities needed to achieve it easily.