Suddenly, governments in the US and the UK, among others, seem to have woken up to the fact that if they are to stay ahead in the modern world then they should teach children how to program.
Personally, I can't understand how programming ever got dropped from the curriculum and my reaction is one of
"Why has this realization taken so long?"
Back in the 1980s it seemed obvious that to stay ahead you had to learn how to program the microcomputers of the day, and there were many educational initiatives that included teaching programming in one form or another.
Then programming computers got squeezed out by lesser skills such as using word processors and spreadsheets. It was assumed that, just as with a car, most of us would end up driving the machine without understanding its internals.
So teaching programming died as an educational objective.
It didn't much matter because there were enough enthusiastic self-taught programmers who emerged from the 80s personal computer revolution to make up for the general lack of these skills and we could cruise along on their expertise and hard work.
It also seemed that society could rely on the fact that the "right" people gravitated towards programming of their own accord. They took to it as if it was second nature. Why bother teaching something that enough people will teach themselves? It also saved having to find teachers skilled enough to teach something that could be difficult.
Over time, however, computers changed and the motivation to learn programming touched fewer and fewer people. The reason was simply that computer languages became harder and it was an effort to achieve something that matched what you could buy as commercial software.
Put simply, back in the early days you could put together a few tens of lines of Basic and create something to impress your friends; a little later it took hundreds and then thousands of lines to create a 3D action game.
The entry level to programming slowly climbed higher and the initial rewards decreased. Self-starting became less common.
So over the years we created computer users rather than innovators.
What is more, not knowing anything about programming or the way computers work left those users helpless. If anything went wrong they simply froze along with their machine.
Now the powers that be seem to want to change things. They seem willing to believe that knowing how the basis of modern life works might make people more productive. It might just boost the economy enough to pay off the debts that seem to be causing problems everywhere.
At the same time as all this is happening we have another strange phenomenon - Raspberry Pi. This is a small ARM/GNU Linux box for $25 and it is supposed to bring programming back to the masses.
First let me say that I think that Raspberry Pi is wonderful and I want one, as I'm sure you do, but it isn't going to bring programming back to the masses.
If you read any review of this wonder you will find that all that is talked about is the hardware. Somewhere towards the end you might see a reference to what software it might run, but never more than a short paragraph.
The point is that we don't really need more hardware to get kids interested in computers - we need the right software. Raspberry Pi is yummy but it isn't essential.
You can see classrooms full of computers in most schools. Homes have multiple machines. Portables, laptops and tablets are everywhere, and if you take the population of mobile phones into account then you can see that a lack of hardware isn't really the problem.
A piece of hardware like Raspberry Pi is simply yet another way to run the educational software we appear not to have just yet.
If you are going to bring small computers into the classroom then use a PC to generate programs for an Arduino so that electronics and control can be part of the activity. Arduino is a small machine almost as cheap as Raspberry Pi and it really does things that a standard PC doesn't do.
Moving on from the mystery of Raspberry Pi, you have to ask why it is that the computer rooms aren't being used to teach programming already - and I know that in some instances they are. The big problem is working out what to teach. Ask any three programmers what language to use to introduce programming and you get at least four answers.
Modern programming languages have a steep learning curve because they are object-oriented.
When you are first learning to program you have to master algorithmic thinking and learn how to put together loops and conditionals to create a solution. You really don't want to be bothered with bigger ideas like how to organize your code using objects. The problem is that most modern languages are almost aggressively object-oriented. You can't write a program unless you create an object with methods and properties.
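As a rough sketch of the ceremony involved, here is what a beginner's very first exercise, adding the numbers 1 to 5, looks like in Java (one typical example of such a language). The class name is arbitrary; the point is how much scaffolding surrounds the three lines that actually do the work:

```java
// Even a trivial "add the numbers 1 to 5" program cannot be written
// as a bare loop: Java insists on a class and a static main method.
public class Sum {
    public static void main(String[] args) {
        int total = 0;
        for (int i = 1; i <= 5; i++) {  // the loop the beginner is trying to learn
            total += i;
        }
        System.out.println(total);      // prints 15
    }
}
```

A learner has to type, and be told to ignore, `public static void main(String[] args)` long before classes and methods can be explained.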
Then there are the graphical languages, like Scratch, that work by dragging and dropping blocks to build up a program. They seem like a great way to get started, but you still have to make the transition to a code-based language, which can be tough.
A different way to look at the problem is to consider application domains rather than initial teaching languages.
There are two exciting application domains that seem to have the same feel as the original microcomputer programming environment of the 1980s.
The first is the web. The second is the mobile phone, which in its full form is a tough area for a beginner, but there is a graphical programming approach to Android in the form of App Inventor.
Instead of setting out to teach a language perhaps it would be better to get students hooked by getting them to create web or phone apps. In both fields you can still create something fairly impressive with a reasonable amount of effort.
Finally, there is the question of whether all children should be taught to program.
I have heard it said that this is a bad idea because it forces children who don't have an aptitude into the subject with a huge loss of time and self esteem. What you end up with is a lot of effort to "turn out a failed programmer".
This isn't quite the situation.
Programming isn't just applied technology; it is a way of thinking. What makes programmers different is that they have mastered algorithmic thought, and this is worth an attempt even for those who aren't going to become ace programmers in the future.
Being able to think algorithmically helps with planning, math and problem solving in general. It is a good skill to have and a lot more valuable than "turning out a failed programmer" might suggest.
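To make "algorithmic thought" concrete, consider the plan a child discovers in a number-guessing game: guess the middle of the range and halve it each time. Written out as a small Java sketch (the secret number 73 is an arbitrary example), the plan becomes a loop:

```java
// The guessing-game plan a person works out for themselves,
// expressed as an explicit algorithm: repeatedly halve the range.
public class Guess {
    public static void main(String[] args) {
        int secret = 73;                    // the number to find (arbitrary example)
        int low = 1, high = 100, guesses = 0;
        while (low <= high) {
            int mid = (low + high) / 2;     // always guess the middle
            guesses++;
            if (mid == secret) break;       // found it
            if (mid < secret) low = mid + 1;   // too low: discard the bottom half
            else high = mid - 1;               // too high: discard the top half
        }
        System.out.println("Found " + secret + " in " + guesses + " guesses");
        // prints: Found 73 in 6 guesses
    }
}
```

Turning an everyday strategy into precise, checkable steps like this is the transferable skill, whatever language it ends up written in.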
Yes, it is time to place programming in the modern school curriculum, but to do this we don't need more hardware; we need a clear idea of how it should be taught. On the other hand, if Raspberry Pi generates the same enthusiasm as the microcomputers of the 80s then it will have served a useful role as a catalyst.