It looks the same, but what a world of difference. From this one distinction comes the whole problem we have with typing - data typing, not the keyboard kind. When is an image of the data an instance of that data? When you read text you are viewing an image, but when you edit text you are working with something else. So don't try to patch Linux with Photoshop's patch tool and don't edit your photos with GNU Patch.
More cartoon fun at xkcd, a webcomic of romance, sarcasm, math, and language.
Is It Worth the Time?
This is where we all go wrong. What programming project, no matter how simple, takes 8 weeks or less? How many programs save as much as an hour a day or one day a week? Clearly programs always use up more time than they save and are inherently pointless exercises. I'm off now to sell my keyboard and take up yak shaving.
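The chart's arithmetic is easy to reproduce. Here is a minimal sketch of the break-even calculation; the five-year horizon is the cartoon's assumption and the example task (50 times a day, 5 seconds shaved) is made up for illustration:

```python
# How long can you spend automating a task before you spend more
# time than you will ever save? The xkcd chart assumes a
# five-year horizon.
def break_even_hours(times_per_day, seconds_shaved, years=5):
    """Total time saved (in hours) over the horizon - the most
    you can justify spending on the automation."""
    total_seconds = times_per_day * seconds_shaved * 365 * years
    return total_seconds / 3600

# Shaving 5 seconds off a task done 50 times a day buys you a
# budget of roughly 127 hours over five years.
budget = break_even_hours(50, 5)
```

Of course, as the cartoon itself warns, the time spent *estimating* the budget is not included in the budget.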
Computers may acquire artificial intelligence but we humans don't always have the natural equivalent. You've got to feel sorry for those AIs when they meet the stubborn "black is white" belief system that really does characterize the human condition.
If you don't know what the axiom of choice is all about then read: Axiom Of Choice - The Programmer's Guide. More important is the fact we might have missed a good idea. Programming by intimidation - why don't we just take bugs out and execute them as an example to the others? Or why stop there let's execute the bad programs - oh wait that is what we do!
We live in an age when deprecation, non-backward compatibility and breaking changes are the norm, when once they were exceptions. I personally blame semantic versioning.
And again xkcd finds another reason to learn to program. How else are you supposed to break everything? I'm sorry but I'm busy compiling a kernel at the moment.
When I look at this xkcd what goes through my mind is
"my current project is to the right of the Excel spreadsheet"
I know I'm not alone.
This xkcd used to be funny, but that was in the days before Bitcoin hit the headlines. It was recently reported that Bitcoin mining could use more electricity than Denmark by 2020 - now that is funny, but not in a good way.
Why haven't more "programmerisms" made it into the vernacular? We say there's a bug even if there isn't a program involved. Why not say "commented out" to indicate that something has no effect? We really do need to teach everyone to program, if only for the cultural value and the effect on language.
Before you laugh - how long did you claim the program you are working on would take to finish? Were you right? Even close? Next time you are asked just say - a program is never finished. Of course they will then ask how long for a minimum viable product...
If we don't teach everyone, and I mean everyone, to program how are they going to recognize recursion when they see it in an xkcd?
Another good reason to teach everyone, and I mean everyone, to program - conditionals. We use conditionals all the time in everyday life but how many of us actually understand exactly what we are saying?
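The difference shows up as soon as you write one down. An illustrative sketch (the umbrella example is made up): everyday "if" usually leaves the else branch unstated, while code forces you to decide what happens when the condition is false.

```python
# "If it rains, I'll take an umbrella" says nothing about what
# happens on a sunny day. A program's conditional makes both
# branches explicit.
def umbrella_plan(raining):
    if raining:
        return "take umbrella"
    else:
        return "your call"  # everyday speech leaves this branch open
```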
This is the real reason we need to teach programming to everyone. Algorithmic thinking brings new ways of looking at the world and deeply influences how you behave - but not necessarily in a good way...
Let's be honest, there is a part of every programmer that wants this to be true. The idea that our creation might be something more than a neat UI is a dangerous romantic myth straight out of the Frankenstein story.
This golden oldie from xkcd suddenly became relevant again as the news broke that a neural network beat a human expert at Go - see Google's AI Beats Human Professional Player At Go. What is more interesting is to check up on what Mao, Seven Minutes in Heaven and Calvinball are. It seems that games that involve social interaction are all we have left to ourselves...
Once, not so long ago, programming was about learning a language and some algorithms and then getting on with it. Today we have to negotiate the "stack" and not just one. Technology stacks have grown to become a big problem. Picking one is tough, learning it is tougher and having to give it up for the next fashionable stack is even tougher.
If this joke escapes you (pun intended) then the chances are that you are not a programmer but you could be a backslash...
In Case of Emergency
A computer is also a machine for making work. Thank goodness.
All Adobe Updates
And I thought it was just me who got worried when a package manager needed to be updated. I guess it all goes back to the barber paradox - who shaves the barber?
This is no joke! Can you remember the days when you just sat down and wrote a program? No dependencies, no build server, no source control, no make - no tools! Innocent days before the recursion set in.
Mathematicians - a breed apart. You can't argue with that, but where do computer scientists fit in, and programmers? Are they the same thing? My guess is that programming is applied computer science, which is applied mathematics.
It used to be worse but not much.
I once knew a programmer who used compile time to learn foreign languages and I don't mean the computer kind. He ended up fluent in so many he lost count.
A wish for 2016 - the death of compile time.
Cartoon - Why Compile Time == Play Time?
Watson Medical Algorithm
The horror of AI. It really doesn't matter if it works in a completely rational way; this is how any right-thinking patient, or potential patient, fears it might work.
Xmas is truly the time of the programmer - a tree and a heap, what could be more Christmassy?
The Three Laws of Robotics
It is obvious that programming isn't commutative - it matters what order you write things in - but who would have guessed that science fiction, and perhaps writing in general, isn't either?
Let us hope that this year's Computer Science Education week isn't booby trapped with this sort of flow chart. Everyone knows that only IoT and hardware types have any use at all for an infinite loop. Perhaps we should ban them.
It is 100 years since an ex-Swiss patent clerk invented a theory that changed the way we view the entire universe. They don't come much bigger than that - and, yes it has to be admitted, patent clerks have had a lot to live up to. If only they would get software patents right.
Most programmers are shocked at what they see when listing the source of Google.com. It is such a minimal page who would have guessed that so much code was needed. Turning the tables, who would have guessed that so little code was needed for a human!
In honour of The Computer Science Breakthrough Of The Decade we rerun every complexity theorist's favourite xkcd cartoon. Of course everyone knows that the decision problem - can we find a set of appetizers that gets within 5 cents of $15 - is NP-complete, but the optimization problem, which set of appetizers gets closest to $15, is NP-hard. For more enlightenment see: NP-Complete - Why So Hard?
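NP-complete doesn't mean unsolvable for small menus, only that brute force is all we really have. A sketch of an exhaustive solver, assuming the six menu prices as printed in the cartoon (target $15.05); work in cents to avoid floating-point trouble:

```python
# Prices in cents, as on the cartoon's menu (an assumption -
# check the strip). Target: exactly $15.05.
PRICES = {"mixed fruit": 215, "french fries": 275, "side salad": 335,
          "hot wings": 355, "mozzarella sticks": 420, "sampler plate": 580}

def orders(remaining, items):
    """Enumerate every multiset of appetizers (repeats allowed)
    summing to exactly `remaining` cents, each combination once."""
    if remaining == 0:
        yield []
        return
    if remaining < 0 or not items:
        return
    # Either never order items[0] again, or order one more of it.
    yield from orders(remaining, items[1:])
    for rest in orders(remaining - PRICES[items[0]], items):
        yield [items[0]] + rest

solutions = list(orders(1505, list(PRICES)))
```

Brute force is fine for six items; the NP-hardness bites when the menu, or the bill, grows.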
Hands up everyone who thought of virtual reality while reading this. So what is your "never to be" but "must have" technology and don't say hoverboard or flying car?
And Git isn't alone in this under-use of facilities. Give a man or woman a hammer and they will use it to hammer in nails. Give them a sonic screwdriver instead and a large proportion will still use it to bang in nails - without even switching it on!
Git and GitHub LiveLessons
In these days of terabyte disk drives it hardly seems worth arguing over the difference between 1024 and 1000 bytes - but it does make a difference.
To find out how much see: What's Up With The Kb?
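The gap compounds with every power: at the kilobyte it is 2.4%, but by the terabyte it is nearly 10%. A quick check:

```python
# Decimal (marketing) vs binary (what the OS may report against).
decimal_tb = 10**12      # 1 TB as printed on the drive box
binary_tib = 2**40       # 1 TiB as powers of 1024

# A "1 TB" drive falls about 9% short of a binary terabyte.
shortfall = (binary_tib - decimal_tb) / binary_tib
```

This is why a freshly formatted "1 TB" drive shows up as roughly 931 "GB" in tools that count in powers of 1024.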
One of the real mysteries of the universe is - if Lisp is so impressive why isn't it the language we all use instead of the language we all admire from afar?
... but people do!
There are so many world views of hardware and software that don't correspond with what we see as reality that communication becomes difficult. "Why did the word processor just lose my story?" "This is a nice computer, it never crashes."
Lots of programmers, well a few at least, are dyslexic and don't ask how that works unless you want a long story. October is international Dyslexia Awareness Month and 5th to 11th is Dyslexia Awareness week in the UK. So if you know a dyslexic remember to spell a word or two for them - correctly, no cheating!
Dyslexia and Programming
Dyslexia Awareness Month Kicks Off
The big problem with the Turing Test is not what is inside the box but that what is inside the box knows it is taking part in the Turing Test. This converts what was a perfectly reasonable scientific test into an adversarial contest, more like a trial, where dirty tricks and cheating are perfectly OK. See: The Turing Test
We built this. We built this!
Did we really mean to?
It can't be long now before there is no one left who understands this joke. Segfault? Pointers? If only to protect the feelings of the compiler.
Tech Support Cheat Sheet
Because it's "back to school" in many parts of the world it seems appropriate to remind everyone of the basic flowchart needed to get by. Anyone know how to print out a flowchart? In fact what is a flowchart?
It could just as well be "programmer syllogism". How many times have you thought, or encountered someone who thought that making money on the stock market was just a matter of the right algorithm? Unless of course, wait, yes that's it...
Until the day, that is, that our AI progeny learn how to teach. Teaching us is one thing, but when they move on to teach each other then...
Personally I think that all functional programmers should be restricted to the vocabulary of simple.wikipedia.org - yes I'm looking at you monads, currying, partial evaluation, trampolining, algebraic type systems, Curry-Howard correspondence ...
And the good news is that Randall Munroe, author of xkcd, has a new book - Thing Explainer: Complicated Stuff in Simple Words - which uses only the ten hundred most common words.
I love my Roomba and I don't beat it up when it repeatedly head butts a table full of glasses, not even when it managed to knock one off and break it. I know it doesn't mean it because it makes up for it by cleaning up. Now I'm conflicted - do I release it into the wild? The hover over text suggests that I have to, but I really don't want to.
The question, sorry cartoon, for this week is to work out if this really is recursion or is it simple iteration? If you find this too easy what about the hover over text? Extra credit for explaining Ozymandias and the connection to the first programmer and the first monster. Join in the debate here.
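Part of the puzzle is that recursion and iteration can compute exactly the same thing, so the form alone doesn't settle it. An illustrative countdown written both ways:

```python
# The same computation written both ways: a simple tail-style
# recursion can always be rewritten as a loop, and vice versa.
def count_down_recursive(n):
    if n == 0:
        return ["lift off"]
    return [n] + count_down_recursive(n - 1)

def count_down_iterative(n):
    out = []
    while n > 0:
        out.append(n)
        n -= 1
    out.append("lift off")
    return out
```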
Can there be a bigger waste of time than trying to get some missing Linux facility to work? It starts out so easy, with a distro on standard hardware and a package manager, but it ends in a time-sapping session with gcc and make - and there are all those dependencies to get right... No, seriously, there should be a health warning on the box.
This classic xkcd cartoon is another celebration of Donald Knuth's work - see Donald Knuth & The Art of Computer Programming. So why do we count from zero? And is it ever good to count from one? I can think of zero reasons for it...
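One practical answer to "why zero?": an index is an offset. Element i of an array starts at base + i * element_size, so the first element sits at offset zero. A small illustration (addresses and sizes are made-up numbers):

```python
# An index is an offset from the start of the array: the first
# element needs no offset at all, hence index 0.
def element_offset(base_address, index, element_size):
    return base_address + index * element_size
```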
We have all been there - the deep water with the sharks. It doesn't matter what the system is. The easy change that you didn't really have to make usually goes horribly wrong. What I don't understand is why there is always a moment when you suddenly realize the original system could never have worked in the first place...
Travelling Salesman Problem
Sometimes theoretical results just don't count in the real world. Remember when someone pointed out that route planning was NP hard? So no need to even try to create a satnav then...
Recursion. We live with it but there is still something extra fascinating about physical recursion. Look between parallel mirrors, point a video camera at a screen and, of course, organize a conference about organizing a conference. In this xkcd cartoon we see what happens when you slip a negation into the recursive loop.
This week's xkcd classic points out that virtual reality, reality reality - it's all the same really. A construct of the computational processes that go on inside our heads. You gotta admit it's a great excuse!
It all goes to prove that type conversions are in the eye of the beholder. Some of these seem entirely reasonable to me - but I'm not saying which ones!
It's fun but it's more like a detector for AI experts. The non-AI expert laughs and then worries about the possible coming robot uprising. The AI expert laughs...
Programmers often have "mechanical sympathy" - well, as long as the mechanism is code. In general humans are kind to machines, mostly, and don't mind lending them their feelings and intelligence. We have to hope that in the near future machines learn to do the same.
Why has no one created an app for this? Or perhaps they have and I just haven't sat next to the person who knows about it...
Exploits of a Mom
This is a classic xkcd and it is featured here just to make sure you know it. And have we learned to sanitize our database inputs?
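For anyone who hasn't: the fix is not to splice user input into SQL text at all, but to pass it as a parameter so the database treats it purely as data. A minimal sketch using Python's built-in sqlite3:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE students (name TEXT)")

# Little Bobby Tables himself.
name = "Robert'); DROP TABLE students;--"

# Unsafe (don't do this): f"INSERT INTO students VALUES ('{name}')"
# would let the quote and semicolon rewrite the statement.

# Safe: the ? placeholder keeps the input as data, never as SQL.
conn.execute("INSERT INTO students (name) VALUES (?)", (name,))
rows = conn.execute("SELECT name FROM students").fetchall()
```

The table survives, and Bobby is enrolled with his full, absurd name intact.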
My guess is that you can think of more reasons why average star ratings are bad but spare a thought for their use with system critical apps. A single valid negative may be the only rating you need to see. And the response "could not reproduce" isn't really a defence, is it?
There are many jokes that claim to be "programmer" jokes but this is the only one I know that guarantees you won't be amused if you are a non-programmer. So remember, you escape "handcuffs" with backslashes - as always.
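For the non-programmers still reading: the backslash tells the parser that the next character is literal data, not a delimiter. Two lines of Python make the point:

```python
# An escaped quote is part of the string, not the end of it.
s = "She said \"run\""

# And a literal backslash has to be escaped by itself:
# two characters in the source, one in the string.
backslash = "\\"
```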
Well, the GOTO has to be considered harmful, but did Dijkstra really have a velociraptor in mind when he made his comment? Can it really be that some of us still don't understand what we are trying to do?
See: The Goto, Spaghetti and the Velociraptor
We were all beginners once, but we also all, well nearly all, went through that dangerous time when we thought we had learned to program and there was nothing, nothing at all, left to learn.
How wrong we can be and how sure we are right!
If you don't get this joke then it is likely that you don't call Linux GNU/Linux and have no idea what the Hurd kernel is. If you do then you will realize that 2060 is a hopelessly optimistic date for the completion of GNU/Hurd.
See: GNU Manifesto Published Thirty Years Ago
Ah, the perils of big data or data science or whatever statistics is called now. What always depressed me was that it was the "null" hypothesis. I was always cheering on the alternative hypothesis - well it has to be good if it's "alternative", right?
If you know what pointers are and can read the list of numbers then you are probably a C/C++ programmer. A word of advice - don't use "pointers" in your sense in natural or programming languages.
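Python proper has no pointers, but the standard library's ctypes module gives a C-style demonstration of the idea for the curious: a pointer holds an address, and writing through it changes the thing it points at.

```python
import ctypes

# p holds the address of x; assigning through p mutates x itself.
x = ctypes.c_int(42)
p = ctypes.pointer(x)
p.contents.value = 99
```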
1 to 10
One day when we have a truly high level language, or perhaps lots of them, programmers will not remember what binary is and this will not set us apart any more. Something else will - but not binary.
This week's xkcd cartoon will probably irritate every programmer. The idea that there is a bug in the code is something that bores into your brain and finding a non-programming fix is just not satisfying. Even if you accept that the timer reboot is a quick fix I bet you would start thinking up a shell script to do the same thing without the hardware.
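That software version of the timer reboot is easy to sketch. A minimal, illustrative stand-in (the command and interval are placeholders, and a real deployment would want logging and back-off):

```python
import subprocess
import time

# A software watchdog: run the job, then kill and restart it on a
# fixed schedule - the hardware timer reboot, minus the hardware.
def run_with_restarts(cmd, interval_seconds, restarts):
    for _ in range(restarts):
        proc = subprocess.Popen(cmd)
        time.sleep(interval_seconds)
        proc.terminate()
        proc.wait()
```

Which, of course, still doesn't fix the bug - it just gives it a schedule.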
This week's xkcd cartoon states something very obvious - programmers are different. What doesn't ring true is that a non-programmer would have figured it out.
This week's xkcd cartoon reminds us of a time when the problem was clear and we fixed it - or did we? Even if we did, it is a well known law that commerce abhors a vacuum.
This week's xkcd cartoon mixes the abstract flowchart with real world things. If only we could figure out how to do this... oh wait, we have, it's called a computer.
With Apologies to Robert Frost
This week's xkcd cartoon reveals that programming really is behind everything and in this case we do mean everything, life, the universe.
This week's xkcd cartoon makes fun of our tendency to make simple things seem complicated. Making up complicated jargon obfuscates a simple protocol; it makes what we do seem more impressive, but also harder. In case you are wondering - yes, there are 86,400 seconds in a day without a leap second.
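The "simple protocol" behind the jargon really is just multiplication:

```python
# 24 hours x 60 minutes x 60 seconds - leap seconds aside.
seconds_per_day = 24 * 60 * 60
```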
This week's xkcd cartoon shows the real nature of computing. To the uninitiated, i.e. most people, it looks like magic, even if we know it really isn't. It isn't. No really, it isn't...
This week's xkcd cartoon is a frightening portent of quantum computing to come. Perhaps the uncertainty principle really is at the core of computing and not just an excuse for knowing the cause of a bug, but not its location.
This week's xkcd cartoon reminds us that we might be looking in the wrong place. Data from Kepler now suggests that there might be as many as 40 billion earth like planets in our galaxy alone. So once again - where is everyone?
This week's xkcd cartoon is the reason everyone should learn to program - even just a little bit. Without it the complex plains of the computer savanna become a hunting ground for superstition and ways of working that have no basis in reality, our reality at least.
Learning to Cook
This week's xkcd cartoon makes it clear that being a programmer makes it worse when you fail at anything. Not only do you fail, but the chances are that you have an algorithmic explanation of the failure.
This week's xkcd cartoon illustrates the one great characteristic of any programmer. Never solve the problem at hand. Always solve the general set of problems of the same type with the help of a good algorithm.
This week's xkcd cartoon points out an unsolved problem - users and file systems. If you are a programmer then a hierarchical file system should be as natural as recursion, but... for users? Well, they never seem to know where their files are. This is the reason mobiles don't have user-oriented file systems, and we all know how that works out...
This week's xkcd cartoon may sound like it's about something weird, but doesn't it sound familiar somehow? Ah, the internet, allowing us to get hot under the collar about nothing much...
Inside Random Numbers
CD Tray Fight
Move Fast and Break Things
Candy Button Paper