|The Magic Number Seven And The Art Of Programming|
|Written by Sue Gee|
|Thursday, 19 March 2020|
The Seven Of Interfaces
The magic number seven has influenced programming methods, but it has had less influence on the user interface, where it is just as important.
If you have ever watched a TV program or movie that uses the multi-shot technique - multiple small panels showing different action at the same time - you might have admired it, but why aren't all programs made like this?
The answer is that it takes a lot of processing power to cope with a multi-screen bombardment - and sadly the vast majority of people just can't cope. We run out of bandwidth well before the theoretical seven strands of narrative.
No, the average viewer likes a nice, simple, single-thread storyline - with one image following another, preferably in correct time sequence.
Anything else is for novelty value or something really creative.
If you agree with this then think for a moment what this means for interfaces based on hypertext, multi-media and GUIs in general.
A typical web page is crammed with links and is a clear example of where the human processing channel is threatened with imminent swamping by excess information.
In the case of followable links the swamping is a drawn-out sort of affair as the user follows multiple threads of information and forgets what the original enquiry was all about.
In the case of multi-media the swamping can be immediate - more like a tidal wave of information. Watching on-screen video while trying to read text and listen to the audio track may sound promising, but it can send the innocent who actually try to cope with all the information into terminal overload.
Then there is the question of the rich GUI interfaces we can create so easily using any of the many technologies that give us far more than buttons. We tend to create interfaces that have the "gee whizz" effect and we can't help but be impressed by the clever implementations using any of the latest techniques - overload is now the norm.
In fact, as programmers we suffer from being too close to the interface. Our unit of comprehension is too large, and we manage to cope because being immersed in the system lets us understand it in much larger groupings.
There are two ways in which acknowledging the human information processing limit can help us produce better interfaces.
The first is to recognise that, no matter how complicated the display, the focus of attention will rarely manage to cope with more than one or two things at any moment.
In the same way you shouldn't nest control structures too deeply in a function, you shouldn't nest dialog boxes, menus or windows too deeply. It is better to think in terms of a single focus of attention within your application. If the user opens multiple applications and so splits the focus, well that's up to them - perhaps they can cope.
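The nesting analogy can be sketched in code. The example below is purely illustrative - `save_file`, its parameters and the `STORE` dictionary are invented for the purpose - but it shows how a deeply nested function forces the reader to hold every enclosing condition in mind at once, while a flat guard-clause version keeps only one condition in the focus of attention at a time:

```python
# Illustrative only: the same hypothetical "save" operation written
# two ways. STORE stands in for a real file system.
STORE = {}

def save_file_nested(path, data, overwrite=False):
    # Nested style: to understand the innermost line the reader must
    # track three enclosing conditions simultaneously.
    if path:
        if data is not None:
            if overwrite or path not in STORE:
                STORE[path] = data
                return True
            else:
                return False
        else:
            return False
    else:
        return False

def save_file_flat(path, data, overwrite=False):
    # Guard-clause style: each early return disposes of one case,
    # so only one condition is "in focus" at any moment.
    if not path:
        return False
    if data is None:
        return False
    if not overwrite and path in STORE:
        return False
    STORE[path] = data
    return True
```

Both functions behave identically; the flat version simply respects the reader's limited channel capacity, just as a shallow menu structure respects the user's.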
The rule of seven is also an argument for skeuomorphism. Making things in an interface look like things in the real world reduces the novelty of the interface and so makes it easier for the innocent user to take in and process. Notice that much of this depends on the user being "innocent" and, to a great extent, "fresh". If you are familiar with a website or a UI then you see it as simple no matter how complex it really is. There is also a huge payoff in copying other people's designs so as to get the familiarity factor for free.
Overload And Bloat
If you want to see an example of a program that breaks these rules then look no further than the standard Windows file manager - File Explorer. It has at least three menu structures on show - the two horizontal menus and the task panel to the left. Throw in the right-click context menu and any number of optional extras and you begin to realise that the user is overloaded. Which menu to use for any particular task is a mystery.
Most of the existing Windows interface is like this. It has accreted alternative ways of doing things as we have succumbed to the need to add the latest interface widget, or to improve by adding rather than by making things better. When you throw in the need for backward compatibility you can appreciate that interfaces naturally grow from simplicity to overload.
You can apply the same accretion-of-complexity argument to just about everything in programming. It happens in language design.
It happens in application programming as versions gain new features and still retain the old. It happens in operating systems as they struggle to remain compelling at each upgrade while striving to still run the old stuff.
At every stage software adds complexity and outgrows the human channel capacity in every possible way.
This is the reason why, as programmers, we eventually get an overwhelming desire to throw away what we have and start over.
The idea of a human as an information channel may have gone out of fashion in psychology but it is the only relevant model for software design.
|Last Updated ( Thursday, 19 March 2020 )|