When working with two screens I don't sit them evenly on the desk. I place one straight in front of me and the second monitor to the side. The main monitor is for whatever your main activity is and the second for what you are observing - usually Visual Studio, or another IDE, on the main and the running program on the second.
Sometimes you have three classifiable activities on the go - debugger, source code and app - and yes, in this case three monitors are best. Once again you don't use them evenly: the middle monitor is where your concentration is, while you view the secondary activities on the secondary monitors. It works naturally.
To be honest I tend to prefer three monitors even when I only have two proper tasks because then the third monitor shows my email and various status screens.
Of course you can go in for swapping windows from the background on any of the monitors but again it works best if you can categorize the windows on each monitor.
If you talk to general users about multiple monitor setups then take what they say with care, because it really does depend on the number of categories of sub-task your main task decomposes into. If this number matches, or is close to, the number of monitors you have then it's simple. If the number is much lower, e.g. one, or much higher, e.g. freely browsing the web, then multiple monitors aren't going to make much difference to your efficiency or your experience.
The magic happens when you need to look and work with n types of window and you have n monitors.
Put like this it doesn't seem like rocket science!
In addition you can run virtual machines and place each desktop on its own monitor. Similarly, if you use multiple real machines you can put one remote desktop session on each monitor, with the advantage that you can copy-and-paste between them and generally make them work together.
Yes, you could save a few dollars and use multiple virtual desktops with the same sort of organisational scheme, but the switching between them takes the material you are attending to out of your vision. Keeping it in your vision, even if it's only your peripheral vision, keeps it fresh in your mind.
Multi-monitor in practice
So you want to give it a go. How?
If you look at the back of your current machine and it has a DVI socket and an HDMI (or VGA if it's a bit older) then it probably supports two monitors already. More generally, the same is true if there are two video connectors of any sort on the back of your PC. You simply need an additional cable of the correct sort and, of course, an additional monitor.
In principle when you plug in the additional monitor it will be recognised and you will have the option of configuring it to provide an extended desktop.
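If you prefer the command line to the display settings dialog, there are quick routes on both Windows and Linux. A minimal sketch - the output names DVI-1 and HDMI-1 below are only examples, and yours will almost certainly differ, so check the query output first:

```shell
# Windows: switch from a single desktop to an extended desktop
DisplaySwitch.exe /extend

# Linux (X11): list the connected outputs and their names
xrandr --query

# ...then extend the desktop, placing the second monitor to the right
# of the first (substitute your own output names from the query above)
xrandr --output HDMI-1 --auto --right-of DVI-1
```

Either way the effect is the same as ticking the "extend desktop" option in the GUI settings.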
For some reason I have yet to understand, PC and graphics card manufacturers don't ever seem to state that their creations can drive two or more monitors. Perhaps they think we take it for granted or perhaps they are too busy telling us about their wonderful GPU performance.
So what do you do if you look at the back of your current machine and discover that it has one lonely video connector?
The best solution is to get another PC - after all a change to something faster is the best efficiency improvement you can make.
If you reject this option then a second graphics card is your second route to multi-monitor. You could install a low-cost card with a single video socket and use it alongside the existing video hardware to provide two outputs. A better solution is to buy an almost-as-cheap, and easier to find, card that has two video connectors and simply disable or remove the existing graphics hardware. This has the advantage of simplicity.
If you don't want to buy new graphics hardware then you could invest in a USB to VGA/DVI/HDMI adaptor. When I first encountered the idea of USB to video it sounded like something to avoid. Now after many months of using such a device to provide two and three monitor systems I no longer avoid it. It still isn't my first choice but it has its uses.
You can plug the USB device into a machine and install the drivers in a few minutes. Reboot and you have a multi-monitor system that is quite capable of showing an app running on, say, the main screen and a debug or code window on the USB-generated second screen. I have never tried to use the USB video for high performance games - why would I? But I can report that in general use I have yet to detect any difference between the monitors in a mixed setup.
The USB-video approach has two particular uses that make it worth keeping a spare on the shelf. The first is that you can use it to create a three-monitor system as and when required as follows:
plug the USB adaptor in
steal a monitor from another desk
The same technique has also worked when a temporary two-monitor system is required.
What about the monitors themselves?
It is very nice to have a matched pair of monitors, but mainly this is because it looks good, neat and tidy. If you have two monitors of different sizes it can be a slight shock when windows change their size/resolution as you drag them between monitors, but it's not a huge problem as you will mostly be using windows fixed on their respective monitor.
One real issue is monitor height. Many low-cost wide screen monitors have very small pixel heights - 768 or 900 pixels. While you can argue that with two monitors the size of the individual monitors doesn't matter so much, height does. A wide screen monitor may be fine for watching movies, but when working with web pages, debug windows or apps the result can be vertical scrollbars, which can take the edge off any efficiency gains.