Persistent programming used to be the only way to do it. Today it's almost a forgotten approach, championed by enthusiasts and cranks, but it has a lot going for it.
It pays to be persistent
Ideas in computing have a habit of not doing quite what you expect. When someone claims that something is about to change everything it usually doesn't. Then when you are not paying much attention some other idea crashes onto the scene for no particular reason and changes everything. What, you might ask, has triggered this deep reflection?
The answer is hibernation. It's a neat trick and I'm not talking groundhogs or polar bears but computers. Even the dimmest user values the fact that they can hibernate their computer. Instead of turning the machine off and starting up again the next time with a completely clean and reset desktop environment, hibernation allows you to pick up from where you left off.
Most users have a range of applications and windows open that they refer to often and having to relaunch everything each time the machine starts is simply a pain that can be avoided. Of course there are disadvantages to this style of working. In particular preserving the machine's state can cause problems to slowly accumulate until instability demands a complete reset. However, most users seem to like to work this way and it has to be admitted that it seems natural to pick up where you left off.
Of course, in the days when machines had magnetic core memory, persistence was the norm. Any program that you loaded would remain in memory while the machine was turned off. Magnetic memory, being magnetic, didn't need power to retain its contents. So when you turned a machine on in the morning it was more or less in the state it was in when you switched it off the night before. Persistence was built into the hardware.
As magnetic core memory gave way to volatile static and dynamic solid state memory, volatility became the standard mode of operation and the machine became a blank slate each time it was switched off and back on.
What has all this got to do with programming?
The simple answer is that programming environments have almost universally become volatile, but once they were as persistent as core memory. If you have ever programmed in languages such as Logo or Smalltalk then you might have encountered a persistent environment. What happens is that you work in an IDE that accumulates code as you type it in and those chunks of code become part of the environment. That is, if you create a new function, for example Squared, then you can immediately use it by typing its name. There is a distinction between deferred mode and immediate mode commands that allows you to enter code that will be stored rather than acted upon, but the whole system accumulates what you do and grows as you work.
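You don't need an exotic language to get a feel for this. Here is a minimal sketch in Python, used purely as an illustration - the `squared` function, the "workspace" file name and the use of the standard-library `shelve` module are all my invention, a crude stand-in for a true Smalltalk-style image:

```python
import shelve

# A toy "accumulating workspace": what you define is kept in a shelf
# file, so it survives the interpreter being closed and reopened.

def squared(x):
    return x * x

with shelve.open("workspace") as ws:
    # "Immediate mode": use the new function right away.
    result = squared(7)
    # "Deferred mode": store the definition itself for later sessions.
    ws["squared_src"] = "def squared(x):\n    return x * x\n"
    ws["last_result"] = result

# A "later session": re-execute the stored source to recreate the
# function from the workspace rather than from a source file.
with shelve.open("workspace") as ws:
    namespace = {}
    exec(ws["squared_src"], namespace)
    assert namespace["squared"](7) == ws["last_result"]
```

The point of the sketch is only that the environment, not a source file, is the unit that grows as you work; a real persistent system does this transparently rather than via an explicit shelf.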
Most accounts of languages that have persistence tend to focus on their other more radical features - they are functional or dynamic - but persistence is just as radical an approach. Because we tend not to discuss it we also tend to miss the fact that this is yet another one of the big splits in the approach to language design. Languages can be persistent and persistent programming is a methodology - albeit one that is currently keeping a low profile.
Not functional programming
Notice that persistent programming isn't the same as functional programming. In functional programming you have immutable data - once a variable is set to a value it can never change. That is immutability, not persistence. In a persistent programming language, data, or more generally objects, exist for as long as they are wanted or needed, without having to invoke additional storage mechanisms such as backing store or a database. Objects simply exist and they survive events such as the development environment being closed or the machine being shut down and restarted... well, they do in theory.
Don't confuse persistent programming with object persistence frameworks like Hibernate. Such frameworks could provide a persistent environment but at the moment they don't. What they provide is an easy way of saving objects to a database so that they can be retrieved later. At this point we could also get involved, and confused, by object-oriented databases in general and even more confused by object relational mapping. However, all of these mechanisms come about, and are necessary, because true persistence is lacking in our development and runtime environments. Indeed, the fact that we distinguish between development and runtime environments at all is a clear sign that we haven't embraced persistence.
In a persistent environment the distinction between run time and design time is irrelevant. We simply create and use objects in an interactive environment. It's all run time and it's all design time. There is no need to think about a database, or indeed files, because everything is persistent. When you create an instance of an object representing a customer it sticks around - forever! Well, it persists for as long as you want it to. Even persistent environments need a delete or destroy command!
If you are thinking that persistence is simply inefficient and that we need the distinction between primary and secondary storage, then consider the fact that primary storage is only volatile because of a hardware accident. No programmer ever said "I need main memory to be erased every time the machine is switched off". Similarly, notice how many users, when given the choice, prefer hibernation to plain vanilla switching off.
When you first work with a persistent environment there is a surge of panic when you realise that everything you create hangs around. You start to worry that somehow you will accumulate so much clutter and noise that you just won't be able to create a clean program that actually works - and yes, you do start to long for a reset button that allows you to start a clean project. But, just as with your machine's desktop, you slowly begin to feel at home, and if any malfunction actually destroys your persistent environment then it feels as if you have been reset - personally.
Persistent programming is the forgotten paradigm, sidelined mostly because the hardware conditions the way that we think - and it doesn't deserve that fate.