Dates and times follow their own regularities, which have nothing at all to do with binary or even simple decimal counting. First, clock and watch makers had to find ways of working with hours, minutes and seconds; then programmers had to find much simpler ways of computing with them. Join us on a quick tour of the time and date system and how it can be mastered using the mod function.
Computers and programmers like regularity - it’s the exceptions that cause problems.
What could be more regular than our calendar and time-keeping methods?
After all, they are designed to provide a regular measurement of the passage of time. The complication is that we use a human-oriented system of measurement designed to tie in with the seasons and astronomical phenomena.
Perhaps in these days of bright streetlights and a general stay-inside lifestyle it would be easier to divide the year up into 10 months, each of 10 ten-hour days! Perhaps the minute should have 100 new seconds and there should be 100 minutes in a new hour!
Don't laugh just yet - there have been many serious proposals to decimalise time. It was even tried during the French Revolution but never caught on.
Then again perhaps even this isn't radical enough and we should throw it all away and start again with something like the decimal-point-based “star date” of Star Trek and other science fiction. After all, it wouldn't be the first time that we had changed to other units of measurements to make it easier for computers to deal with them.
However, the problem with dates and times goes much deeper than mere units. It lies in the very nature of the regularities that the system of measurement seeks to capture. So it looks as if computers have to learn to work with the existing system and, no matter how attractive decimal time sounds, it just isn't going to happen.
Making a mod of it
The whole date and time thing starts off with two simple regularities of life.
The first is that the earth rotates about its own axis once a day – or, more accurately, we call the time it takes for the earth to rotate a day. The second is that the earth orbits the Sun to define a period we call a year.
You could say that the day is all about time and the relationship between the day and the year is all about date. So starting off with time…
For reasons that have nothing to do with anything other than custom and practice we divide the day into 24 hours and an hour into 60 minutes and a minute into 60 seconds.
The only advantage of these strange numbers is that they are highly factorable, i.e. you can divide by 2, 3, 4, 6, 12 and so on and get answers that don't have fractional parts. As already mentioned, in principle we could decimalise time measurement at this level to have 10 hours in a day, 100 minutes in an hour and so on, but I doubt anything will ever come of the idea - mostly because 10 isn't very factorable.
In practice working with time is fairly easy as long as you always convert to the smallest unit. Indeed this is the basic principle of time and date arithmetic in all computer systems.
For example, if you want the difference between 3 hours 2 minutes 1 second and 1 hour 5 minutes 10 seconds, simply convert the whole lot to seconds and take the difference. Whatever you do, don't attempt to emulate the way humans compute with times by subtracting the seconds and minutes as separate operations with carries in different bases.
Converting to seconds is just a matter of repeated multiplication by 60:
answer = 3*60*60+2*60+1-(1*60*60+5*60+10)
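The same calculation can be sketched in Python; the helper name `to_seconds` is just an illustration, not a standard function:

```python
def to_seconds(hours, minutes, seconds):
    """Convert an h:m:s time to a total number of seconds."""
    return hours * 60 * 60 + minutes * 60 + seconds

# Difference between 3:02:01 and 1:05:10, entirely in seconds
diff = to_seconds(3, 2, 1) - to_seconds(1, 5, 10)
```
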
Of course you now have the problem of converting back to hours, minutes and seconds but this is where the Mod function comes in.
The basic idea is that the Mod function gives you the remainder when you divide by something. For example Mod(7,3) or 7 % 3 is 1 because when you divide 7 by 3 it goes twice with remainder 1.
The only other tool you need is the Int function (variously called trunc, floor etc. in other languages), which simply chops off any fractional part a number might have.
To convert T in seconds into hours, minutes and seconds you use the following:
seconds = Mod(T,60)
minutes = Mod(Int(T/60),60)
hours = Int(T/(60*60))
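In Python the same three steps can be written with the `%` and `//` operators standing in for Mod and Int, assuming T is a whole number of seconds (the function name `to_hms` is made up for this sketch):

```python
def to_hms(T):
    """Split a total number of seconds T into (hours, minutes, seconds)."""
    seconds = T % 60            # Mod(T, 60)
    minutes = (T // 60) % 60    # Mod(Int(T/60), 60)
    hours = T // (60 * 60)      # Int(T/(60*60))
    return hours, minutes, seconds
```

For example, the 7011-second difference computed earlier comes back as 1 hour, 56 minutes and 51 seconds.
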
[Image: The Persistence of Memory by Salvador Dalí]
Having to convert to the smallest unit of time and then back to hours, minutes and seconds is tedious and error prone.
As a result most spreadsheets and programming languages provide a date/time facility, which works in a slightly different way.
In this case the time is recorded in days and fractional days. That is, 1.5 days is the same as 36 hours - 24 hours (one day) plus 12 hours (half a day). As long as the fraction is recorded to sufficient precision this works just as well as working in seconds.
The only real problem is that one hour is 0.041666 recurring, but you can generally avoid having to type in such numbers by using 1/24 for one hour, 1/(24*60) for a minute and 1/(24*60*60) for a second.
The big advantage of working in days and fractions of a day is that you can calculate time differences without having to do any conversions – you simply use subtraction. If you have two dates D1 and D2 in day, fractional day format then the time between them is just D1-D2.
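A quick sketch of the idea, with two made-up day-plus-fraction values:

```python
D1 = 2.75  # day 2 at 18:00 (0.75 of a day = 18 hours)
D2 = 1.5   # day 1 at 12:00 (0.5 of a day = 12 hours)

diff_days = D1 - D2           # the difference, still in days
diff_hours = diff_days * 24   # 1.25 days = 30 hours
```
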
Of course if you want to show the time in hours, minutes and seconds you do need to convert, but it's easy using the same techniques used to convert seconds:
days = Int(T)
hours = Int((T-days)*24)
minutes = Int((T-days-hours/24)*24*60)
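The conversion can be put into Python form; here the whole fractional part is turned into seconds first and then split up with `//` and `%`, which avoids piling up floating-point error at each stage (the function name is just for this sketch):

```python
def day_fraction_to_hms(T):
    """Convert a day-plus-fraction value T into (days, hours, minutes, seconds)."""
    days = int(T)
    frac = T - days
    total_seconds = round(frac * 24 * 60 * 60)  # round guards against float error
    hours = total_seconds // 3600
    minutes = (total_seconds // 60) % 60
    seconds = total_seconds % 60
    return days, hours, minutes, seconds
```

For example, 1.5 days comes out as 1 day, 12 hours, 0 minutes, 0 seconds.
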
Fortunately you generally don’t have to do this because most programming languages and spreadsheets support a date/time data type which uses fractional days and provides automatic support for conversion.
For example, in Excel the Time(hours,minutes,seconds) function converts a time to a fractional day, e.g. Time(12,0,0) is 0.5.
In most cases displaying a fractional day value as a time is just a matter of setting the correct format on the cell – and this usually happens automatically anyway.
Other languages have similar functions and facilities.
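As one concrete case, Python's standard datetime module does the unit juggling for you: subtracting two datetime values gives a timedelta, which can report the difference in seconds. The dates below are arbitrary, chosen only to reproduce the 3:02:01 minus 1:05:10 example from earlier:

```python
from datetime import datetime

t1 = datetime(2024, 1, 2, 3, 2, 1)
t2 = datetime(2024, 1, 2, 1, 5, 10)

delta = t1 - t2                 # a timedelta object
print(delta.total_seconds())    # prints 7011.0
```
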