Down with the leap second


Recently there was a big fuss (including denouncements from many I know) over a U.S. effort to do away with the leap second. People claimed this was like trying to legislate pi to be 3. I am amazed at the leap to the defense of the leap second. I would be glad to see it go. All our computers keep track of time internally as a number of seconds since some epoch, typically Jan 1 1970 or 1980. They go through various contortions to turn that absolute time into the local time. This includes knowing all the leap-year calculations and the leap-second calculations. It's complicated by knowing that sometimes the day is Feb 29, and by knowing that a very, very few minutes have 61 seconds in them (or if you prefer, that a very few hours have 3601 seconds and rare days have 86401 seconds.)
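As a small illustration of the point, here is a sketch (Python standard library; the timestamp value is just an arbitrary example I chose) of how little a program normally has to do: store one integer of epoch seconds and let a library handle the calendar contortions on the way out. Note that POSIX epoch time itself pretends every day has exactly 86400 seconds, which is exactly why leap seconds are such an awkward special case.

```python
# Computers store an absolute count of seconds since the Unix epoch
# (Jan 1 1970 UTC) and convert to a calendar date only on demand.
# The timestamp below is an arbitrary illustrative value.
import datetime

epoch_seconds = 951_782_400  # one integer holds the whole instant
utc = datetime.datetime.fromtimestamp(epoch_seconds, tz=datetime.timezone.utc)
print(utc.isoformat())  # 2000-02-29T00:00:00+00:00 -- leap-day math done for us
```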

That's a mess. A minute should always have 60 seconds. Special casing all time code to deal with this was the wrong approach, and as noted, is subject to errors because the code is very rarely tested in that state.

I'm astounded to see people saying this is the same as declaring pi to be 3. It's having 86400 seconds in most days and rare leap seconds that is the integerization of a real number. The truly scientific approach would be to declare the day to be 86400.002 seconds, and lengthen that number over the centuries, would it not?
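A quick sanity check of that number (taking the 86400.002 figure above at face value): a 2 ms daily excess compounds to roughly three-quarters of a second per year, which is in the same ballpark as the historical rate of leap-second insertions.

```python
# Back-of-the-envelope: how fast would clocks drift from the sun if a
# mean day really ran 86400.002 SI seconds?
excess_per_day = 86400.002 - 86400.0      # ~2 ms of drift per day
drift_per_year = excess_per_day * 365.25  # days in an average year
print(drift_per_year)  # roughly 0.73 seconds per year
```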

Astronomers, like computers, can and should keep track of time as an absolute number of seconds since some epoch. They actually care very little about what the local time is other than to know when it's dark, something leap seconds have insignificant bearing on. Indeed, astronomers might be happiest using sidereal time (where a day is 23 hours, 56 minutes, 4.1 seconds, the true rotational period of the Earth.)
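In seconds, the sidereal day quoted above works out like this (a trivial check using the 23h 56m 4.1s figure from the text):

```python
# Convert the quoted sidereal day to seconds and compare with a solar day.
sidereal_day = 23 * 3600 + 56 * 60 + 4.1  # 23h 56m 4.1s -> about 86164.1 s
solar_day = 86400.0
print(sidereal_day)              # about 86164.1
print(solar_day - sidereal_day)  # the ~236 s the Earth "catches up" each day
```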

Our system of time is not one scientists would pick in the first place. It is clearly designed for the convenience of ordinary people, and is the legacy of the traditional means of telling time. It's silly to use this legacy system and at the same time demand the general public and its timekeeping systems jump through error-prone hoops to make it reflect noon correctly to the second. Nobody even uses true local time anyway; they all use a time zone. The time zone is off by a huge margin from local solar time, so why does it matter in the slightest if it's off by a few more seconds?

In many centuries, the drift will be noticeable. If we still care about local time, we can fix it then.


It somehow seems backward to force the real world to conform to lazy programmers, rather than the other way around.

This is largely a philosophical point of view, since the practical difference in time of not having a leap-second is so slight.

HOWEVER.....given that very few programmers actually write time routines (most grab the time from a library call), and that computer-controlled telescopes are now a high-end consumer item, it may be that more people would be inconvenienced by the lack of a leap second than would be helped.

The problem is leap seconds are announced by a committee. There is no schedule of what years will have leap seconds going into the future. This means your time software would have to get updates of which years have them.
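To make that concrete, here is a minimal sketch (in Python) of the kind of table such software has to carry and keep updated. The dates and offsets are a few real entries from the published leap-second list, abridged here for illustration; there is no formula that produces future entries, so the table only grows by announcement.

```python
# A lookup table of past leap seconds (abridged to a few real entries;
# the full list is published by the IERS and has no future schedule),
# mapping each date to the cumulative TAI-UTC offset in effect from then on.
import bisect
import datetime

# (date from which the offset applies, TAI - UTC in whole seconds)
LEAP_TABLE = [
    (datetime.date(1972, 7, 1), 11),
    (datetime.date(1973, 1, 1), 12),
    (datetime.date(2015, 7, 1), 36),
    (datetime.date(2017, 1, 1), 37),
]

def tai_minus_utc(d: datetime.date) -> int:
    """Return TAI-UTC for a date covered by the (abridged) table."""
    i = bisect.bisect_right([row[0] for row in LEAP_TABLE], d)
    return LEAP_TABLE[i - 1][1] if i else 10  # offset was 10 s before July 1972

print(tai_minus_utc(datetime.date(2016, 6, 1)))  # 36
```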

I haven't checked them all but I suspect some time software just kludges it and doesn't account for the leap seconds, and the NTP daemon slowly corrects the clock to UTC.

It's a tough situation. The above solution (slow correction by skewing the clock) works fine for most people, but would be death for those who truly need accurate timekeeping -- like astronomers. Having the software get live updates is messy and requires tables for all of history to maintain a true seconds clock. It also means that your date software, when asked what second it is, will on that magic day return "60" -- when most programs are expecting a number from 0 to 59. It's been known to break stuff.
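The "second 60" hazard is easy to demonstrate with Python's own standard library: the low-level time module will parse a leap second, but datetime, like most application code, insists that seconds run 0 to 59.

```python
# Demonstration of the "second 60" hazard: time.strptime tolerates a leap
# second (%S accepts up to 61), but datetime.strptime rejects it outright.
import time
import datetime

stamp = "1998-12-31 23:59:60"  # the real leap second at the end of 1998
parsed = time.strptime(stamp, "%Y-%m-%d %H:%M:%S")
print(parsed.tm_sec)  # 60 -- struct_time allows it

try:
    datetime.datetime.strptime(stamp, "%Y-%m-%d %H:%M:%S")
except ValueError:
    print("datetime refuses a 61-second minute")  # seconds must be 0..59
```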

Leap seconds are generally only inserted (or, at least in theory, removed) in the last minute of the last hour of the last day of December or of June (although any month is possible), and of course they are there to align our clocks with astronomical time.

Astronomers are very interested in real and accurate time, so that they know where they're looking for things in the sky at any instant in time. What reference do we have for stars, if we can't relate their location to a particular angle from a location at a particular time of day on any specific day? And seconds from an epoch doesn't cut it. You simply can't say that a celestial event (e.g. a particular star passing a particular spot in the sky) occurred at 4:21:11.385 one year, but at 4:21:16.278 another year: we can't correct to the ultimate accuracy, but we can adjust our clocks to within a few milliseconds of correctness. They would have to make the leap-second adjustment anyway. And those with more specialized needs have the capability to correct even more accurately.

If we wanted to define time to please us geeks, surely we'd do better than to have 86,400 seconds in a day, or to define a second as so many oscillations of a cesium atom.

And it's not like an accurate clock is that difficult, relative to other solved problems. If leap seconds are causing grief, there's no hope for correct handling of daylight saving time. As for keeping track of leap seconds, there is room for a network service in concert with NTP serving up a list of all leap seconds to date. And if a software designer doesn't understand that some minutes have 61 seconds, I would guess they have other serious bugs in their software too.

Quote: The truly scientific approach would be to declare the day to be 86400.002 seconds, and lengthen that number over the centuries, would it not?


The length of an Earth day is not stable, nor is the increase stable. The most important fact is that it is not predictable. Would that it were. If your universal day standard were adopted, there would still be a need to add or subtract leap seconds over the years as the Earth matched or missed the estimate.

The tiny changes in the length of the Earth's day vary enough to matter (to some people) each day. The nuclear missiles the ballistic missile submarines fire use a star correction in flight to update their targeting. They need to know precisely where that star is in order to determine precisely where they are in reference to their programmed flight, to calculate where their intended target is and what they need to do to get there.

There are observatories that measure the exact length of a day, and provide a correction value for the UT, called UT Correction, or UTC. It's entered in the computers for the missiles twice a day. UT plus (or minus) UTC equals real-real UT.

The purpose of implementing a leap second to the UT is to reduce the size of the UTC. The smaller the error, the more accurate the correction will be.

As we need a dynamic correction, it makes the most sense only for those who truly need these corrections to track them, and not to make all the clock users (i.e. the rest of the world) who don't need them have more complex, dynamically updatable clocks. Astronomers and missiles should use a correction in any event (and in most cases, really for sidereal time, not solar time.) A working leap second system requires either immense amounts of manual correction, or an ability to securely send live correction signals to all the clocks of the world that want to be accurate.
