All I want for Christmas is a negative leap second

I just want to see it. Just once. I want to watch that earthquake ripple through all of global electronic timekeeping. I want to see which organisations make it to January morning with nothing on fire.

Leap seconds bridge the gap between two clocks of wildly differing precision: caesium fountains and the spinning Earth in space. The time it takes for the Sun to return to the same spot in the sky, even smoothing out seasonal variations, is not precisely twenty-four hours, zero minutes and zero SI seconds. It varies. UT1 is derived from the spinning rock. If the (mean) Sun is directly overhead at the Greenwich Meridian, it's 12:00:00 by definition. This is social time. It's warm and fuzzy. It's the clock we actually care about. TAI is derived from atomic clocks which live inside buildings and don't care about where the Sun is or what. TAI is abstract and arbitrary and artificial and better. UT1 is running about 37 seconds behind TAI right now and as time goes on and the Earth's rotation keeps slowing that gap is only going to get larger. UTC is a middle ground which ticks at precisely the same predictable rate as TAI — good for computers — but also stays synchronised to within 0.9 seconds of UT1 by occasionally inserting an SI leap second. This makes UTC good for civil timekeeping too. Ish.
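The relationships between the three time scales can be sketched numerically. The 37-second figure is the real TAI−UTC offset as of this writing (it grows by one with every positive leap second); the DUT1 value below is made up for illustration — real values come from IERS bulletins.

```python
from datetime import datetime, timedelta, timezone

# Sketch of the three time scales. TAI - UTC = 37 s is current but will
# change at the next leap second; DUT1 here is illustrative, not a live value.
TAI_MINUS_UTC = timedelta(seconds=37)
DUT1 = timedelta(seconds=0.05)   # UT1 - UTC, kept within +/-0.9 s by leap seconds

utc = datetime(2024, 7, 1, 12, 0, 0, tzinfo=timezone.utc)
tai = utc + TAI_MINUS_UTC        # atomic time runs ahead of UTC
ut1 = utc + DUT1                 # Earth-rotation time, wobbling around UTC

print("TAI:", tai.time())        # 12:00:37
print("UT1:", ut1.time())        # 12:00:00.050000
```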


You know what a leap second is. The short version is that planet Earth is a terrible clock.

I love leap seconds. I love the unsolvable problem which birthed leap seconds, I love the technical challenge of implementing leap seconds, I love that they are weird and delightful and that they solve a problem, and I love that this solution is hugely irritating to a huge number of people who have more investment in and knowledge of time measurement than I do. It is a huge hassle to deal with leap seconds and I love that there is no universal agreement on how to deal with them. What should Unix time, for example, do during a leap second? Unix time is a simple number. There's no way to express 23:59:60. Should it stall for a second? Should it overrun for a second and then instantaneously backtrack and repeat time? Should it just blank out and return NaN? These days it seems like a popular choice is the Google-style 24-hour linear smear from noon to noon UTC. That is: a full day of slightly longer-than-normal "seconds". Should a clock do that? Is a clock allowed to do that? I love that. I think it's highly amusing.
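The noon-to-noon smear can be sketched in a few lines. This is a toy model of the idea — not Google's actual implementation — pinned, for concreteness, to the real positive leap second at the end of 2016-12-31: 86,401 SI seconds elapse between the two noons, but the smeared clock reports only 86,400 slightly stretched ones.

```python
# Toy sketch of a 24-hour noon-to-noon leap smear (not Google's real code).
# "true" time below means SI seconds elapsed, counting the inserted second.
SMEAR_START = 1483185600.0   # 2016-12-31 12:00:00 UTC as a Unix timestamp
REAL_LEN = 86401.0           # SI seconds elapsed noon to noon, incl. the leap
SMEARED_LEN = 86400.0        # what the smeared clock reports for that span

def smeared_time(true_unix):
    """Map true elapsed seconds onto a smeared clock that never shows
    23:59:60: during the smear window each reported second is ~11.6 us long... 
    rather, ~11.6 ppm longer than an SI second."""
    elapsed = true_unix - SMEAR_START
    if elapsed <= 0:
        return true_unix                      # before the smear: clocks agree
    if elapsed >= REAL_LEN:
        return true_unix - 1.0                # after: one second behind the SI count
    return SMEAR_START + elapsed * SMEARED_LEN / REAL_LEN
```

At the midpoint of the smear the reported clock has fallen about half a second behind, and by the end it is exactly one second behind — which is where plain Unix time ends up anyway, with no 23:59:60 ever shown.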

But they're talking about giving up on leap seconds entirely by 2035. Rather than grapple with the organisational challenge and the engineering challenge, the suggestion is to just increase the allowable absolute gap between UT1 and UTC, from 0.9 seconds to something larger. And I'd be a little sad if we did that.

For one thing, it feels like a confession of defeat. Like, come on. Weren't you having fun? You've reached a situation where campaigning for the whole world to change the way it measures time is simpler than fixing your code?

Well, maybe it is simpler. I haven't seen your code.

For another thing, this is not something we can run away from. Sure, leap seconds flummox our software now, but they are at least relatively common. Up until the recent dry spell they were happening more frequently than leap years and over the longer term, as the Earth's rotation gradually slows, they're only going to become more frequent. The suggested alternate approach is to just let the gap between UTC and UT1 grow for 50 to 100 years until it's something on the order of 60 seconds, and then introduce an entire leap minute. Because there's nothing computer programmers handle better than special cases which only occur every hundred years or so. How in the world could this be an improvement? Leap seconds are announced six months in advance. They have been a fact of life for over fifty years. How much advance warning will the leap minute require, or receive? Five years? How much special one-off work will it take to handle? Is there a single system currently in existence which can handle a leap minute?

This just sets us all up for a mostly arbitrary Millennium Bug-scale operation. It's going to be unrewarding and supremely unpopular and everybody's going to hate it at least sixty times as much as they hate leap seconds now. This, I feel, is the wrong approach. If something is difficult, you do it more often.

Also, it's not lost on me that a solution of "Wait 50+ years and then do something" is equivalent to "Let's ignore the problem until we're all dead".

But finally and most importantly, abolishing leap seconds would permanently close the door on any possibility of a negative leap second.

Every leap second to date has been positive. The Earth spins a little slowly, just as a matter of course. In fact, there's a case to be made that the SI second was defined too short. And over the long term, its spin is only going to continue to slow. However, there are short-term fluctuations in the Earth's angular velocity, both increases and decreases. It's not fully understood what causes these, but since 2018 or so the Earth has been spinning faster than usual. The gap between UT1 and UTC, known as DUT1, hasn't been steadily trending negative. There hasn't been a need to introduce a positive leap second every year and a half or so. In fact, since 2020, DUT1 has been gradually trending upwards. If it continues to trend upwards in the same way for another decade or two, it is not completely impossible that the offset could reach a point where the IERS could credibly want to use a negative leap second to bring the offset back down towards 0 seconds.

Would that ever happen?

Under the current system, only full single SI seconds are inserted (or, potentially, removed), and only at the end of June and at the end of December. But before 1972 UTC and TAI were kept in much closer synchronisation, by adding or removing fractions of seconds, and by doing so far more frequently, at the end of any month. On two occasions, time has been removed:

1961-08-01: −0.050 seconds
1968-02-01: −0.100 seconds

(No, I'm not advocating returning to this state of affairs.)

So there is a smidgen of precedent. But it has been a long, long time. Negative leap seconds are possible, but they have never happened, and until recent years it seemed like they would never happen. And I feel extremely confident in saying that, short of an apocalyptic event, a negative leap minute could never, ever happen.

I feel that the IERS would be under tremendous pressure not to announce a negative leap second, because, well, look at how fiercely actual engineers are currently fighting positive leap seconds, which have occurred dozens of times over the past half-century, and are, relatively speaking, incredibly well-understood and extremely well-supported.

The intention is to keep DUT1 between -0.9s and +0.9s. Typical positive leap seconds in the past have occurred at DUT1 offsets between about -0.4 and -0.6 seconds — sometimes earlier, between -0.2 and -0.4 seconds, when the Earth's rotation was relatively slow, with the earliest at -0.2160228 seconds. If DUT1 were to trend upwards to +0.2160228s, I feel that the IERS would probably continue to wait. I wouldn't be surprised if they waited until +0.6s. I wouldn't be surprised if they were pressured to wait until +0.8s, or until the actual abolition kicks in, whenever that is eventually scheduled. It'll be pretty imminent, I'd imagine.

But I also don't know how much of that theoretical pressure the IERS would theoretically care about.

The point is:

No one, or almost no one, is ready for a negative leap second. It has always been a thing which can, in theory, happen. Up until the past few years, though, it looked impossible. And now it might, might just happen. And I need to see it.

Update, 2024-07-04

Dang.

Well, there's always next year.

Discussion (19)

2024-07-02 22:50:48 by qntm:

Love a bit of geodesy.

2024-07-03 01:49:19 by DSimon:

What should unix timestamps do during leap seconds? Absolutely nothing. The leap second should only matter when converting unix time to other calendar formats. Same policy as leap years and DST. Or at least, that's what I would say if I could go back in time and change the POSIX definitions... And also, everyone would have to remember not to store schedules for distant future events as timestamps since they might not occur when predicted... Oi, time is hard :-(
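DSimon's scheme — a raw count of SI seconds, with leap seconds applied only at conversion time via a table — can be sketched as below. The table is a deliberately tiny excerpt (three recent, real leap seconds) with its count aligned to Unix time before the first entry, purely for illustration; a real implementation would carry the full official list since 1972, as the tz database's "right/" zones do.

```python
from datetime import datetime, timezone

# Unix timestamps of 00:00:00 UTC immediately after three real leap seconds.
# Illustrative only: the SI count here starts agreeing with Unix time before
# 2012, which is not how the tz "right/" zones align their count.
LEAP_UNIX = [1341100800, 1435708800, 1483228800]  # 2012-07, 2015-07, 2017-01

def si_to_display(si):
    """Convert a leap-inclusive SI-second count to a calendar string,
    including the otherwise-inexpressible 23:59:60."""
    offset = 0
    for i, lu in enumerate(LEAP_UNIX):
        if si == lu + i:
            # we are sitting inside the inserted second itself
            day = datetime.fromtimestamp(lu - 1, timezone.utc)
            return day.strftime("%Y-%m-%d 23:59:60")
        if si >= lu + i + 1:
            offset = i + 1  # this leap second has fully elapsed
    return datetime.fromtimestamp(si - offset, timezone.utc).strftime("%Y-%m-%d %H:%M:%S")
```

On this toy table the count 1483228802 renders as 2016-12-31 23:59:60, and 1483228803 as 2017-01-01 00:00:00 — the leap second exists only in the display layer, exactly as proposed.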

2024-07-03 02:01:43 by Max:

Leap seconds are stupid. Nobody except astronomers cares about solar time. Just stop using them, if the difference ever becomes greater than 30 minutes, shift all timezones by 1 hour, which is something we do twice a year anyway, so most software supports it already. No special handling necessary.

2024-07-03 04:16:36 by ChrysAlice:

"Leap years are stupid. Nobody except astronomers cares about solar time. Just stop using them, if the difference ever becomes greater than 30 days, shift all calendars by 1 month, which is something we do twelve times a year anyway, so most software supports it already. No special handling necessary." I want to say that I'm not trying to be mean, but if you've had to deal with situations where the exact second or smaller something happened matters, it seems far less dumb.

2024-07-03 11:11:01 by qntm:

I care about astronomers!

2024-07-03 14:06:05 by Max:

I didn't say exact time doesn't matter, I said leap seconds and solar time don't. TAI already tells you the exact time something happened with maximum accuracy. Leap seconds intentionally make it *less* accurate!

2024-07-03 14:06:45 by Andrew:

I fully support ending leap seconds, and there's no need for a leap minute either. Countries intentionally set their civil time wrong by an entire hour or more, sometimes making the error depend on a complicated set of rules involving what time of year it is. So what's the point trying to get it right within a second or a minute? Just let DUT1 accumulate until it gets close enough to an hour that they decide to shift, and then make a change to the Olson DB (which will definitely still exist in 10,000 years). The effect is that the "UTC meridian" simply drifts west over time. People who care about exactly when something happens keep counting SI seconds, just like they do today, and the astronomers keep consulting IERS for the difference between UTC and UT1, just like they do today. The only thing that changes is that the difference between UTC and TAI stays 37 seconds forever.

2024-07-03 14:10:35 by Max:

qntm: admit it, you only wrote this article because of the URL ;)

2024-07-03 18:22:21 by Kevin:

The decision to define computer time (in the Olson database, POSIX, etc.) as "UTC + offset" was a mistake. It should have been defined as "TAI + offset," with UTC treated as "just" another time zone. TAI was explicitly designed to have the nice monotonic properties that computers want to have, and all the existing time zone machinery is entirely capable of handling irregular (not a multiple of 1 hour) offsets (because there are several current and historical time zones with irregular offsets).

It's mostly too late to change now (short of abolishing leap seconds altogether), but for some time, the Olson database has had a separate subdirectory called zoneinfo-leaps (previously "right") which assumes that your system time includes leap seconds in violation of POSIX, and provides a suitable set of offsets that adjust for those leap seconds (including a UTC time zone with nonzero offset). In principle, it is then possible to have a system that works as I described above. Obviously, there are problems with this approach, mostly of the form "software wrongly assumes that your clock is on UTC and fails to apply the leap second offset," but there are also administrative issues, like "how are you going to find an NTP pool that is compatible with this?" and "you know you're still going to have to speak UTC to every other system on the planet, and that's still going to have leap second infelicities, right?"

As for negative leap seconds? I'm not convinced this would be anywhere near as big a deal as positive leap seconds. Positive leap seconds cause Unix time to (appear to) go backwards, which is "never" supposed to happen. Negative leap seconds would cause Unix time to jump forwards... but anything that interacts with the network already has to deal with random unpredictable latency spikes, so for distributed systems, this would be a mostly unremarkable event. You would probably see some requests time out or fail and have to be retried, I guess, but a robust distributed system should already be aware of this failure mode and should have recovery code for it anyway. I wouldn't want to be the SRE stuck holding the pager on that particular day (most distributed systems get upset and notify a human when they see 100% of requests suddenly fail all at once), but I think the average consumer would not even notice.

2024-07-03 18:56:36 by Solis:

"You've reached a situation where campaigning for the whole world to change the way it measures time is simpler than fixing your code?" For the benefit of navigation, we changed the way mariners measured time (and it's a cool story!). Then, for the convenience of rail-way operators, we changed the way the whole world measured time. We changed it again, in a significant portion of the world, ostensibly to save on energy—but maybe just because William Willett was annoyed by dusk interfering with golf games. One can make a legitimate case that measurement systems should serve humanity, rather than the other way around, at least till we meet non-human sophonts. I'm in Ottawa, Canada, where Stellarium says noon was at 13:07:11 today (find and select the sun, and press F10 to bring up "astronomical calculations"; then go to the "RTS" tab, press "calculate", and read the "transit" column). It'll have its latest value of 13:09:20 in a few weeks, and its earliest of 11:46:20 in early November. The variance of solar noon across China is pretty wild by comparison. So the premise that our quasi-annual adjustment of one second is to make clock-time substantially match solar-time seems somewhat farcical. It'd be millennia before we could gradually accumulate enough error to match what was done instantly by legislative fiat—and over such time-scales, people would just get used to the sun rising at 17:00 or whenever. That said, I do share your sense of curiosity: I always love reading reports on the myriad and fascinating ways in which systems have failed (but less so when I've inherited maintenance of such systems or am otherwise affected by them).

2024-07-03 19:14:03 by Nim:

I agree that leap seconds are awesome, and I'm a big fan. I can also see that they pose a problem for civil time. The proposed solution, however, is fundamentally wrong. If using a time scale that is somewhat irregular poses a problem, use one that is regular, such as TAI. Re-defining an existing time scale (UTC) is a form of scientific betrayal. It makes me wanna spit and puke. It really does.

2024-07-03 23:17:44 by qntm:

> I think the average consumer would not even notice. Well let's be honest, the average consumer has never noticed a positive leap second either. But that's not who this is about, really.

2024-07-03 23:49:55 by Solis:

The proposed solution of having leap minutes or hours is widely considered something of a trojan horse. The metrologists advocating for it probably don't want any such "leaping", but they're representing governments that insist on some link between civil and solar time. So they say, okay, we'll adjust the clock every hundred or thousand years; till then, the error will be no worse than being off-centre in a time zone. By the time there's a problem to be solved, they'll be long-dead, nobody will care much about the drift, and the whole thing will be called off. Of course, some effort will have been wasted on strict standards-compliance and testing for this event that will never happen. Nim, other than the dishonesty of it, do you see a specific problem? Metrologists re-define units all the time. The second was re-defined in 1967; the kilogram in 2019 (though not with that awesome-looking hand-polished silicon sphere). UTC itself has been re-defined several times already. A lot of effort goes into making sure that such changes are seamless—whereas switching everything over to TAI, with an immediate 37-second clock jump, would not be. Also, TAI is not entirely "regular" anyway, being based on a retrospective average of hundreds of clocks. DSimon, your message is self-contradictory. You say Unix timestamps should do "Absolutely nothing" during a leap second. But stopping the clock for that second would make the leap second matter quite a lot when not converting Unix time to another calendar format. For example, some program would measure an elapsed time of precisely zero, and then crash because it divided by zero. Maybe all video playback on earth would pause for that second.

2024-07-04 05:38:20 by Quilnux:

You know, it's going to be very interesting when software has to be capable of supporting multi-planet time. Mars may not have the same time as Earth for example. It'll be a similar issue to leap seconds, minutes, hours, etc. I would expect.

2024-07-04 06:42:46 by Wei:

I work in climate modelling, and while I don't think that perspective is really that important for deciding on modern time standards, I do think it is amusing. As I understand it, our model always assumes 86400 seconds per day, but does support both Gregorian and Julian calendars, as well as a third "No-leap" calendar for simulations where it's acceptable to treat the Earth as not needing leap years because years are exactly 365 days long. In the rare case where someone wants to simulate, say, the Cretaceous climate, it's usually assumed that the difference in day length is much less significant than our other uncertainties about climate forcings back then. But it's also a perspective from which it's easy to sneer at other applications. "What, you thought the Earth was stable and simple and rigidly periodic? Let me tell you about our ridiculous planet..."
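The three calendar conventions Wei describes differ only in their leap-year rule, so they can be sketched in a few lines (function names are mine, not the model's; all three conventions share the model's 86400-second day).

```python
def is_leap_year(year, calendar):
    """Leap-year rule under each convention: 'noleap' fixes every year at
    365 days, 'julian' adds a day every fourth year, and 'gregorian' adds
    the century exceptions on top of that."""
    if calendar == "noleap":
        return False
    if calendar == "julian":
        return year % 4 == 0
    if calendar == "gregorian":
        return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
    raise ValueError(f"unknown calendar: {calendar}")

def days_in_year(year, calendar):
    return 366 if is_leap_year(year, calendar) else 365
```

The year 1900 is where Julian and Gregorian visibly disagree: a leap year under the former, not under the latter.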

2024-07-04 15:30:11 by DSimon:

Solis, by "absolutely nothing", I meant that it shouldn't do anything *special*, not that it should literally freeze. That is to say, in my ideal world, UNIX timestamps would literally just be the interval in SI seconds since the epoch, calendars and leaps be damned. I don't think this would actually be a good idea in a world with leap seconds, because timestamps also need to represent future events. But it's satisfying to think about.

2024-07-04 18:32:28 by Solis:

DSimon, that would make sense; and, even in a world with leap seconds, I think it should have been the behaviour. Are "future events" really a problem? Programmatic events should mostly be using absolute CLOCK_MONOTONIC times; POSIX-2024 will finally add pthread_mutex_clocklock(). Future events for calendars and such need to operate with "user-friendly" time scales, which mostly preclude UTC and TAI. If I schedule a party for 21:00, many months in the future, and my government decides to change daylight-saving time before then, or change the time zone—which some have done with less than 2 weeks' notice—the calendar entry probably still needs to occur at 21:00 local time. Unless I'm an astronomer, of course, but I probably still wouldn't want leap-second adjustments in that case: wouldn't it suck if my astronomical photograph were off by a second because a leap second occurred after I scheduled it? Linux has CLOCK_TAI, which I think mostly works as you and I want, provided the system is running chrony or ntpd (which use ntp_adjtime() with MOD_TAI to set the system's UTC-to-TAI offset). Too bad it can't be used for file modification times and such. And if I recall correctly, it returns UTC if the system offset hasn't been set, so one should probably do a sanity-check before treating its value as actually being TAI.
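The sanity check Solis mentions can be sketched directly: on Linux, CLOCK_TAI should lead CLOCK_REALTIME (UTC) by the current TAI−UTC offset (37 seconds as of 2024) once chrony or ntpd has told the kernel about it, while an offset of zero almost certainly means nothing ever set it and "TAI" readings are really UTC. A minimal sketch, assuming Python's `time` module on a Linux system:

```python
import time

def tai_utc_offset():
    """Return the kernel's TAI-UTC offset in whole seconds, or None if
    CLOCK_TAI is unavailable (e.g. on non-Linux systems)."""
    if not hasattr(time, "CLOCK_TAI"):
        return None
    tai = time.clock_gettime(time.CLOCK_TAI)
    utc = time.clock_gettime(time.CLOCK_REALTIME)
    return round(tai - utc)

off = tai_utc_offset()
if off == 0:
    # Kernel offset never set: CLOCK_TAI is silently identical to UTC.
    print("CLOCK_TAI offset unset; treat readings as UTC")
```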

2024-07-16 20:36:48 by FormerlyFormal:

Leap seconds bring me joy in the same way that historical time anomalies do. Every one has a lovely story about it. The canonical anomaly is the switch from the Julian to the Gregorian calendar. Every country that existed at the time went through it. But one country stands alone. “Sweden's transition to the Gregorian calendar was difficult and protracted.” Behind this innocuous line lies a truly evil, beautiful story of horrific mismanagement: “In November 1699, the Government of Sweden decided that, rather than adopt the Gregorian calendar outright, it would gradually approach it over a 40-year period.” Hell of a start. With a brief like that, what could possibly go wrong? https://en.m.wikipedia.org/wiki/Swedish_calendar

2024-07-26 07:59:16 by Sundiata:

didn't know the leap second itself was going to be retired by next decade.
