It's frustrating that programmers want to redefine civil time just because it is "hard". This article glosses over the real world problems that detaching from UTC will cause.
(You may want to scroll down to "Implementing the plan outlined at Torino".)
If we end leap seconds, it doesn't take long - only until 2028 - until "midnight" is sufficiently far from "the middle of the night" that you will have to consider the legal issues caused by events that happen just before or after 0000 hours.
By 2055, the "minute" displayed on a clock may be incorrect, which again may cause issues with legal timestamps.
And by 2083, sundials are measurably wrong.
All because programmers wanted to save some lines of code.
> It's frustrating that programmers want to redefine civil time just because it is "hard". This article glosses over the real world problems that detaching from UTC will cause.
I agree, but I'm also - sad to say - less than surprised to find engineers at a Big Tech firm taking a high-handed, not to mention narrow and ill-informed, approach over the issue and trying to impose their will on a global scale. My worry here is that, Meta being Meta, they carry quite a lot of influence and may actually gain some traction.
EDIT: I'll add a bit more colour here. At the core of our platform we manage a database containing billions of legacy timestamped records (or events, if you prefer), adding more every day. Without even giving it a great deal of thought, I guarantee you that this proposal will cause us more problems than it solves and, should it be implemented, will distract us from more valuable investments of time and effort that would benefit our business. Sure, we can no doubt fix all these problems, but we've got better things to do. I imagine that many other businesses would be similarly affected and would take a similar view.
I'm kinda impressed by the hubris, really. Usually it's emperors, kings, and big multinational governing bodies that try to screw around with the time standard that ordinary people have to live with. Occasionally strident revolutionaries who've already solved the "overthrow and replace the government" part of their problem and aren't content with just beheading people all day.
Says something about how Facebook sees itself, I guess.
Lots of folks care, what are you talking about? Accountants and lawyers the world over EXTREMELY care about keeping the computer's idea of wall-clock time and your idea of time in sync, and if you're a customer faced with the side effects of changing the standard after the fact, you probably care as well.
Let's paint a picture based on actual code I've actually seen in the real world. If you ignore the leap second but keep using UTC, then in about 5 years, UTC will differ from wall-clock time by about 5 seconds. So if, in some software used for, I don't know, maybe billing customers, someone was calculating day boundaries by doing modulo division of UTC by the number of seconds in a day (I've seen it), then in 5 years we've got a 5-second discrepancy in the number of API calls made by customer X when comparing what the software says to what the customer measured. Customers don't like this, accountants and lawyers REALLY don't like this, and we engineers will have the wonderful experience of telling them all
> "this code used to be valid until some boneheaded engineers at Facebook convinced a ton of other engineers to break the agreed upon standard about what it means to measure time in this way, and now things that used to work fine need to be patched because we've got a Y2K EVERY DAY!"
Oops, I guess ignoring wall clock time might be something other human people care about after all.
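To make the billing scenario above concrete, here's a minimal sketch of the kind of day-boundary code described (the 5-second drift figure and the timestamps are purely illustrative assumptions, not anything from the article):

```python
SECONDS_PER_DAY = 24 * 60 * 60

def day_index(epoch_seconds: int) -> int:
    # Day boundary computed by integer division of the epoch count,
    # as in the billing code described above. This silently assumes
    # every day is exactly 86,400 seconds long.
    return epoch_seconds // SECONDS_PER_DAY

# Suppose the "frozen" clock has drifted 5 seconds ahead of wall-clock
# UTC (hypothetical figure). An API call made 3 seconds before the
# customer's wall-clock midnight lands on the *next* day's bill:
drift = 5
wall_clock_ts = 86400 * 19_000 - 3        # 3 s before a day boundary
frozen_ts = wall_clock_ts + drift

assert day_index(wall_clock_ts) != day_index(frozen_ts)
```

Every call made inside that drift window gets billed to the wrong day, which is exactly the discrepancy the accountants would be chasing.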
> I totally don't care about the Earth slowing down.
Neither do I in day to day life. But I do have to care about it when I or members of my team write code, or store and retrieve data to and from a database, or work across multiple timezones, because it can be critically important to unambiguously know whether something happened on one day or the next.
The reality is there aren't any nice, elegant solutions to this problem. Leap seconds aren't a nice solution. Meta's proposal isn't a nice solution. I don't necessarily even think it's worse than leap seconds, but it's certainly not substantially better. The key point is it's a change and one which, in my view, won't deliver enough value for everybody (beyond just Meta) to justify the level of disruption it will certainly cause if implemented.
>Your honor, the nuclear attack on San Francisco happened at 10.59.59 as per UTC-Facebook time and is as such part of WWIII and not a violation of the armistice.
By "UTC-Facebook" time, you of course mean UTC time, the time everybody already uses, and that has no need to be broken every year, two years, or three years, and wouldn't be broken at all if we simply stopped breaking it.
By a foot you of course mean the foot-meter, a measure good enough for everyone and one which will stop breaking metric conversions if we just define three feet to be a meter.
That’s all it is, they ran into an engineering problem and they’re trying to get the world to bend to their will instead of solving the problem because they think it will be easier. Mark’s arrogance is nauseating.
We have three alternative time systems and a big bag of issues with each of them, but you think the extremely mundane argument that we should prefer one bag represents nauseating arrogance because you think that your favorite bag -- a different one -- is obviously correct? Come on. Do better. Be civil.
FB is not making the mundane argument that we should pick one time system over another. They are literally proposing that the world should redefine UTC to be TAI with a permanent fixed offset, which is functionally equivalent to just using TAI.
That is effectively proposing the deletion of the most commonly used time system of the three primary time systems from existence and forcing everybody and all existing systems that use it to convert to what is effectively TAI.
That is not mundane. Mundane is arguing that everybody should use TAI. Arrogant is arguing that we should force everyone to do it by redefining their dependencies under them.
No, it is a change that breaks everybody using UTC correctly (as TAI with an offset to stay synchronized with UT1) in order to fix everybody who did not know what they were doing and used UTC when they actually wanted TAI.
If there was a scheme that fixed only the wrong usages, that would be fine. But, it is frankly absurd that we should even consider breaking carefully designed programs correctly using their dependencies to fix programs incorrectly using their dependencies especially when it is trivial for the wrong usages to be fixed manually.
No, UTC is TAI kept in sync with UT1. Changing UTC to be TAI with a fixed offset is a fundamental breaking change in what it means. Anybody relying on UTC doing what it is designed and advertised to do (keep in sync with UT1) will be broken. The only people who will not be broken are people using UTC incorrectly as TAI. The only reason this is interesting is that basically everybody uses UTC incorrectly as TAI, but that is not a valid excuse to break the programs using it correctly.
People using the wrong dependency should fix their system to use the right dependency. They should not campaign to steal the name and replace it, that is absurd.
Literally nobody depends on any relationship between UTC and overhead sun angle.
The only people who care or need to do not use UTC. They use TAI, and a separate continuous log of fractional seconds.
UTC has one role, and that is Standard worldwide civil time. Telling people who need Standard civil time to use TAI makes everything strictly worse: not only do you then not match most of the world, but you still have to track irregular, unpredictable corrections to be able to sync with everybody else.
Except that standard civil time cares about the overhead sun angle for some reason, that is why we use the day demarcations of UT1 instead of TAI. If we really decided as a society that we really no longer care, then we should switch standard civil time to TAI and do away with UTC entirely, not calcify it as some arbitrary offset from TAI.
> "cares about the overhead sun angle for some reason"
That is what is proposed to be fixed, and what you are arguing against for reasons you don't know about or, apparently, care about.
Switching civil time to TAI would break everything, most of which cannot be fixed. Random breakage is the problem. More breakage would be strictly worse.
Well we could introduce negative leap seconds until they align. The problem (UT1 deviating from UTC by more than one second) would be the same as in this proposal.
> Literally nobody depends on any relationship between UTC and overhead sun angle.
That's just... completely incorrect and totally false? Have you ever even worked for a business? Have you ever read how time libraries are actually written?
It is literally built on the exact assumption that 0 means January 1, 1970 and that right now is (number of seconds in a day) x (number of days since Jan 1 1970) plus the seconds elapsed so far today. If we stop adjusting UTC, then the next time a leap second would have been inserted, UTC will be one second out of step with our wall-clock times, and calling `datetime.now().isoformat()` will give us a timestamp that's 1 second off from the wall time of a user. At one second past midnight on the 20th of the month, your computer will incorrectly be spitting out timestamps saying it's still exactly midnight at the end of the 19th. That's what you might call a major breaking change.
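Here's a small sketch of that assumption and the off-by-one-second failure mode, with a hypothetical one-second lag between the system clock and wall-clock time (the date is made up for illustration):

```python
from datetime import datetime, timedelta, timezone

# The Unix convention the comment describes: a timestamp counts seconds
# since 1970-01-01T00:00:00Z, with no leap seconds in the count.
def from_epoch(seconds: int) -> datetime:
    return datetime(1970, 1, 1, tzinfo=timezone.utc) + timedelta(seconds=seconds)

# One second past a civil midnight...
one_past = datetime(2030, 5, 20, 0, 0, 1, tzinfo=timezone.utc)
ts = int(one_past.timestamp())

# ...but if the system clock lags wall-clock time by one second
# (hypothetical drift), the rendered timestamp still shows the
# midnight boundary instant, not 00:00:01:
shown = from_epoch(ts - 1)
assert shown == datetime(2030, 5, 20, 0, 0, 0, tzinfo=timezone.utc)
```

Any code comparing that rendered timestamp against a day boundary sees the wrong side of midnight.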
Now expand this reasoning far beyond the scope of time keeping.
Big Tech companies and anti-Big-Tech lobbyists massively oversimplify in their pitches to influential people to deregulate or overregulate certain areas. In both cases they end up making poor decisions for the general case, and both end up making the average case worse for everyone except themselves. It's about creating a market where none need exist. Facebook doesn't really need to care about time. It's not remotely important to their business.
I've built and worked on platforms with sub-microsecond measurement requirements and this stuff didn't bother me. This is idle bad money finding work for itself at the expense of everyone else.
Disclosure: I am/was an early investor in facebook in 2012. Mark is turning it all to dirt because he's run out of ideas
If you stop thinking about time as being wrong relative to what is officially correct, and instead see this whole exercise as an error-minimization problem, I think it is far easier to make the case for ending leap seconds than for keeping them.
This isn't just about lines of missing code. This is about forcing subterranean or submerged computers to surface. This is about out of sync clocks across information propagation networks across planets. This is about real lives that are ruined because time stamps didn't quite line up, causing delays, deaths, and needless headaches.
It doesn't need to be this way. We could just accept a minute of the clock being off from "true" midnight, which doesn't even make sense to me given that few people are right at the astronomic point where midnight is "true" midnight for their timezone. Heck, China is one big giant timezone so who is this actually for, really? The people that care about sundials? Most people don't even grow their own food.
We're no longer a sun-driven economy. Well coordinated timekeeping across devices that may not always be able to transfer data is far, far more important. If it's sufficiently wrong by the year 3422 then we'll deal with the fifteen minutes of annoyance then. This is a crazy premature optimization.
> Well coordinated timekeeping across devices that may not always be able to transfer data is far, far more important.
How do you have a well coordinated clock without being able to get four bits [1] per year of leap second data? It's hard to keep within one second of a time standard over 6 months or a year without communication.
[1] bit 0: was there a leap second in the most recent period; bit 1: was it positive or negative; bit 2: will there be a leap second at the end of the current period; bit 3: will it be positive or negative. Bikeshed my fictitious encoding if you like, but it's good enough. Use a whole 8 bits, go wild.
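For what it's worth, that fictitious four-bit encoding is trivial to implement; here's one sketch following the footnote's bit layout exactly (all names are made up, of course):

```python
# Sketch of the commenter's fictitious four-bit leap-second encoding.
# Bits 0-1 describe the most recent period, bits 2-3 the current one.
def encode(recent_leap: bool, recent_positive: bool,
           upcoming_leap: bool, upcoming_positive: bool) -> int:
    return (int(recent_leap)
            | int(recent_positive) << 1
            | int(upcoming_leap) << 2
            | int(upcoming_positive) << 3)

def decode(word: int) -> dict:
    return {
        "recent_leap": bool(word & 0b0001),
        "recent_positive": bool(word & 0b0010),
        "upcoming_leap": bool(word & 0b0100),
        "upcoming_positive": bool(word & 0b1000),
    }

w = encode(True, True, False, False)   # a positive leap just happened
assert decode(w)["recent_leap"] and not decode(w)["upcoming_leap"]
```

The point stands either way: the payload is tiny, so the real constraint is the communication channel, not the data volume.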
Cesium reference clocks can operate with accuracy around 10^-14 (aka 0.01 parts per trillion). In a year, a cesium clock would slip by a few tenths of a microsecond. That said, the whole "submerged computer must surface" thing is a bit of a red herring argument IMO. What use case would you have for needing to keep time in sync within seconds with the outside world, but being unable to communicate with that world? If you're trying to plan simultaneous delayed action across the world, it would suffice to merely be in sync with each other, leap seconds ignored.
The chip-scale atomic clocks were developed to support precise timekeeping for small devices that can’t communicate. One example is undersea sensor networks, where you want to leave the sensors in place for a year or more, and when you return you can correlate the readings from the sensors because you know they were all ticking at the same rate the whole time.
> It's frustrating that programmers want to redefine civil time just because it is "hard".
Yes. Problems with delay time going negative usually come from not using CLOCK_MONOTONIC for delay time. CLOCK_MONOTONIC is usually just the time since system startup. It comes from QNX (which, being hard real time, had to deal with this first), made it into the POSIX spec around 1996, and is now available on all major OSs. But there's still software that uses time of day where CLOCK_MONOTONIC is needed.
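In Python this is the difference between `time.time()` (wall clock, can step backwards) and `time.monotonic()` (backed by a monotonic clock such as CLOCK_MONOTONIC on POSIX systems). A minimal sketch:

```python
import time

# Measuring a delay with the wall clock can go negative if the system
# clock steps backwards (NTP correction, leap second, manual reset).
# A monotonic clock, per POSIX CLOCK_MONOTONIC semantics, cannot.
start = time.monotonic()        # seconds since an arbitrary fixed point
time.sleep(0.01)
elapsed = time.monotonic() - start

# Monotonic time never runs backwards, so the measured delay is >= 0,
# regardless of what the wall clock does in the meantime.
assert elapsed >= 0
```

The fix for most "negative delay" bugs really is that mechanical: swap the clock source, not the time standard.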
Then there's the smoothing approach. This document describes Facebook's smoothing approach, which has a different smoothing period than Google uses.
* Facebook/Meta: "We smear the leap second throughout 17 hours, starting at 00:00:00 UTC based on the time zone data (tzdata) package content." This is puzzling. What does the time zone package have to do with UTC?
* Google: 24-hour linear smear from noon to noon UTC.[1]
* AWS: Follows Google.
* US power grid: Starts at midnight UTC and takes a few hours while all the rotating machinery takes 60 extra turns to catch up.
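The linear smear in Google's style (noon to noon, 24 hours) is simple to state as code; this is only a sketch of the arithmetic, assuming a positive leap second and ignoring the NTP plumbing around it:

```python
# Sketch of a 24-hour linear smear for one positive leap second:
# smeared clocks run slightly slow across the window, spreading the
# extra second evenly from noon before the leap to noon after it.
SMEAR_SECONDS = 24 * 60 * 60   # smear window length

def smear_offset(seconds_into_window: float) -> float:
    """Fraction of the leap second applied at a point in the window."""
    assert 0 <= seconds_into_window <= SMEAR_SECONDS
    return seconds_into_window / SMEAR_SECONDS

assert smear_offset(0) == 0.0                  # window start: no offset
assert smear_offset(SMEAR_SECONDS / 2) == 0.5  # midnight: half applied
assert smear_offset(SMEAR_SECONDS) == 1.0      # window end: full second
```

The different smearing periods mentioned above matter precisely because two systems smearing over different windows disagree with each other (and with unsmeared UTC) for the duration of both windows.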
> What does the time zone package have to do with UTC?
The IANA TZ database includes information about leap seconds, and even supports the concept of "right" time zones in which the leap seconds are counted in the Unix timestamps. (Which violates the Unix spec, and may cause problems with code that assumes it can do math like `1 day = 24*60*60`, but on the other hand, things like DST already make that unsafe.)
It is most likely simply the case that they are using the leap second data from the time zone database as a convenient source of this data.
I think that observation just lends further weight to the argument that the relationship between atomic time and universal time is a dynamic and unpredictable thing, which we need to handle correctly rather than pretending it doesn't exist.
That it is dynamic and unpredictable is exactly why we should not force everybody to track it.
Some people, namely astronomers and orbital mechanics folks, are obliged to care about sidereal time regardless. Making me deal with it too is a pure tax with exactly zero benefit.
>It's frustrating that programmers want to redefine civil time just because it is "hard". This article glosses over the real world problems that detaching from UTC will cause.
Yes, the actual problem exists, and ignoring/discarding reality (i.e. the "science" in computer science) will just cause further problems. If you and your modern stack of code can't handle the leap second, it's simply not production code.
More to the point: it is not a problem that the nuisance of leap seconds even solves. They are supposed to match civil time to astronomical time, but astronomers don't use them. They just make things even more annoying for astronomers, and annoy everyone else, over and over again, for no benefit to anyone.
Astronomy very much relies on the leap seconds. If they ever get abolished it will create lots of headache for all observatories (and hobby astronomers as well), since the telescopes will point more and more incorrectly as UTC drifts away from UT1 (the leap seconds ensure UTC is always within 0.9s of UT1).
To explain a bit further: UT1 tracks the Earth's rotation relative to distant quasars and is thus directly the correct clock/reference to use for pointing telescopes.
However it doesn't advance at a nice and stable constant frequency, but something that slowly changes over time (and can shift by strong earthquakes) and thus we approximate it with UTC, which runs at a nice constant frequency, but needs occasional correction to match up with Earth's rotation.
Because UTC already has no other purpose than to be what everybody is already using. The leap seconds in UTC benefit literally nobody. But being a standard is a purpose.
Changing to TAI means you are different from everybody else, and still have to fool with leap seconds to know what everybody else is using. Worst of all worlds.
(Except Google smearing, which is even worse than that.)
That's literally not true, since we astronomers do use UTC as it is intended (since within 0.9s of the correct time is good enough, but being many seconds off isn't anymore for many applications).
The argument that the legacy systems should maybe be updated is already being discussed elsewhere, so no need to rehash that.
Hardly. For actually observing with a telescope, UT1 is the correct time scale to start the calculation from, since it's directly linked with Earth's rotation (with some complications: you have to calculate local sidereal time and so forth, but this only involves fixed constants); all TAI-derived fixed-offset time scales are not linked with Earth's rotation and thus require constantly updated offsets. For most telescopes, approximating it with UTC gives good enough results pointing-accuracy-wise, so that's what many observatories do. And many smaller and older observatories operate quite a lot of legacy hardware and software that would need to be updated if the current UTC definition were to be changed.
Well, having worked on legacy systems it’s much easier to keep the existing protocol mostly unchanged than migrate the world to a different protocol. Even if the change is as “simple” as subtracting a constant integer everywhere. Just thinking about all the stored timestamps in all databases gives me a headache…
Because societal official/legal time is based on UTC and not on TAI. So the point is to change societal official/legal time, not just to use different time standard.
> If we end leap seconds, it doesn't take long - only until 2028 - until "midnight" is sufficiently far from "the middle of the night" that you will have to consider the legal issues caused by events that happen just before or after 0000 hours.
I'm not sure what you're getting at here. If we stopped introducing leap seconds, then why would the legal world still care about them?
I can believe that a desperate lawyer would argue the semantic distinction between clock-midnight and solar-midnight, but I have trouble believing that this would amount to anything more than one more dumb nit on a pile of dumb nits that the court has to deal with every day.
They can already argue that though, since solar-midnight is not the same as clock-midnight anyway due to timezones. Really, timezones already create this difference for the majority of people, and to a much larger degree than leap seconds likely ever will.
I'm honestly amazed to see so many people agree with this.
Timestamps are exactly what we define them to be. There is no correct and incorrect.
One option is to have a system with arbitrary unpredictable leaps to keep it synchronized to within 1 second of the mean solar time over Greenwich, England. Every computer system that has to deal with time accurately needs a lookup table for leap seconds that is occasionally amended, with only a couple months warning in advance.
Another option is to just let the clock run at a constant rate. In this case only astronomers have to keep track of the difference between solar time and clock time (which they already do anyway).
The fact that the difference will increase to an hour after several hundred years is utterly irrelevant. If people in the future care, they can simply adjust the timezone definitions to compensate, since timezones are already adjusted all the time.
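The lookup table mentioned above is worth seeing in miniature; this sketch uses three real historical entries (TAI-UTC offsets and their effective Unix timestamps), but a production table needs the full list and must be amended whenever IERS announces a new leap second:

```python
# Sketch of a leap-second lookup table: each entry is
# (first Unix timestamp at which the offset applies, TAI - UTC seconds).
# Abbreviated to three real entries for illustration.
LEAP_TABLE = [
    (1341100800, 35),  # 2012-07-01
    (1435708800, 36),  # 2015-07-01
    (1483228800, 37),  # 2017-01-01
]

def tai_minus_utc(unix_ts: int) -> int:
    offset = 34  # value in force just before the first entry shown
    for start, delta in LEAP_TABLE:
        if unix_ts >= start:
            offset = delta
    return offset

assert tai_minus_utc(1500000000) == 37  # mid-2017: TAI leads UTC by 37 s
```

Note the table only runs to the present: for any future timestamp the offset is simply unknowable until IERS decides, which is the crux of the whole thread.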
When the sun is directly overhead it's meant to be 12:00 - IN THEORY!
However as Timezones are pretty wide, most of the time you'll be at least 15 minutes out. Sometimes you'll be out by as much as 3 hours - and you've probably never even noticed!
Telescopes already have to compensate for this (as well as for summer time).
Leap seconds make a shambles of bookkeeping too. What is "2022-07-17T12:00:00" + (60 x 60 x 24 x 365 x 5) seconds? No one knows! And the answer to that question will change depending on when you calculate it and which updates you installed!
So I say ditch the leap second and let it drift. In a few hundred years we could update our timezones if we _really_ want to (timezone changing is actually pretty common, so code should already be handling this edge-case).
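To illustrate why "no one knows": the answer to that sum depends on how many leap seconds IERS announces in the intervening five years, which cannot be known in advance. The sketch below assumes, purely hypothetically, that two positive leap seconds get inserted in the span:

```python
from datetime import datetime, timedelta, timezone

start = datetime(2022, 7, 17, 12, 0, 0, tzinfo=timezone.utc)
elapsed = 60 * 60 * 24 * 365 * 5   # "five years" of elapsed seconds

# Naive answer: pretend every day has exactly 86,400 seconds.
naive = start + timedelta(seconds=elapsed)

# If, hypothetically, two positive leap seconds were inserted in that
# span, the same elapsed duration reaches a civil label two seconds
# earlier than the naive answer:
with_leaps = start + timedelta(seconds=elapsed - 2)

assert naive - with_leaps == timedelta(seconds=2)
```

The naive and leap-aware answers name different instants, and which one is "right" changes every time a new Bulletin C comes out.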
> In about 600 years TI will be ahead of UT1 by half an hour, and in about 1000 years the difference will be a full hour.
That's nothing. Time zones alone already create significantly larger errors. Belgrade and Sevilla share a time zone, but the solar meridian ("noon" on a sundial) is 12:44 in Belgrade and 14:30 in Sevilla. Obviously, the same error is present in the astronomical "middle of the night". This does not, in fact, create "legal issues" for Serbs or Spaniards.
In 600-1000 years, around the time that it would actually matter, we're going to have to reform the time system anyway to account for relativistic drift between the surface of the Earth and human settlements elsewhere in the solar system.
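The Belgrade/Sevilla numbers above can be roughly sanity-checked with back-of-the-envelope longitude arithmetic: the mean sun crosses a meridian 4 minutes later for each degree west of the zone's reference meridian (CEST is UTC+2, reference meridian 30 degrees E). This ignores the equation of time, which adds up to roughly 16 minutes of seasonal wobble, hence the small mismatch with the quoted figures; longitudes here are approximate:

```python
# Back-of-the-envelope clock time of mean solar noon: 4 minutes of
# delay per degree of longitude west of the zone's reference meridian.
def mean_solar_noon_minutes(longitude_deg_east: float,
                            zone_meridian_deg_east: float) -> float:
    """Minutes after 12:00 clock time at which the mean sun culminates."""
    return (zone_meridian_deg_east - longitude_deg_east) * 4.0

belgrade = mean_solar_noon_minutes(20.5, 30.0)   # ~38 min after noon
sevilla = mean_solar_noon_minutes(-6.0, 30.0)    # ~144 min after noon

assert round(belgrade) == 38
assert round(sevilla) == 144
```

So time zones already impose offsets from solar time measured in tens of minutes to hours, dwarfing anything leap-second drift could accumulate on a human timescale.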
There's no need to "detach" from UTC. Just ensure that TAI (which is consistently free of leap seconds) is also supported on an equal status to UTC, for applications where it makes the most sense. Conflating the two would only increase confusion further.
Programmers can already do this if they want to. TAI already exists. But they'd have to still display UTC as civil time to end users and I'm pretty sure they don't want to do that either because it would mean just as much code.
The hard thing about TAI is that it's not properly supported in DBMSs, RFC 3339/ISO 8601, etc. This makes it hard to use. It's actually easier to use MJD represented as a double.
"Make two parallel time systems and allow conversion between one and the other programmatically" reduces spiritedly and unambiguously to "use one time system and care for it programmatically".
By precedent, UTC seems the logical choice for the one time system.
But the whole point of the OP is that UTC has leap seconds, which are hard to manage programmatically - and may even be impossible, wrt. future dates and times. That's literally the one relevant difference between UTC and TAI.
There is no need for UTC to continue inserting leap seconds. When they commit to stopping, everybody can relax: irritation removed.
Telling people to use TAI is telling them to have a different time from everybody else. The whole point of civil time is specifically that other people use it. Using TAI does not free anybody, because anytime you need to interact with outside, you are back in the nightmare.
Exactly. Discontinuing leap seconds is a 99.999999% compatible change.
I've tried to build systems using TAI. They break down because at some point you have to interact with something that doesn't use TAI, and that fully reintroduces all the leap second issues; because a lot of third-party software has leap second handling, so the wheels fall off when you update some component and its embedded list of historical leap seconds changes its behavior; and because sometimes UTC time is all that's available, and without the leap second data you can't back it out to get TAI.
And with leap smearing, the challenges of backing out to TAI have increased substantially.
We live in a world where civil time moves by an hour 2x a year for no good reason.
You FAR overstate the impact on civil society of failing to change it by a second every so often.
Ironically even astronomers, whom leap seconds were originally for, don't benefit, because they need to know the Earth's rotation accurately to subsecond levels.
> By 2055, the "minute" displayed on a clock may be incorrect, which again may cause issues with legal timestamps.
I'm not following here. What defines "legal timestamps" in our current system? I'm unaware of any laws in the US that uses the actual position of the sun to determine the time.
"Noon" when the sun is at the highest point, can vary over an hour across a timezone.
Way more than one hour. Even without taking China into account, A Coruña in Spain and Kosice in Slovakia are in the same time zone but they are 30 degrees (2 hours) apart.
A birthday is a legal timestamp. A car crash is a legal timestamp. When the time is off by a minute, these events can’t be catalogued correctly any more.
Shifting the timezone by a couple of seconds does not prevent or hinder cataloguing events in any way whatsoever, and certainly not more than switching to daylight saving time does, or the mere existence of timezones, which may easily be half an hour or more off from solar time. The offsets we use for time are effectively arbitrary already, and adjusting that arbitrary choice by a few seconds is not a fundamental difference. Event timestamps already map to different days depending on the timezone; you do need to know which timezone your clock is using, of course, but you already need to do that.
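The "same instant, different day" point is easy to demonstrate; this sketch uses fixed offsets (UTC+9 and UTC-5, chosen arbitrarily) to avoid depending on any tz database:

```python
from datetime import datetime, timezone, timedelta

# One instant, two civil dates: the same timestamp already falls on
# different calendar days depending on the (arbitrary) zone offset.
instant = datetime(2022, 7, 17, 23, 30, 0, tzinfo=timezone.utc)

east = instant.astimezone(timezone(timedelta(hours=9)))    # e.g. UTC+9
west = instant.astimezone(timezone(timedelta(hours=-5)))   # e.g. UTC-5

assert east.day == 18 and west.day == 17
```

Cataloguing already has to carry the zone alongside the timestamp; a few seconds of offset drift changes nothing structural about that.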
For people born just around midnight, especially around new years eve, a few seconds could impact their DOB by a whole year. This could affect everything from university applications to boating licenses to social security.
Some countries have boating license laws that are different depending on whether your DOB year is >= 1980, as an example for this type of "grandfathering cutoff".
You seem to think that what is being asked for is to retroactively remove leap seconds from UTC? That is not the case; all that is being called for is to stop adding more leap seconds.
Both can easily be placed on the same monotonic timeline. That actually makes things simple: you don't end up having 31/12/1972 23:59:60 and wondering why there's a 60 there...
“Civil time” is also a construction that is flexible in many ways, so an influential group redefining it isn’t out of the norm. To note, timezones were introduced for railway purposes, and some countries play around with them a lot.
For “midnight” being far from “the middle of the night”, that’s already a reality for many Chinese living far enough from Beijing, or god forbid regions where “night” doesn’t mean much for half of the year.
For all intents and purposes, if a formal definition of time isn’t practical people come up with their own ways.
> it doesn't take long - only until 2028 - until "midnight" is sufficiently far from "the middle of the night"
Honestly from my perspective, 3am is the middle of the night (night-morning-afternoon-evening starts at 0-6-12-18 for me) and somewhere between 4 and 5 most people are probably asleep and the date change should occur. I can't count how often I've heard people clarify what 'tomorrow' means when the word is spoken after "midnight" but before going to sleep.
But yeah gotta pick something for the date change, it won't be worth the cost of change now. If we do end up ever switching to something like decimal time, this should be on the todo list though.
And I know "midnight" is historically supposed to be about the sun being the furthest from its zenith rather than in the middle between when you go to sleep and get up, however that occurs somewhere around 1am here (01:41 at its extreme, from July 17 till August 5th). If that's not enough to warrant a redefinition, 27 seconds accumulated since we started counting leap seconds are also not enough to warrant an update yet (following Facebook's logic here).
* "Most telescope pointing systems fail" (by 2027) (with 5s deviation from earth rotation). Pointing systems cannot blindly rely on UTC anyway, since (a) even with leap seconds UTC is up to 1 second off earth's rotation, and (b) pointing a telescope depends on where the telescope is on earth, so some offset must be added to UTC by some human.
* Hypothesized legal issues... give me a break.
It would be much less trouble for humanity to deal with this once every 100 years or so.
These "problems" are trivial. The day changes at midnight which is 12:00 AM by the clock. There is no ambiguity. Midnight is not literally the middle of the night. The minute on the clock will be correct by definition, nothing will change. Sundials are already wrong. You'll need to try a lot harder to convince me that this is a bad idea.
All these arguments based on sun position make no sense in a world where people already live in places where the sun literally never sets or never rises for months, and people already live in time zones offset many, many hours from "correct" time. The sky doesn't fall!
I don't see how you run into legal problems. The break from one day to the next still occurs at a well defined time, 23:59:59 + 1 second, or 00:00:00. Midnight isn't the middle of the night (or noon exactly at solar zenith), except on 15deg meridians anyway. What will happen is that over time, those "golden" meridians will shift slightly. The only people who will notice are those that are using time for celestial navigation. Terrestrial navigation, which is almost entirely done with GPS these days, won't be affected at all (GPS already doesn't use leap seconds). And, yes, sundials will gradually get out of sync, and will eventually have to be rotated on their axes to be right.
> only until 2028 - until "midnight" is sufficiently far from "the middle of the night" that you will have to consider the legal issues caused by events that happen just before or after 0000 hours.
I can't follow your logic here. In any relevant context midnight has a definition, typically midnight in the applicable timezone. Eliminating leap seconds would make the instant midnight occurs less ambiguous in 2028, because precise timing with leap seconds is strictly harder than without. (And one can independently realize a time that closely follows TAI, but one cannot independently realize UT1 without a VLBI radio telescope array, and one can't realize UT1-TAI without a data feed, because the decisions are subjective.)
This isn't just a question of 'some lines of code'. Leap seconds cause widespread disruptions even when they don't occur; they cause security vulnerabilities (and slower and less secure systems, because they make synchronization unreliable). People are widely deploying "leap-smeared" NTP servers to try to prevent some of the worst synchronization faults, but doing so makes it impractical to back out leap seconds to derive TAI (or a more accurate TT) from the system's UTC, particularly because systems don't know whether they're leap-smeared or not (and different smear sources use different smearing parameters).
Please consider that none of this actually matters if we ditched UTC for TAI. For one, time zones still exist and local solar time is already decoupled from clock time.
This article details some of the problems: https://www.ucolick.org/~sla/leapsecs/dutc.html