At least leap seconds will be eliminated by 2035, so there is that. Computers were never good at dealing with them. Keeping computer clocks in UTC instead of TAI and rewinding them for leap seconds was always insane. The same goes for UNIX time and NTP.
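POSIX time's blindness to leap seconds is easy to demonstrate: every UTC day is defined as exactly 86400 seconds, so the second inserted at 2016-12-31T23:59:60Z simply vanishes from the arithmetic. A minimal sketch using only the Python standard library:

```python
import calendar

# POSIX timestamps assume every UTC day has exactly 86400 seconds,
# so the leap second 2016-12-31T23:59:60Z does not exist to them.
before = calendar.timegm((2016, 12, 31, 23, 59, 59))  # last ordinary second of 2016
after = calendar.timegm((2017, 1, 1, 0, 0, 0))        # first second of 2017

# POSIX arithmetic says one second elapsed between these instants;
# in reality two SI seconds did, because 23:59:60 sat between them.
print(after - before)  # → 1
```

This is exactly the "rewinding" problem: a system clock that tries to track real elapsed seconds has to repeat or smear a second to stay on the POSIX timescale.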
No. UTC with leap seconds isn't "reality" except in the sense that some of those awful scripted "reality TV" shows are reality. It's a confection, and it's annoying so there's no real argument for it beyond "That's what we've always done" which is a pretty dubious rationale for anything, but even more so when it's been like that for less than a normal human lifespan.
Leap seconds won't even be like the automobile, let alone newspapers or hotdogs, they're a fleeting idea we thought might be good, now we realise it's not good, so bye bye.
There are two "realities" here. TAI is the monotonically increasing "Seconds are the same length and happen in order" clock time that it turns out well suits a lot of human enterprises. The exact details have been worked out pretty well. A Day in TAI is 24 hours of 60 minutes of 60 SI seconds, always, forever.
UT1 is a descendant of "Solar time", now based on the Earth's rotational angle. A Day in UT1 is the full rotation; how long that is in SI seconds varies very slightly and arbitrarily.
UTC wants to have the steady monotonicity of TAI while matching UT1's days exactly. That is not possible, so "leap seconds" are conjured to kludge it. This kludge is what's going away; it's not part of "reality", it's an attempt to fit a square peg into a round hole, and we're giving up on the kludge.
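The kludge is visible in the numbers: TAI runs ahead of UTC by the accumulated leap seconds, an offset that has been 37 seconds since the last insertion at the end of 2016. A toy conversion, hard-coding that offset (a real implementation would consult the full IERS leap-second table):

```python
# TAI - UTC = 37 s for any instant after 2017-01-01; earlier instants
# would need a lookup in the historical leap-second table (not shown).
TAI_MINUS_UTC = 37

def utc_to_tai(posix_utc: int) -> int:
    """Shift a POSIX (UTC-based) timestamp onto the TAI scale.
    Valid only for timestamps after 2017-01-01."""
    return posix_utc + TAI_MINUS_UTC

print(utc_to_tai(1_600_000_000))  # → 1600000037
```

The point of dropping leap seconds is that this offset finally becomes a constant instead of a table that grows unpredictably.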
It has been a long time since humans worked purely from direct observation of "solar time" or the Earth's rotation. Once the clock was invented, millennia ago, that was out, and humans were measuring (albeit not very precisely at first) what we now call TAI. Giving up leap seconds is us finally accepting that while UT1 is astronomically interesting, it's not actually a sensible basis for day to day living.
Weeks aren't reality. Timezones aren't reality. Julian/Gregorian calendar months aren't reality. But they are essential.
Calendars, centralised time, and timezones drive coordinated human socialisation and commerce, increasing the wealth of all nations.
Countries have switched timezones so that their "work day" matches another more prosperous country's "work day"
Humans want, and have always wanted, a compromise between their activities, and the unalterable reality of nature. We have "daylight savings time" because if we say "work is between 9AM and 5PM", that allows for the synchronised commerce - you can phone another business at 9AM because you both start work at 9AM, and 9AM = 9AM for both of you. But the reality is that in the winter months, everyone is going to work or leaving work in darkness, with no daylight time for socialisation. So you shift your entire country's (or state's) offset from an absolute clock, to turn "9AM to 5PM" into "8AM to 4PM" without actually stating that, because if businesses started varying their hours, they'd desync completely and you'd never get business done again.
It hasn't been that long since we accepted centralised time, only really since the invention of railways, and especially since the invention of the telegraph. Prior to that, local solar time ruled. Bram Stoker was particularly upset that Ireland was 25 minutes out of alignment with Great Britain (Dublin Mean Time vs Greenwich Mean Time); in his view Ireland was missing out on a lot of trade [0]
I don't think we're going to move to a time source that will slowly desync us from nature. If we liked being desynced from nature, we'd have ditched DST by now - we haven't. And we also haven't gone entirely back to nature and made it 9:30 in Dublin when it's 9:55 in London. I think we're going to remain on this middle path - fudging the mixture of atomic time and alignment with solar time - for the rest of our days on this planet.
> Weeks aren't reality. Timezones aren't reality. Julian/Gregorian calendar months aren't reality. But they are essential.
And we ignore all of those in computer "system" clocks, and track them in a secondary local time layer, quite rightly. We should have handled leap seconds the same way.
> while UT1 is astronomically interesting, it's not actually a sensible basis for day to day living.
Hard disagree!
Allowing one's clocks to gradually drift from the behaviour of the Earth (which regulates our biological lives) is a mess. One can compare the eventual correction that will be required to the shift from the Julian to the Gregorian calendar and the problems that caused.
The drift (a second every ~1.5 years) is so small the accumulation is irrelevant to biological processes. 2000 years from now it does not matter that solar time has drifted ~an hour. Beyond "what people millennia ago used to do at 7 we do when the clock says 8" not being a problem (assuming we even live similar lives), I'll be god damned if we can stop ourselves from changing zones and whether or not we'll observe DST this year for the next 20 years, let alone what else we'll muck up in the next 2000.
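As a sanity check on the scale of that drift: 27 leap seconds were inserted between 1972 and 2016. A naive linear extrapolation of that rate looks like this (Earth's slowing rotation actually makes the accumulation grow faster than linearly over millennia, which is where "~an hour over two thousand years" estimates come from):

```python
# Historical record: 27 leap seconds were inserted between 1972 and 2016.
leap_seconds = 27
span_years = 2016 - 1972                 # 44 years of observations
rate = leap_seconds / span_years         # ~0.61 s of drift per year

drift_2000y = rate * 2000                # naive linear extrapolation
print(round(drift_2000y / 60))           # → 20 (minutes of drift)
```

Either way, the order of magnitude is minutes to an hour per two millennia, i.e. well under a minute per human lifetime.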
TAI will still be there and needed by folks. UTC is stopping adding leap seconds in the future, not reverting back as if it never had any.
> The drift (a second every ~1.5 years) is so small the accumulation is irrelevant to biological processes.
It is irrelevant until it is not: just ask Julius Caesar and Pope Gregory XIII.
> Beyond "what people millennia ago used to do at 7 we do when the clock says 8" not being a problem […]
The difference of an hour does make a difference, as sleep researchers and chronobiologists keep pointing out every time a discussion on DST comes up (it is not just about the sudden time jump, but also about the actual time):
If a difference of an hour that's been there your entire life is noticeable, we should be able to observe that as a discontinuity of quality of life at time zone boundaries.
> It is irrelevant until it is not: just ask Julius Caesar and Pope Gregory XIII.
You're referring to needing to reform the entire calendar, but this doesn't make sense in the context of my explaining that the calendar itself never needs to be reformed in the first place. That the sun was a few degrees different in the sky for Julius Caesar when a modern clock reads 3 PM in his time is not inherently a problem, as Julius Caesar wouldn't have the same norms as you do in terms of what wall time is acceptable for waking/sleeping/eating/working/etc. E.g. 20,000 years from now, if a clock reads 3 AM during midday it's not a problem, as society will have had 20,000 years to adjust 12 hours, vs your referenced calendar reforms that changed everything overnight.
> The difference of an hour does make a difference, as sleep researchers and chronobiologists keep pointing out every time a discussion on DST comes up (it is not just about the sudden time jump, but also about the actual time):
Again, you're missing the forest for the trees - though in two different ways here. The first is that it's a minute over someone's (long) life, so whatever impact we feel when we change time by an hour twice a year isn't relevant. The second is that society, over 2,000 years, does not need to change timekeeping itself to wake up when the clock says 8 instead of 7. If the change were to happen over a short period then sure, it's not really feasible for society to move up what wall time their breakfast is four times a year or something, but an hour a millennium isn't even something society needs to consciously worry about.
My point is not that we don't have a biological clock, it's that the effect of leap seconds on a human's biological clock is too small to matter. One of your sources already says it's 15-20 minutes misaligned per day, so why are you using it to argue that 1 additional minute over your entire life is impactful? On the long-term societal scale, my point is that society won't always agree we should wake up when the wall clock says 7 AM. That norm shifting ~an hour over 2,000 years is not a relevant concern for changing the way we keep time.
> The difference of an hour does make a difference, as sleep researchers and chronobiologists keep pointing out every time a discussion on DST comes up (it is not just about the sudden time jump, but also about the actual time)
Do any of these sleep researchers have anything to say about France using CET instead of GMT even though CET is about an hour off from their natural time? Or Spain being on CET despite being almost another hour off from France?
Furthermore, how long is it going to take for the accumulated leap seconds to add up to a full hour of time? My understanding is that it’s on the order of centuries. If humanity can maintain an industrialized civilization that’s capable of keeping track of leap seconds for that long, most of us won’t even be living on the earth by the time it makes any difference.
> Human rhythmicity is subjected to the workings of the internal circadian clock, but it is also influenced by environmental time (mainly the light-dark cycle) and social timing imposed by the official time at our location, as well as by our work schedule. When a misalignment among these times occurs, an internal order impairment appears, which affects our health. Western Spain (GMT+1/+2) and Portugal (GMT0/+1) share similar longitudes (sun time) but have different official times, and thus they provide a “natural experiment” to assess how this discrepancy affects circadian rhythmicity and sleep in people with no work duties (>65 years). Although sleep duration was not affected, the circadian rhythms in the Portuguese were more robust, especially during weekdays, while higher desynchronization tended to occur in the Spaniards. Once official time was corrected by GMT0, meals took place later in Spain than in Portugal, especially as the day progressed, indicating the possible deleterious effect on circadian system robustness when official time is misaligned with its corresponding geographical time zone.
> A leap minute every few centuries makes much more sense than a leap second every few years. We have until our great-great-grandchildren to prepare.
People can't get the regular changes of DST and February 29 correct, and you want them to get a one-off change right?
I'd rather 'inflict' change (semi-)regularly so people at least try to get things right and get some practice, as opposed to a Hail Mary pass/change in some distance future.
We already have rules that are much rarer and more impactful than leap minutes would be. For example, in 2000 an entire extra day, Feb 29th, was inserted in a century year, an event that only happens once every 400 years! It was a complete non-event and only time nerds remember that it happened.
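The century rule being referred to is the Gregorian calendar's: every 4th year is a leap year, except century years, except every 400th year. 2000 hit the rarest branch:

```python
def is_leap_year(year: int) -> bool:
    """Gregorian rule: divisible by 4, except centuries,
    except centuries divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(is_leap_year(2000))  # → True: the once-every-400-years century leap day
print(is_leap_year(1900))  # → False: ordinary century year, no Feb 29
print(is_leap_year(2024))  # → True: the regular 4-year rule
```

Software survived the once-in-400-years exception in 2000 largely because the rule is fixed and testable in advance, which is arguably the case a leap minute would need to match.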
In the case of a leap minute, the worst that can happen is that your clock is 1 minute out every couple of centuries. Doesn’t seem so bad and certainly much better than dealing with leap seconds every few years.
We don’t need to constantly rehearse for such an event, we can just do nothing and die without worrying about it.
But when that leap minute does eventually occur, it's going to cause havoc in all the systems that don't handle it. Which, let's face it, there are going to be a lot of. Either because they were never designed for it or else the relevant code paths were never actually tested.
You could broadcast a leap minute with a decade to prepare for it and most software would still be developed, used, and die off in the 90 years in between. It's better for 15% of relevant software to have to worry about something so consequentially minor than 100%.
That didn't work so well for Y2K issues. Just this month there was an article about a woman who keeps getting ID'd as being 1 year old instead of 101 years old because of a Y2K fix.
If we are still using the same calendar in 60000 years, and if the Earth's Sun is still so important that we have to adjust our watches, we can start to talk about a leap day.
Granted, this is some 100 times longer than our calendar has lived. And some 10 times longer than any human calendar. So, I suggest we postpone the issue a bit.
Indeed, leap day, leap hour, leap minute, etc is a distinction without a difference because either way the solution is the same: do absolutely nothing and let future humanity decide how much drift it can tolerate.
It is not just about programmers, it is just that the Earth is not considered an accurate enough clock anymore. We can do better with atomic clocks.
For day-to-day operations, we don't need the exact position of the sun in the sky, in fact, with time zones, we can be off by hours and still do business. Using the sun would be inconvenient anyways, as each location would need its own time. So for that, a few seconds is completely negligible.
Not many people care about the precise position of the sun in the sky as seen from an arbitrary location, so there is no real need to skew our overall more useful atomic clocks for this.
Maybe, in a distant future, people will need to update their time zones to catch up with what would be a noticeable shift, big deal...
Human society screwed with biology in a big way since the industrial revolution. A few seconds, even minutes don't matter when business hours can disrupt sleep patterns of a significant part of the population by hours.
Terrestrial time doesn't serve society very well, as it is not very precise; it doesn't serve biology well either, as it is based on some arbitrary meridian instead of your actual location. Biology-compatible times would be in relation to sunrise and sunset, as is sometimes done.
Leap seconds were accepted only because they were better than the previous solution (the "rubber second") and not enough people complained. It's just that we finally got enough complaints that they are being abolished. AND the end result is TAI being universally accepted, with a fixed offset for historical reasons though :-p