I feel like they are making a massive mistake by allowing UTC offset in the ISO8601 part of the timestamp. They try to extend the existing format in a seemingly backwards-compatible way, but ignore the fact that the existing format is fundamentally broken.
A stamp like "2022-07-08T00:14:07+01:00[Europe/Paris]" should never be a thing. It's invalid. Downstream code can treat it as either "+01:00" or "Europe/Paris" (which at that point is on +2:00), so you are begging for data corruption. The whole concept of "inconsistent timestamps" should not exist!
The "2022-07-08T00:14:07Z[Europe/London]" for "convert this UTC timestamp to whatever that happens to be in local time" part is probably a mistake as well, because it's going to seduce less careful developers to store timestamps as UTC. If it's a fixed point in time which can be stored as UTC, it shouldn't need a timezone: just let the frontend convert it to whatever timezone the user prefers. If it's a local timestamp, it shouldn't be stored in UTC in the first place.
It would make a lot more sense to just stick to "2022-07-08T00:14:07[Europe/Paris]" local time + named timezone format, without a numeric UTC offset. If your application doesn't support the timezone tag, it cannot parse it and it cannot meaningfully make use of it. End of story. Want to store a fixed point in time, for something like logging? Stick with "2022-07-08T00:14:07Z".
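For completeness, here is what handling the two formats I would keep looks like with just the stdlib (the bracket splitting is done by hand here; none of this is RFC 9557-specific):

    from datetime import datetime, timezone
    from zoneinfo import ZoneInfo

    def parse_local_with_zone(stamp: str) -> datetime:
        """Parse e.g. '2022-07-08T00:14:07[Europe/Paris]': wall time + named zone."""
        local_part, _, zone_part = stamp.partition("[")
        return datetime.fromisoformat(local_part).replace(
            tzinfo=ZoneInfo(zone_part.rstrip("]")))

    meeting = parse_local_with_zone("2022-07-08T00:14:07[Europe/Paris]")
    print(meeting.utcoffset())   # 2:00:00, derived from the tz database, not stored

    log_line = datetime.fromisoformat("2022-07-08T00:14:07+00:00")  # the "Z" case
    print(log_line.astimezone(timezone.utc))

The offset falls out of the zone name and the date; there is nothing in the stamp that can contradict it.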
> A stamp like "2022-07-08T00:14:07+01:00[Europe/Paris]" should never be a thing. It's invalid. Downstream code can treat it as either "+01:00" or "Europe/Paris" (which at that point is on +2:00), so you are begging for data corruption. The whole concept of "inconsistent timestamps" should not exist!
The spec states you would ignore the [Europe/Paris] in that case, no? Not either or?
Also, the specification states you can opt in to the behavior that you are asking for by marking it as critical?
No, the spec states that you MAY resolve the inconsistency. For non-critical tags it would be acceptable to interpret it as either "2022-07-08T00:14:07+01:00", "2022-07-08T00:14:07[Europe/Paris]", throw an error, or ask the user for clarification.
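In stdlib Python terms (the policy names are mine, not the RFC's), "MAY resolve the inconsistency" amounts to something like this:

    from datetime import datetime, timezone, timedelta
    from zoneinfo import ZoneInfo

    def resolve(wall: datetime, offset_hours: int, zone: str, policy: str) -> datetime:
        by_offset = wall.replace(tzinfo=timezone(timedelta(hours=offset_hours)))
        by_zone = wall.replace(tzinfo=ZoneInfo(zone))
        if by_offset.utcoffset() == by_zone.utcoffset():
            return by_zone                    # consistent, nothing to resolve
        if policy == "use-offset":
            return by_offset                  # read it as 2022-07-08T00:14:07+01:00
        if policy == "use-zone":
            return by_zone                    # read it as 2022-07-08T00:14:07[Europe/Paris]
        raise ValueError("offset and time zone disagree")  # or ask the user

    wall = datetime(2022, 7, 8, 0, 14, 7)
    print(resolve(wall, 1, "Europe/Paris", "use-offset").astimezone(timezone.utc))  # 2022-07-07 23:14:07+00:00
    print(resolve(wall, 1, "Europe/Paris", "use-zone").astimezone(timezone.utc))    # 2022-07-07 22:14:07+00:00

Two conforming implementations can pick different branches and both be "correct".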
Throwing an error is the only thing you can reasonably do because you're dealing with invalid data. Yes, there is the "critical" flag, but that's not helping much. There should not be an opt-in "don't corrupt my data" flag because there aren't any scenarios where you don't want it to be set.
Having the option with an opt-in flag should never happen. Having it with a mandatory critical flag is a bad idea, but not horrible. But if you're designing a new standard, why not just avoid the whole problem altogether by simply not introducing the whole concept of inconsistent timestamps in the first place?
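For reference, the critical flag is just a "!" right after the opening bracket on the wire. A rough sketch of reading it (the regex is simplified from the RFC's grammar, and the handling policy is my own):

    import re

    SUFFIX = re.compile(r"\[(!?)([^\]]+)\]")

    def suffixes(stamp: str):
        """Yield (critical, body) for each bracketed suffix."""
        for bang, body in SUFFIX.findall(stamp):
            yield bool(bang), body

    for critical, body in suffixes("2022-07-08T00:14:07+01:00[Europe/Paris][!u-ca=hebrew]"):
        print(critical, body)   # False Europe/Paris, then True u-ca=hebrew

Without the "!", an implementation is free to silently drop a suffix it does not understand; with it, the whole stamp has to be rejected instead. My point above is that for the offset/zone conflict there is no sensible reading where you would not want that rejection.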
This is a valiant attempt at resolving an extraordinarily difficult problem, but it falls far short. Just as an example, I am very happy to see that they address alternative calendars such as the Arabic and Hebrew calendars. But they only address representing specific points in time as Gregorian dates, and then suggest to alternate-calendar-aware applications that these specific Gregorian dates should be shown in that alternate calendar. This does not enable, e.g., setting somebody's birthday as the 18th of Tishrei and having the calendar show an alert every year. Nor does it allow setting a date on, e.g., the 25th of Shaaban, considering that we don't yet know on which day the 25th of Shaaban will fall in the Gregorian calendar.
The only real problems that this RFC solves are problems that have already been solved in extensions to ISO 8601, for example in popular Java libraries.
> Nor does it allow setting a date on e.g. 25th of Shaaban, considering that we don't yet know on which day the 25th of Shaaban will fall on the Gregorian calendar.
What's an example this would matter for? In my head, you wouldn't/couldn't store "a currently semi-arbitrary undefined time between X and Y" as a timestamp, because it isn't one, and can't be used as one. I don't think timestamps should handle Easter either, it feels like two very different domains.
This would matter for literally anything that you would schedule time for - meetings, doctor appointments, school activities, etc. Not all people use the Gregorian calendar in their daily lives.
The time of the meeting is not arbitrary - it just corresponds to a date in a different calendar. Some calendars, e.g. the calendar used in Saudi Arabia, depend on observation of the new moon.
Before that observation (and importing that event into your application!), is it not semi-arbitrary, as far as the code is concerned? A fully defined point in the future (currently) corresponds 1:1 to a unix time, or a timestamp from this RFC, almost by definition.
If you can't calculate a delta from $now, or "are $X and $Y the same moment in time", I don't think "timestamps" are the tool. Seems beyond the scope of RFCs like this, at least to me.
Without guaranteed internet access, how would you set an alarm for "this day next year"? Timezones changing already cause problems there, but those were at least correct at one point - not a "quantum timestamp" of sorts.
You have the same problem in Gregorian calendar, no?
How many leap seconds will occur between now and when Europa Clipper intercepts Jupiter? That's actually important to know, e.g. for scheduling radio telescope time.
And let's not forget that some Pacific island nations have actually skipped entire days just to move to the other side of the date line.
That's what the end of my last comment was trying to touch on - timestamps are always imperfect for certain edge cases, but I would say that scheduling radio telescope time probably doesn't meaningfully benefit from trying to latch itself on to this RFC either, in that case?
The problem of "All future dates beyond the current year are currently unknown" is unsolvable in a very different way to "cultures and systems might change in the future".
(Though, wouldn't you be fine in your example? "X time after TS" wouldn't change, right? If you need atomic time anyways.)
I meant that "Cultures and systems might change in the future" is the current/proposed state, and "All future dates beyond the current year are currently unknown" is unreasonable.
1.) Add an alternate way of specifying the time zone, which is possibly inconsistent with the UTC offset already specified in the time stamp, so there are multiple ways this might or might not be handled. Roughly 1/3 of the RFC seems to deal with that.
As the RFC tries to avoid saying: Do not use RFC 9557/IXDTFs under any circumstance for any kind of information which will be used for security purposes; implementations will disagree on what timestamp an IXDTF represents.
2.) Saying which calendar the time stamp actually refers to.
This does not provide a way to fix the "I'm actually referring to a fixed calendaric event and if the time zone changes, I don't want that event to shift" problem.
Adding a human readable timezone string in addition to the existing numerical offset is just creating opportunities for inconsistencies between the numerical and string values. Now there is extra special syntax for whether the application MUST act on detected inconsistencies, or whether it's safe to ignore it. Better to avoid defining an inconsistent format in the first place.
Projecting the timestamp into a specific calendar seems more like a localization problem than something that should be specified in the timestamp itself.
Adding support to bundle custom metadata key:value pairs into timestamp strings also seems misguided? Let timestamps be timestamps; if you need more, use a proper type instead of passing around timestamp strings.
edit: I can see the reasoning for the proposal, but I think it just doesn't get there. On the one hand it adds some confusing complexity to what is otherwise a very simple format to deal with. On the other, it's clearly not going to be sufficient for developers of complex non-Gregorian/multi-calendar applications.
Can anybody explain why we shouldn't just do what Unix and GPS do: use an agreed-upon epoch, and let all the front ends do the conversions?
I genuinely dread human-readable timestamps inside backend code and always use Unix time. Even in the front end, adding the whole human-readable timestamp takes up almost 60 characters in the logs.
I often look at timestamps in human format and wonder what time zone they are in, and whether the computer even has a synced clock. A week ago I debugged a 2FA issue where one of the parties involved did not have NTP configured and its clock was off by some random 20 minutes, so tokens were issued already expired.
It is the 8th of December 2024. I schedule a meeting for the 1st of August 2025 at 09:00 local time.
Your calendar app looks up my timezone. I live in Europe, so on the 1st of August 2025 I'll be observing summer time, so I'll be using CEST - which is UTC+02:00. You convert it to the Unix timestamp 1754031600 and store that.
It is February 2025. The EU passes legislation to abolish summer time.
It is March 2025. Your app gets a timezone file update.
It is the 1st of August 2025. Your app retrieves the timestamp for the meeting, which occurs at 1754031600. You convert it to my local time. I live in Europe, so I'll be using CET, which is UTC+01:00. You convert it to the 1st of August 2025, 08:00 local time.
I am an hour early for my meeting. It is your app's fault. All timestamps converted from CEST are incorrect. You didn't store the original timezone, so you have no way to correct it. You get thousands of angry emails, and everyone abandons your app.
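A minimal sketch of the alternative, with Europe/Paris as a stand-in zone: persist the wall time and the zone I gave you, and resolve to an instant only when you need one, against whatever tz rules are installed at that moment.

    from datetime import datetime
    from zoneinfo import ZoneInfo

    stored_wall = "2025-08-01T09:00:00"   # what I asked for
    stored_zone = "Europe/Paris"          # where I asked for it

    def instant_for(wall: str, zone: str) -> datetime:
        # Resolved against today's tz database; if the EU abolishes summer time
        # and the tzdata on the machine updates, this result moves with it.
        return datetime.fromisoformat(wall).replace(tzinfo=ZoneInfo(zone))

    print(instant_for(stored_wall, stored_zone).timestamp())  # 1754031600.0 under current rules

The 09:00 can no longer drift, because 09:00 is literally what is stored.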
As far as I can tell, the parent is asking the even simpler version of this question, which is "why not just store everything as a Unix timestamp", so even just the obvious example suffices:
I set a 9:00am alarm and then move to a new time zone.
If they meant "why not store everything as a Unix timestamp after pencil-erasing the time zone and replacing it with +0000", then you'd need the more complex example, but then the reference to GPS wouldn't make sense.
Or, even simpler: you ask for a monthly total, and the backend has no idea what "monthly" means, because the month started/ended at an undetermined local hour that was discarded when converting to "UTC".
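A two-line illustration of what gets lost (the zone name is just an example):

    from datetime import datetime, timezone
    from zoneinfo import ZoneInfo

    event = datetime(2025, 7, 31, 23, 30, tzinfo=timezone.utc)
    print(event.strftime("%Y-%m"))                                        # 2025-07
    print(event.astimezone(ZoneInfo("Europe/Paris")).strftime("%Y-%m"))   # 2025-08

Same event, two different "months"; without knowing which local zone "monthly" refers to, the backend cannot pick the right bucket.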
It is somewhat ironic that the date reference in use throughout that page is of the form "April 1987".
I greatly respect the effort to make a well-thought-out timestamp that balances readability and cultural neutrality and versatility across contexts and eras. I try to use at least ISO 8601 timestamps when I can.
It sometimes feels like a losing battle though. Even if a luser can roughly guess what
1996-12-19T16:39:57-08:00[America/Los_Angeles]
means, it's never going to make it into the UI. The trend is for "About two hours ago" or "Last week". That's fine for the Fuzzy Clock in your task bar maybe, but precise timing is as important as it is unpopular.
I don't know what the answer is here. Maybe a "time reference" object that your interface can interpret dynamically to your preferences and needs?
> Even if a luser can roughly guess what $ts means, it's never going to make it into the UI.
It's not intended to. It's designed for standardized machine-to-machine communication, which is currently a mostly unsolved problem for zoned timestamps.
Human readability does not mean end user readability. It's good if humans can read it so that programmers/admins can read it, and also end users if they happen to be exposed to it. But for display to end users, it should always be formatted in a localized, naturally readable manner with no concern for machine parsing (which is what the second quote is referring to).
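In code the split looks something like this (the locale name is only an example, and assumes that locale is installed on the machine):

    import locale
    from datetime import datetime
    from zoneinfo import ZoneInfo

    t = datetime(1996, 12, 19, 16, 39, 57, tzinfo=ZoneInfo("America/Los_Angeles"))

    # What goes over the wire / into logs:
    print(t.isoformat() + "[America/Los_Angeles]")
    # -> 1996-12-19T16:39:57-08:00[America/Los_Angeles]

    # What the end user sees, localized:
    locale.setlocale(locale.LC_TIME, "de_DE.UTF-8")
    print(t.strftime("%c"))   # e.g. "Do 19 Dez 1996 16:39:57"

Both come from the same underlying value; only the machine form needs to be standardized.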
A stamp like "2022-07-08T00:14:07+01:00[Europe/Paris]" should never be a thing. It's invalid. Downstream code can treat it as either "+01:00" or "Europe/Paris" (which at that point is on +2:00), so you are begging for data corruption. The whole concept of "inconsistent timestamps" should not exist!
The "2022-07-08T00:14:07Z[Europe/London]" for "convert this UTC timestamp to whatever that happens to be in local time" part is probably a mistake as well, because it's going to seduce less careful developers to store timestamps as UTC. If it's a fixed point in time which can be stored as UTC, it shouldn't need a timezone: just let the frontend convert it to whatever timezone the user prefers. If it's a local timestamp, it shouldn't be stored in UTC in the first place.
It would make a lot more sense to just stick to "2022-07-08T00:14:07[Europe/Paris]" local time + named timezone format, without a numeric UTC offset. If your application doesn't support the timezone tag, it cannot parse it and it cannot meaningfully make use of it. End of story. Want to store a fixed point in time, for something like logging? Stick with "2022-07-08T00:14:07Z".