Date/time separation, further component separation inside those, different radices of those components, arbitrary units, calendars, leap seconds and days, time zones, and finally awkward time formats – all of these are arbitrary to varying extents (some are loosely based on observed cycles), and they lead to unnecessary complexity and bugs in software, occasionally inspiring programmers to get creative and make things worse with new encodings.
ISO 8601 was supposed to at least alleviate the time format issues, but perceived user preferences tend to take precedence over it, so at best it gets used internally.
A way to share approximate timestamps between programs, even if not a sane way to work with those, would still be useful, but what we have now is incompatible ISO 8601 implementations. For instance, the paywalled (and apparently copyrighted) text of the standard says that a dot or a comma may be used as the decimal sign, with the comma being the preferred one, while implementations commonly produce and accept only the dot.
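Once the discrepancy is known, a workaround is simple enough; a minimal sketch of my own (the normalize_decimal_sign helper is hypothetical, not anything from the standard), which just maps one accepted decimal sign onto the other before the actual parsing:

    #include <string.h>

    /* Normalise the ISO 8601 decimal comma to the dot that most parsers
       expect.  Assumes the buffer holds a single timestamp, in which a
       comma can only be the decimal sign. */
    static void normalize_decimal_sign(char *timestamp)
    {
        char *comma = strchr(timestamp, ',');
        if (comma != NULL)
            *comma = '.';
    }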
And then there are profiles, particularly RFC 3339: it is supposed to resolve the ambiguities and generally be more useful for interoperability (and it succeeds at that), but it is quite restrictive, which may be a problem if you only need dates, or a higher precision.
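For instance, all of the following are valid ISO 8601 representations, but only the last one is also an RFC 3339 timestamp:

    2024-05                -- a reduced-precision calendar date (a month)
    2024-W20-5             -- a week date
    2024-05-17T09:30       -- seconds and zone designator omitted
    2024-05-17T09:30:00Z   -- acceptable to both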
The issue is not specific to this standard, since people implement standards partially and incompatibly all the time, yet the ambiguity in the standard doesn't help, and neither does the paywall. Though it is still better than no standard at all: at least most of the time it is possible to guess what is wrong and to find a workaround, if it is known that the timestamps being produced or accepted correspond to someone's idea of ISO 8601.
Yet another issue, also common among formats, is that it is not easy to parse without a proper parser. One should use a proper parser, but for most that is an arcane art, for some it is an unnecessary and/or undesired dependency, and it seems like most of the time the parsing is done with regular expressions, scanf(3), and assorted string manipulation and conversion functions. I used to think that sticking to regular grammars may help to ensure correct implementations, but apparently it would not help much either, while a strict and as-dumb-as-possible format probably would; there is inherent complexity in time components, of course, but ISO 8601 introduces alternatives that could have been avoided.
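To give an idea of what "strict and dumb" means here, a sketch of my own (assuming that a single fixed UTC form without fractions is enough, and using timegm(3), a common but non-standard extension; parse_utc is a hypothetical name):

    #define _DEFAULT_SOURCE
    #include <stdio.h>
    #include <time.h>

    /* Parse only the fixed "YYYY-MM-DDThh:mm:ssZ" form into Unix time.
       Even this is slightly sloppy: scanf's %d accepts signs and fewer
       digits than the field width, so a proper parser still wins. */
    int parse_utc(const char *s, time_t *out)
    {
        struct tm tm = {0};
        char z;
        if (sscanf(s, "%4d-%2d-%2dT%2d:%2d:%2d%c",
                   &tm.tm_year, &tm.tm_mon, &tm.tm_mday,
                   &tm.tm_hour, &tm.tm_min, &tm.tm_sec, &z) != 7
            || z != 'Z')
            return -1;
        tm.tm_year -= 1900;   /* struct tm counts years from 1900 */
        tm.tm_mon -= 1;       /* and months from 0 */
        *out = timegm(&tm);   /* interpret the components as UTC */
        return 0;
    }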
A nice way to deal with time would probably involve natural units (e.g., Planck time) and a better model (a relativistic one), probably a logarithmic scale and base 2 or 3. But using something like that now would just introduce a new, particularly weird format, with complex conversion rules, and with implementations depending on common units anyway. Perhaps the closest commonly used thing to it is Unix time, but opting even for that would likely introduce unnecessary incompatibility issues, and/or require frequent conversions, since the usual time components are embedded into cultures – which do not seem like they should affect technologies this deeply, but they do.
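Those conversions are mundane, just ever-present; a round-trip sketch (again my own, assuming POSIX and the non-standard timegm(3)):

    #define _DEFAULT_SOURCE
    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        time_t now = time(NULL);        /* seconds since the Unix epoch */
        struct tm utc = *gmtime(&now);  /* into calendar components */
        char buf[32];
        strftime(buf, sizeof buf, "%Y-%m-%dT%H:%M:%SZ", &utc);
        printf("%lld -> %s -> %lld\n",
               (long long)now, buf, (long long)timegm(&utc));
        return 0;
    }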
It would have helped if projects did not reimplement timestamp handling on their own, but instead used some common library with proper RFC 3339 (or full ISO 8601) printing and parsing (and perhaps other time-related functions), but as usual, that is not likely to happen: native implementations are usually preferred, sometimes ones even "native" to the program in question.
I keep writing and removing a rant like this every few years (sometimes ranting about other units of measurement as well), since it's not a useful note, but perhaps I should finally leave it here. At least it's not likely to become outdated in the observable future.
Though perhaps the paywall is an even more frustrating aspect of ISO standards in general: there are ISO(/IEC/IEEE) standards for all sorts of things, yet in many cases they are not usable because of the paywall, so you have to look for alternatives or write your own from scratch, which seems to defeat the point of having standards.