The problem I've had is that even if you truncate a date to microsecond precision, the underlying Double can round it differently on each side of a round trip, so comparisons fail. With a type that didn't have this problem, you could clamp your dates to microseconds/milliseconds and it should 'just work'
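A minimal sketch of the kind of clamp-and-round-trip I'm describing (the helper name clampedToMilliseconds and the format choice are just illustrative; I'm using milliseconds only because ISO8601DateFormatter's fractional seconds stop there, and the same issue applies at microsecond granularity):

import Foundation

// Clamp a Date to whole milliseconds by rounding its Double-backed offset.
func clampedToMilliseconds(_ date: Date) -> Date {
    let millis = (date.timeIntervalSinceReferenceDate * 1000).rounded()
    return Date(timeIntervalSinceReferenceDate: millis / 1000)
}

let formatter = ISO8601DateFormatter()
formatter.formatOptions = [.withInternetDateTime, .withFractionalSeconds]

let clamped = clampedToMilliseconds(Date())
let roundTripped = formatter.date(from: formatter.string(from: clamped))!

// Both sides are still Doubles, so the re-parsed value can land on a slightly
// different representable number, and this may print false.
print(clamped == roundTripped)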
Arguably, it's less of an issue if Float / Double types exhibit this behavior; that is the nature of floating point numbers. The whole point of IEEE 754-style floating point numbers is that for many applications close approximations are good enough and therefore the speed benefits outweigh the inexactness problem. If you are aware of and agree to that tradeoff, then no harm done.
The fundamental problem with having a Date type that wraps a floating point number is that dates shouldn't behave like floating point numbers. The issue mentioned above is just one manifestation of the problems this fundamental type mismatch causes.
A quick informal poll about how people use Date / want to use Date that will help inform consideration of the possible designs here:
"I need Date to be round-trippable to other formats down to _______ precision (i.e. if choosing seconds, I would expect a round-trip between Date and String to exactly preserve Dates with no fractional seconds component to be preserved exactly)"
- second
- millisecond
- microsecond
- nanosecond
- "arbitrary" (say, as accurate as Duration--attoseconds)
One special unit that isn't any of the above that I could see an argument for: if the system is interoperating with C# (say, an iOS app hitting a .NET server), C#'s DateTime type has a resolution of a tenth of a microsecond (= 100 ns).
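For what it's worth, a rough sketch of what bridging those ticks into a Date might look like (the constant is the well-known tick count at the Unix epoch; the helper name is made up, and the division is where the 100 ns resolution gets squeezed into a Double):

import Foundation

// .NET DateTime ticks are 100 ns units counted from 0001-01-01 UTC.
let ticksAtUnixEpoch: Int64 = 621_355_968_000_000_000

func date(fromDotNetTicks ticks: Int64) -> Date {
    // Dividing by 10^7 lands on the nearest representable Double, so the
    // final 100 ns digit can shift for dates far from the epoch.
    let secondsSince1970 = Double(ticks - ticksAtUnixEpoch) / 10_000_000
    return Date(timeIntervalSince1970: secondsSince1970)
}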
Personally, I don't ever expect anything beyond millisecond precision to be very meaningful unless the API advertises itself as being high-precision and suitable for such measurements. That being said, I would interpret TimeInterval's documentation ("sub-millisecond precision over a range of 10,000 years") as making such a guarantee.
I think for this poll it is important to distinguish between changing the existing Date type to be precise to a particular granularity vs. adding a new type designed for much more precision (like ContinuousClock.Instant).
I think it's perfectly reasonable for the built-in Date to have limited precision (even down to milliseconds), so long as there still exists a type for the use cases that do require more precision than the default Date type provides.
My rationale for this is mostly semantic: when I think about what a date is from a more "philosophical" standpoint, it isn't something I would expect to have arbitrary precision. For example, ISO 8601 strings in the wild don't typically have precision beyond seconds or milliseconds.
But that doesn't change the fact that I do want some type that gives me greater precision for representing points in time.
So as far as "how people use Date / want to use Date" goes - I think it's more a question of: for workflows with high-precision needs, how much precision is desired? And I think this is reflected in the current majority answering "arbitrary" precision.
I'm pointing this out because I can totally understand if changing the existing Date is kind of a non-starter - but I don't think that makes the answers that have been provided any less valuable.
(Sorry if I'm pointing out the obvious)
Well, if you head over to the pitch for Foundation UTCClock, creating another distinct type that could live alongside Date is what's considered a non-starter there. So if both are non-starters, well...
Would the issue here be 'solved' by adding a granularity-sensitive compare function to Date? Like:
date.compare(otherDate, granularity: .microsecond)
Then anyone can compare with whatever granularity works for their business logic and storage system…
Granularity comparison requires a Calendar to define the scales and ranges that correspond to each granularity, and a Date does not intrinsically have a Calendar. That's why granularity comparisons are instance methods on the Calendar type.
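For reference, a small example of that existing Calendar API (the specific dates are just placeholders):

import Foundation

var calendar = Calendar(identifier: .gregorian)
calendar.timeZone = TimeZone(identifier: "UTC")!

// Two dates that differ only below one second.
let a = Date(timeIntervalSinceReferenceDate: 700_000_000.0001)
let b = Date(timeIntervalSinceReferenceDate: 700_000_000.0005)

// compare(_:to:toGranularity:) ignores anything finer than the named component.
let result = calendar.compare(a, to: b, toGranularity: .second)
print(result == .orderedSame) // true

Note also that Calendar.Component has .second and .nanosecond cases but no .microsecond, so a microsecond granularity would itself be a new addition.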
Open question: do we need the property of "linear growth" for Date?
Example that shows it's not linear now:
let a = Date() // now
let b = a.addingTimeInterval(0.0000002)
precondition(a != b) // ✅
let c = Date().addingTimeInterval(100*365*24*60*60) // ~ 100 years from now
let d = c.addingTimeInterval(0.0000002)
precondition(c != d) // 🛑
Probably just because Date is backed by a Double internally, so its resolution gets coarser the further a value sits from the reference date.
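That matches a quick check of the Double's ulp at those magnitudes (the exact digits depend on when you run it):

import Foundation

let now = Date().timeIntervalSinceReferenceDate
let farFuture = now + 100 * 365 * 24 * 60 * 60

// The effective resolution of a Double-backed Date is the ulp of its
// timeIntervalSinceReferenceDate at the current magnitude.
print(now.ulp)       // on the order of 1e-7 seconds today
print(farFuture.ulp) // roughly 5e-7 seconds ~100 years out

// 0.0000002 s is less than half of farFuture's ulp, so adding it rounds
// back to the same Double, which is why c == d above.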