Can/Should Foundation.Date use a fixed-point representation?
Why I Care
Since the release of Swift 6, I've started running into round-trip equality issues with Foundation.Date through sqlite-nio and postgres-nio.
Specifically, when inserting a Date into a database (either Postgres or Sqlite) and then reading it back out, the retrieved/decoded Date will fail equality (==) with itself.
This problem only manifests itself on Swift 6 on Linux (Docker swift:6.0.3 is what I used).
It does not occur on macOS.
This is ultimately occurring because the database backends have precision caps.
On macOS, Date's underlying Double is populated with microsecond precision.
On Linux, Date's underlying Double is populated with nanosecond precision.
Because of this, when Date is converted to a database compatible value it loses some precision on Linux which causes it to fail round-trip equality with itself.
Now, it's easy to write this off as a limitation of the database backend - which is fair.
But I believe this problem could still manifest even if the databases in question supported greater precision, simply because Date's internal timeIntervalSinceReferenceDate (a Double) has to undergo floating-point math in order to be converted into whatever scale the database supports. That floating-point math can, in certain cases, incur rounding error, which results in the same round-trip inequality.
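The rounding claim is easy to reproduce directly. The interval below is made up for illustration; it just needs sub-microsecond detail, like the values clock_gettime on Linux can produce:

```swift
import Foundation

// A hypothetical TimeInterval with sub-microsecond detail
// (the specific value is illustrative).
let original: TimeInterval = 1_000.000000501

// Quantize to microseconds, as a backend capped at microsecond
// precision effectively does on the way in.
let roundTripped = (original * 1_000_000).rounded() / 1_000_000

// The 501 ns remainder rounds up to a whole microsecond, so the
// round-tripped Date no longer equals the original.
let a = Date(timeIntervalSinceReferenceDate: original)
let b = Date(timeIntervalSinceReferenceDate: roundTripped)
print(a == b) // false
```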
Beyond that, I'm somewhat concerned about the effect of the differing platform implementations that cause this error only to manifest on Linux but not on macOS.
Detailed Diagnosis
Reproducer
The following will demonstrate the exact same test being run on the host (in my case macOS 15.1.1 M1 ARM) and on the swift:6.0.3 Linux docker environment.
You will note that on macOS, the tests pass - whereas on Linux they will fail.
git clone https://github.com/vapor/postgres-nio.git -b 1.22.1
swift test --filter Date_PSQLCodableTests.testNowRoundTrip --package-path postgres-nio -q
docker run -v $(pwd):/root -w /root swift:6.0.3 swift test --filter Date_PSQLCodableTests.testNowRoundTrip --package-path postgres-nio
git clone https://github.com/vapor/sqlite-nio.git -b 1.10.5
swift test --filter SQLiteNIOTests.testTimestampStorage --package-path sqlite-nio -q
docker run -v $(pwd):/root -w /root swift:6.0.3 swift test --filter SQLiteNIOTests.testTimestampStorage --package-path sqlite-nio
public init?(sqliteData: SQLiteData) {
    let value: Double
    // We have to retrieve floats and integers, because apparently SQLite
    // returns an Integer if the value does not have floating point value.
    switch sqliteData {
    case .float(let v):
        value = v
    case .integer(let v):
        value = Double(v)
    case .text(let v):
        guard let d = dateTimeFormatter.date(from: v) ?? dateFormatter.date(from: v) else {
            return nil
        }
        self = d
        return
    default:
        return nil
    }
    // Round to microseconds to avoid nanosecond precision error causing Dates to fail equality
    let valueSinceReferenceDate = value - Date.timeIntervalBetween1970AndReferenceDate
    let secondsSinceReference = round(valueSinceReferenceDate * 1e6) / 1e6
    self.init(timeIntervalSinceReferenceDate: secondsSinceReference)
}
public var sqliteData: SQLiteData? {
    .float(timeIntervalSince1970)
}
Note: I cheated just a bit here because I took the above reproducer and edited it with an explicit dependency/import of FoundationEssentials. I did this so that I could actually inspect what was happening with Date initialization. By default, the Foundation code that is executed on macOS is not identical to FoundationEssentials - but the behavior remains the same regardless of whether I use FoundationEssentials or not.
Date's initializer boils down to this code:
@available(macOS 10.10, iOS 8.0, watchOS 2.0, tvOS 9.0, *)
extension Date {
    private static func getCurrentAbsoluteTime() -> TimeInterval {
#if canImport(WinSDK)
        var ft: FILETIME = FILETIME()
        var li: ULARGE_INTEGER = ULARGE_INTEGER()
        GetSystemTimePreciseAsFileTime(&ft)
        li.LowPart = ft.dwLowDateTime
        li.HighPart = ft.dwHighDateTime
        // FILETIME represents 100-ns intervals since January 1, 1601 (UTC)
        return TimeInterval(Double(li.QuadPart) / 10_000_000.0 - Self.timeIntervalBetween1601AndReferenceDate)
#else
        var ts: timespec = timespec()
        clock_gettime(CLOCK_REALTIME, &ts)
        var ret = TimeInterval(ts.tv_sec) - Self.timeIntervalBetween1970AndReferenceDate
        ret += (1.0E-9 * TimeInterval(ts.tv_nsec))
        return ret
#endif // canImport(WinSDK)
    }
}
On Linux, clock_gettime returns a value precise out to nanoseconds.
On macOS, clock_gettime returns a value truncated at microseconds.
Prior to Swift 6, the Foundation Date initialization used gettimeofday (see this) which returns at microsecond precision.
On Darwin platforms, ABI compatibility limits the changes you can make to Date's internal representation (Edit: this is incorrect; Date is not @frozen)
On all platforms, API compatibility limits the changes you can make to the APIs that Date provides and uses
Effectively: the only real way to do this would be to keep Date's Double ABI layout and type, but find a range of bits in the representation that would never be used in a valid date (e.g., the NaN range), and use the remaining bits to store a fixed-point representation.
Double being 64-bits, though, leaves you with very little room for that fixed-point representation.
It's not a great answer in all circumstances, but in general, strict equality for Dates isn't a terribly useful operation since, just like with floating-point numbers, there's typically a precision beyond which the comparison is meaningless. e.g., depending on your domain, it might make sense to compare two Dates down to the day, hour, minute, or second.
(Though, yes, this representation limitation can make it difficult to use Dates as, e.g., dictionary keys.)
Your best bet, if possible, is to compare your Date values at a time scale that makes sense for your domain.
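One way to sketch that domain-scale comparison, using Calendar's granularity API (the intervals here are made up):

```swift
import Foundation

// Two Dates that differ only below microsecond precision
// (the intervals are illustrative).
let a = Date(timeIntervalSinceReferenceDate: 1_000.0000001)
let b = Date(timeIntervalSinceReferenceDate: 1_000.0000009)

// Strict equality trips over the sub-microsecond difference...
print(a == b) // false

// ...but comparing at the granularity the domain cares about succeeds.
let calendar = Calendar(identifier: .gregorian)
print(calendar.isDate(a, equalTo: b, toGranularity: .second)) // true
```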
I've had Date problems since day one on Linux; more specifically, it can sometimes return the wrong date when using Date.now. I never use Date in a way that would require such precision, or in a production-stable context (like databases).
We've been discussing this in Vapor and on the Swift Slack as well. But essentially Date is not fit for purpose in Swift 6. Swift 6 changed the underlying syscall used to get the current time, and the added precision that came with it has run up against Double precision errors, causing all sorts of issues.
The trouble with this approach is that the API really pushes you to just compare the date directly. It's much easier to just do if date == someOtherDate than to break it down into different components.
Additionally I don't think it's an unfair expectation to create two dates from the same TimeInterval and compare them and expect them to be the same - this does not always happen due to rounding issues with doubles and has caused a lot of issues in things like tests that used to work in Swift 5.
Ideally Date would switch its internal representation of time to something that doesn't arbitrarily lose precision. Storing nanoseconds since 1970 as an Int64 would give you a date range only up to the year 2262, which isn't great. Int128 would be better, but I'm unsure if there are any compatibility/performance issues with using it.
This could be done in a purely additive way, by changing the internal representation and keeping the existing API and converting between the two with new APIs where it made sense (such as nanosecondsSince1970). That should be possible without breaking the ABI right?
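The 2262 figure can be checked directly; this is just arithmetic on Int64.max, not a proposed API:

```swift
import Foundation

// Int64.max nanoseconds since 1970 is roughly 292 years' worth,
// so the last representable instant lands in the year 2262.
let maxSeconds = TimeInterval(Int64.max) / 1_000_000_000
let maxDate = Date(timeIntervalSince1970: maxSeconds)

var calendar = Calendar(identifier: .gregorian)
calendar.timeZone = TimeZone(identifier: "UTC")!
print(calendar.component(.year, from: maxDate)) // 2262
```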
Here is the technical documentation on NSDate. It appears to explain the following:
Date and Time Programming Guide - Date Fundamentals
The standard unit of time for date objects is a floating point value, typed as NSTimeInterval, and is expressed in seconds. This type makes possible a wide and fine-grained range of date and time values, giving precision within milliseconds for dates 10,000 years apart.
Two Dates created from the exact same TimeInterval will necessarily compare equal, unless the TimeInterval is a NaN — the issue is when you take the Date's floating-point representation and start performing operations on it without accounting for possible loss of precision, then converting back.
But yes, this is unfortunately too easy to do without realizing it.
Unfortunately, no — ABI stability means that the size, alignment, and memory layout of Date is fixed: if I compile a Swift function today which takes a Date, it necessarily requires the size and alignment of the value passed in to be the same as Double; if you call that function and pass in a value of a different size or alignment, it'll be reading garbage data from memory. Date can't change in that regard, because you then couldn't call any pre-existing compiled code which took Date values (an ABI break).
Edit: Date is not @frozen on Darwin platforms, so this does not apply.
Hence the suggestion that it would be possible to do if you fit some representation of Date within the Double, but that leaves you with even fewer bits to work with. (e.g., encoding an integer value inside of a Double NaN payload gives you 52* bits to work with, which is even more limiting.)
(* as long as they're not all 0)
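A rough sketch of that NaN-payload idea (the function names are hypothetical; this variant sets the quiet-NaN bit to sidestep the all-zero-payload case, leaving 51 usable bits):

```swift
// Stash an integer in the mantissa bits of a quiet NaN.
// Setting bit 51 (the quiet bit) keeps the mantissa nonzero,
// so any 51-bit payload, including zero, yields a valid NaN.
func encodeInNaN(_ payload: UInt64) -> Double {
    precondition(payload < (1 << 51))
    let bits: UInt64 = 0x7FF8_0000_0000_0000 | payload
    return Double(bitPattern: bits)
}

func decodeFromNaN(_ value: Double) -> UInt64? {
    guard value.isNaN else { return nil }
    return value.bitPattern & 0x0007_FFFF_FFFF_FFFF
}

let encoded = encodeInNaN(123_456_789)
print(encoded.isNaN)           // true
print(decodeFromNaN(encoded)!) // 123456789
```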
This sounds like a bug worth reporting! If there was a change in behavior that's caused Date to provide values that aren't useful, then it'd make sense to investigate.
This is not a general requirement of ABI-stable Swift types; it is only the case when the type is @frozen. For non-frozen types, the layout is indirected through the runtime, which allows authors to add and remove stored properties (but not accessors) without changing the type's ABI.
Good callout, yes! Resilient types can still be evolved under ABI stability; IIRC Date is effectively @frozen on Darwin platforms, but if that isn't the case then nothing I've said here actually applies, and there's room for evolving the type!
As far as reporting goes, I'm hoping this thread does just that - if only in part. @0xTim would have to confirm, but - again - for point of clarity (at least so far as my investigation has gone):
The salient change in Swift 6 is the use of clock_gettime instead of gettimeofday to initialize the Date structure. Perhaps reverting to gettimeofday would be an appropriate change, since it would reduce precision back to the microsecond range as it was prior to Swift 6.
I still don't love that solution since it doesn't address the underlying problems with representing a Date as a Double - but it may be an appropriate fix for the time being.
The ABI stability question is certainly an unfortunate one - thanks for bringing that up @itaiferber.
Now I'm wondering about the possibility of a new type to be introduced into Foundation that maintains the representation returned from clock_gettime or gettimeofday (two integers)?
Then Date can be documented with this limitation and an alternative provided for use cases that have needs related to precision.
Do you have a source for that? Looking at the declaration of Date, it does not appear to be frozen. On the other hand, TimeInterval is a typealias for Double so that cannot be changed.
I think introducing a new type (ideally in the standard library) might be the best way forward to prevent issues with backwards compatibility. Additionally, we could then use it with Duration, Clock, and InstantProtocol, and get the previously proposed WallClock. Timestamp would be a lovely name for such a type, IMO.
The one thing I'm pretty sure we can't change about Date is its representation in encoded form (usually as a Double, in property lists or keyed archives). Even if we choose to change its underlying representation to be integer based, or Int128 based - we still have to read it in and write it out as a double, which will almost certainly result in the same kind of precision errors described here.
@scanon has actually been looking into what options are available here for Date's representation.
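The encoded-form constraint is easy to see with Date's default Codable behavior, which writes the value out as its Double interval:

```swift
import Foundation

// Date's default Codable representation is its Double
// timeIntervalSinceReferenceDate, so archives stay pinned to the
// floating-point form regardless of any future in-memory layout.
let date = Date(timeIntervalSinceReferenceDate: 760000000.5)
let data = try! JSONEncoder().encode([date])
print(String(data: data, encoding: .utf8)!) // [760000000.5]
```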
I would love to see a new type introduced to replace the poorly-named Date type. Timestamp and Instant (to pair with the InstantProtocol) would be my top suggestions.
The other fundamental problem with using a floating point number as the basis of date objects is that floating point math is not guaranteed to obey the usual algebraic identities. (For example, under the normal rules of math, (a + b) - b should equal a, but that is not guaranteed to be the case with floating point numbers.) So in addition to the rounding issues causing incorrect comparisons, if math is involved at any point, that will also likely cause incorrect comparisons.
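That failure is easy to demonstrate; the values here are arbitrary:

```swift
// In exact arithmetic (a + b) - b == a. In Double, adding 0.1 to
// 1e16 absorbs the 0.1 entirely, because the spacing between
// adjacent representable values near 1e16 is 2.0.
let a = 0.1
let b = 1e16
print((a + b) - b)      // 0.0
print((a + b) - b == a) // false
```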
Alternatively, it might make sense to have two separate integer values, one for the number of whole seconds since the epoch of choice and another representing the number of nanoseconds to add to the time represented by the first number. Or, well, there are many other implementations from other languages that we could learn from.
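A minimal sketch of that two-integer idea, mirroring POSIX timespec (the type name, epoch, and field layout are all hypothetical):

```swift
// Whole seconds plus a nanosecond remainder. All-integer storage
// round-trips exactly: no floating-point math, no rounding, so
// equality and hashing are reliable.
struct Timestamp: Hashable, Comparable {
    var seconds: Int64      // whole seconds since 1970-01-01 UTC
    var nanoseconds: Int32  // always in 0..<1_000_000_000

    static func < (lhs: Timestamp, rhs: Timestamp) -> Bool {
        (lhs.seconds, lhs.nanoseconds) < (rhs.seconds, rhs.nanoseconds)
    }
}

let t = Timestamp(seconds: 1_700_000_000, nanoseconds: 123_456_789)
print(t == Timestamp(seconds: 1_700_000_000, nanoseconds: 123_456_789)) // true
```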
I definitely agree, there's too much code out there that depends on the current representation / behavior of Date; a new type is definitely needed. And yes, please, add it to the standard library / Foundation; that's the only way it would get traction in the broader community.
I know it's tempting to go for a new type, but we should see what other options are available first. Introducing a new type for something as widely used as Date will mean a very long tail of needing to work with both.
It is not the case with float/double. Both Sqlite and Postgres can store float/double at full fidelity (it's just a bit pattern after all).
The reason it is a problem with Date is because the double that represents the Date has to be converted into integer-space and/or shifted to the database native reference date (2001 or 1970). Performing either of these options may result in truncating or rounding error - which changes the actual value being stored such that when it is read back, it is no longer the same.
If in your use case storage has to be in the database-native date format (rather than the raw bit pattern) and that format doesn't support sub-microsecond precision, then on any platform that supports sub-microsecond precision you're going to have this roundtripping problem regardless of what Swift does with its internal representation or serialization, no?