Unexpected decoding behavior of Int64

When decoding an explicit signed 64-bit value (Int64) that is smaller than Int64.min, I am seeing the value clamped to Int64.min.

I am running this in Xcode 15.3 / M1 Pro / Sonoma, but the same behavior happens in 15.2. I'm not sure when this was introduced, but we now have some unit tests failing because of it.

For example, if my JSON contains "value": -9223372036854775809, the decoder will assign -9223372036854775808 to my struct property.

I believe this might be unexpected because decoding an Int32 with a value less than Int32.min throws a JSON Decoding error:
"Number -2147483649 is not representable in Swift."
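For reference, here's a minimal repro of the Int32 case (the exact error wording may vary by OS and Swift version):

```swift
import Foundation

struct Tester32: Codable {
    let value: Int32
}

// -2147483649 is Int32.min - 1, so decoding should throw a
// DecodingError rather than clamp to Int32.min.
let data = Data(#"{ "value": -2147483649 }"#.utf8)
do {
    _ = try JSONDecoder().decode(Tester32.self, from: data)
    print("decoded (unexpected)")
} catch {
    print("threw: \(error)")
}
```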

Performing a similar decode with JSONSerialization doesn't suffer from this, although it exhibits classic integer overflow (wrapping around to a positive value).

This behavior isn't present when testing against values beyond Int64.max; those throw an error as expected.

In my experimentation, JSONSerialization decodes the value as an NSDecimalNumber, but a value larger than Int64.max is represented as an NSNumber.

Here's some playground code that demonstrates the issue:

import Foundation

do {
    print("-----  MIN 64-Bit Int: \(Int64.min)")
    let json = "{ \"value\": -9223372036854775809 }"
    struct Tester: Codable {
        let value: Int64
    }

    // JSONSerialization
    let serial = try JSONSerialization.jsonObject(with: Data(json.utf8)) as! [String: Any]
    let value: NSDecimalNumber = serial["value"] as! NSDecimalNumber
    let v: Int64 = value.int64Value
    print("Type: \(type(of: value)) - value: \(value) - Int64 rep: \(v)")

    // Codable
    let tester = try JSONDecoder().decode(Tester.self, from: Data(json.utf8))
    print("Clamped Value: \(tester.value) - Int64.min: \(Int64.min)")
} catch {
    print("Error: \(error)")
}
Can anyone else confirm my suspicions or point me to an explanation of why this might be an exceptional case for Int64?


What OS are you testing on? The 2023 Apple OSes use the open-source Foundation JSON implementation, so you could inspect that to see.


Running right now on iOS, tvOS, visionOS, and iPadOS simulators and playgrounds...

I'm perusing the Foundation source now but not seeing anything in particular that stands out.

I'll try the same code on a device as well as Linux. If it's a simulator issue, that would be a starting point.

Edit: same on devices and macOS CLI code...

The boundary is between -9,223,372,036,854,776,832 and -9,223,372,036,854,776,833; the former erroneously decodes while the latter throws the expected "not representable" error.


I tried both +9223372036854775809 and -9223372036854775809 – JSONDecoder correctly fails with an error for me (macOS 13.6, Swift 5.9).

JSONSerialization behaviour could be different, as it doesn't know the eventual type of the number (it could be a Double, etc.).

I tried this in an Xcode project; I'm not sure whether it behaves differently in a playground.


For what it's worth, that translates to Int64.init(exactly: Double):

Int64(exactly: Double("-9223372036854776832")!) // -9,223,372,036,854,775,808
Int64(exactly: Double("-9223372036854776833")!) // nil

Yes, I think this is a bug in Foundation. Specifically, I'm guessing in unwrapFixedWidthInteger (I tried catching this in lldb but was told that none of the relevant methods exist in Foundation… so I guess their symbols were stripped :confused:).

That method tries to do the correct conversion, by interpreting the input string as an integer. When that correctly fails (due to underflow), it inexplicably falls back to _slowpath_unwrapFixedWidthInteger which instead tries to interpret the number as a Double and then converts that to the target type (Int64 in this case).

The interpretation as a Double is very permissive - it doesn't care about accuracy. It basically rounds the value to -9.223372036854776e+18. The Int64(exactly:) initialiser is apparently fine assuming that means Int64.min (whereas for -9.223372036854778e+18, the next Double further from zero, it is not).
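As a sanity check of that rounding (assuming IEEE 754 round-to-nearest, ties-to-even): at magnitude 2^63 adjacent Doubles are 2^11 = 2048 apart, which is exactly where the boundary reported upthread falls:

```swift
let min64 = -9223372036854775808.0   // Int64.min (-2^63), exactly representable
assert(min64.ulp == 2048)            // Double spacing at this magnitude

// -9223372036854776832 is the midpoint between -2^63 and the next Double
// further from zero; ties-to-even rounds it back to -2^63, so
// Int64(exactly:) accepts it.
let midpoint = Double("-9223372036854776832")!
assert(midpoint == min64)
assert(Int64(exactly: midpoint) == Int64.min)

// One past the midpoint rounds to -2^63 - 2048, which is out of range.
let below = Double("-9223372036854776833")!
assert(below == -9223372036854777856.0)
assert(Int64(exactly: below) == nil)
```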

The bug seems to be the existence of _slowpath_unwrapFixedWidthInteger - it's only called once it's already been proven that the value isn't valid, so it serves no purpose.

Well, that said, it might be that this weird 'fallback' is trying to permit the number to end with pointless fractional digits, i.e. a decimal point (.) followed by zero or more zeroes. I guess that's technically still fine by JSON standards. Still, even if that is its purpose, it would be much better implemented properly in the integer parser, rather than via this hacky re-route through the Double parser.

Bear in mind that JSON integers beyond 2^53 are non-portable at best.
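A quick sketch of why: Double has a 52-bit significand, so the first positive integer it cannot represent exactly is 2^53 + 1. Any parser that routes integers through Double silently loses precision past that point:

```swift
let exact: Int64 = 1 << 53        // 9007199254740992: still exact as a Double
let lossy: Int64 = (1 << 53) + 1  // 9007199254740993: not representable

// Both integers round to the same Double...
assert(Double(exact) == Double(lossy))
// ...so the larger one round-trips to the wrong integer.
assert(Int64(exactly: Double(lossy)) == exact)
```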


That's, again, a limitation of this particular JSON parser. JSON integers have unlimited precision.