Decoding JSON fails on iOS 17: "Number <...> is not representable in Swift"

Hi folks,

I'm getting failures decoding JSON on iOS 17 that decoded successfully on previous versions. The error complains that a number in the JSON is not representable in Swift, e.g.

dataCorrupted(
  Swift.DecodingError.Context(
    codingPath: [], 
    debugDescription: "The given data was not valid JSON.", 
    underlyingError: Optional(Error 
        Domain=NSCocoaErrorDomain 
        Code=3840 
        "Number 18.181818181818183 is not representable in Swift." 
        UserInfo={NSDebugDescription=Number 18.181818181818183 is not representable in Swift.})))

I imagine this is a float/double-representation issue -- possibly previous releases silently truncated the value; I haven't checked back yet to be sure. The right long-term fix is probably for me to get the backend to truncate its output to some more sensible precision.

But in the short term, is there anything I can change on the decoding/client side to get this through? I'm perfectly willing to lose precision in decoding rather than fail the entire JSON-blob parse.

I guess you could change your model from:

struct Model: Decodable {
  //var number: Float
  var number: MyFloat
}

And then create a custom decoding for your numbers:

struct MyFloat: Decodable {
  var value: Float

  init(from decoder: Decoder) throws {
    let container = try decoder.singleValueContainer()
    // Decode at full Double precision, then narrow to Float,
    // accepting the lost precision rather than failing.
    let doubleValue = try container.decode(Double.self)
    self.value = Float(doubleValue)
  }
}

I guess you could also make a property wrapper, so you don't have to change all call sites to model.number.value.

struct Model: Decodable {
  @Truncating var number: Float
}

@propertyWrapper
struct Truncating<F: BinaryFloatingPoint>: Decodable {
  var wrappedValue: F

  init(from decoder: Decoder) throws {
    let container = try decoder.singleValueContainer()
    // Decode as Double, then convert to the wrapped type,
    // accepting rounding rather than failing.
    self.wrappedValue = F(try container.decode(Double.self))
  }
}

Not tested


Did you already try changing the property for that decoded value from Float to Double?
That value won't fit into a Float, but it should fit into a Double.

I did try switching Float -> Double; I get the same error. And, extremely weirdly, I also get the same problem with a much lower-precision value: "Number 18.18 is not representable in Swift."

That's bonkers enough that I'm looking elsewhere in my stack to see if something completely off the radar is behaving oddly.


I can confirm Floats and Doubles do decode properly in Swift 5.9 (Xcode 15).
The error "The given data was not valid JSON." seems to contradict the number complaint. I suspect the number error is just the parser moving on; the JSON might be malformed in some other way, and that's the real issue.


Thanks for the custom-decoding suggestion. Signs are increasingly pointing to something else being the real problem and the number issue just being a knock-on consequence, but I appreciate the suggestion.


Found the cause. It is a change in behaviour between iOS 16 and 17, but a rather subtle one that shortcuts in my own code happen to turn into the mess you see above.

The change in behaviour is this: when decoding a JSON literal such as 18.18, SingleValueDecodingContainer.decode(Int.self) used to throw a DecodingError.dataCorrupted (saying "parsed JSON number <18.18> does not fit in Int"); it now throws a JSONError.numberIsNotRepresentableInSwift.

Mostly that's fine, because mostly when you call decode(Int.self) the kind of failure doesn't much matter to you. But I was using that call as a "probe" to see whether I could decode an Int, and catching the specific error cases I knew about -- which didn't include this one, so my retry logic for when the probe fails never ran. Entirely my bad, as the contract makes no claims whatsoever about what errors may be thrown on failure.
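
For the record, here's roughly the shape of the probe, with invented names and a simplified fallback (my real code is messier):

struct IntOrDouble: Decodable {
  var intValue: Int?
  var doubleValue: Double?

  init(from decoder: Decoder) throws {
    let container = try decoder.singleValueContainer()
    do {
      // Probe: try Int first.
      intValue = try container.decode(Int.self)
    } catch DecodingError.dataCorrupted {
      // iOS 16: a literal like 18.18 lands here ("does not fit in Int"),
      // so we retry as a Double.
      doubleValue = try container.decode(Double.self)
    }
    // iOS 17: the Int probe throws the new numberIsNotRepresentableInSwift
    // error instead, which the catch above doesn't match, so it propagates
    // and the whole decode fails.
  }
}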

The error in my code is interesting in light of the ongoing discussion of typed throws over on Evolution: I was treating an entirely untyped throw as if I had the safety guarantees of a typed throw.

Why would you want to do that?!

(You may ask why on earth I needed to do that. The story is too long for this margin to contain, but I'm using it to capture arbitrary JSON blobs as a sub-part of a generally strongly-typed Codable model. It's a solution, definitely not the only solution, and very likely not the best one either.)
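
If it helps picture it, here's a much-reduced sketch with invented names. For brevity it probes each case with try?; my real init uses targeted do/catch on the Int probe, which is exactly the bit that broke:

enum JSONValue: Decodable {
  case null
  case bool(Bool)
  case int(Int)
  case double(Double)
  case string(String)
  case array([JSONValue])
  case object([String: JSONValue])

  init(from decoder: Decoder) throws {
    let container = try decoder.singleValueContainer()
    if container.decodeNil() {
      self = .null
    } else if let b = try? container.decode(Bool.self) {
      self = .bool(b)
    } else if let i = try? container.decode(Int.self) {
      // The Int probe -- in my real code this is a do/catch,
      // and the changed error is thrown here.
      self = .int(i)
    } else if let d = try? container.decode(Double.self) {
      self = .double(d)
    } else if let s = try? container.decode(String.self) {
      self = .string(s)
    } else if let a = try? container.decode([JSONValue].self) {
      self = .array(a)
    } else {
      self = .object(try container.decode([String: JSONValue].self))
    }
  }
}

struct Record: Decodable {
  var id: String
  var blob: JSONValue   // the arbitrary JSON sub-part
}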


This logic is in Sources/FoundationEssentials/JSON/JSONDecoder.swift, which appears to throw only numberIsNotRepresentableInSwift; that isn't a good error in cases where the user is simply trying to decode the wrong type. You may want to file an issue on that repo with a link to any feedback you've filed.


Nice find, indeed a subtle change.

I guess your implementation would tolerate decoding, say, 18.0 or even 18.0000000000000000123 as Int.
