Overriding default JSON encoding of floating point values?

Using JSONEncoder/JSONSerialization, a floating point value whose fractional part is zero is encoded to JSON with that fractional part (.0) left out, i.e.:

struct S: Codable {
    let a: Double
    let b: Double
}
let s = S(a: 1.5, b: 2.0)
let jsonData = try! JSONEncoder().encode(s)
let jsonString = String(data: jsonData, encoding: .utf8)!
print(jsonString) // {"a":1.5,"b":2}  <-- b's value is encoded as 2, not 2.0
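For what it's worth, JSONSerialization exhibits the same behaviour, so dropping down a level wouldn't help (a quick sketch):

```swift
import Foundation

let obj: [String: Any] = ["b": 2.0]
let data = try! JSONSerialization.data(withJSONObject: obj)
// The integral Double is serialized without its fractional part,
// e.g. {"b":2}
print(String(data: data, encoding: .utf8)!)
```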

Now, in the project I'm working on, the backend has a system in place (that cannot be changed) that uses "strict validation", in which floats (defined in Swagger as "type": "number", "format": "float") must always contain a fractional part. That is, the above "b": 2 will result in a bad request, since the server requires it to be "b": 2.0.

Is there any way to "fix" this in the app (other than implementing a JSONEncoder replacement)?

You could create an extension to JSONEncoder. I only tested this very quickly in Swift's REPL, so take my code with a grain of salt.

First you create your own data type to represent your Double with extra precision:

struct FixedPointNumber {
    var integer: Int
    var fractional: Int
}

Then you create an extension to JSONEncoder where you handle the encoding of your data type:

extension JSONEncoder {
    func encode<FixedPointNumber>(_ value: FixedPointNumber) throws -> Data {
        print("Have this here to see if your code gets called.")
        return Data(/* your magic happening here */)
    }
}

Then test it as follows:

print(String(data: try JSONEncoder().encode(FixedPointNumber(integer: 2,
                                                             fractional: 0)),
             encoding: .utf8)!)

Consider adding a small fraction of the mantissa, like 10^-6, to integer-valued numbers.
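For illustration, here's what such a nudge looks like using Double.nextUp, the smallest representable step upwards (a sketch only, since it changes the encoded value):

```swift
import Foundation

// Nudging an integral Double by the smallest representable step
// forces JSONEncoder to emit a fractional part, at the cost of
// changing the value slightly.
let b = 2.0
let nudged = b.nextUp // 2.0000000000000004
let data = try! JSONEncoder().encode([nudged])
print(String(data: data, encoding: .utf8)!)
```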

This task suggests introducing new encoding strategies, like so:

/// The new strategy to use for encoding floating point values.
public enum FloatEncodingStrategy {
    /// Default behaviour. Matches suppressFraction
    case `default`
    
    /// always with fraction ('1.0' instead of '1')
    case withFraction
    
    /// suppress fraction when possible ('1' instead of '1.0')
    case suppressFraction
}

/// The new strategy to use for encoding integer values.
public enum IntEncodingStrategy {
    /// Default behaviour. Matches noFraction
    case `default`
    
    /// always with fraction
    case withFraction // ('1.0' instead of '1')
    
    /// no fraction
    case noFraction
}

I'm afraid I don't see how this could work. Even if your print statement gets called, you're just calling the generic method you added, which has an unconstrained generic type parameter that happens to have the same name as (but is in no other way connected to) your FixedPointNumber struct (which is not even Encodable).


I don't think this would work as a general solution. We would somehow have to ensure that every floating point property x in every request model is mutated, e.g. x = x.nextUp, before being JSON-encoded. Also, this would not transform 2 into 2.0, but rather into 2.0000000000000004.

Btw, has anyone been in a similar situation, i.e. having to use a service that will accept 2.0 but not 2 (because the property is defined as "type": "number", "format": "float" in a Swagger file)? If so, how did you end up solving it?

It's the first time I've come across this requirement in an API, I was surprised that I got a 400 bad request with an error message telling me that it's not a float if the fractional part is not included.

Would you say it's a reasonable requirement?

Even if it's not a common or reasonable requirement, I think it's unfortunate that there seems to be no straightforward way in Swift/JSONEncoder/JSONSerialization to specify whether a floating point value should be encoded as 123 or 123.0.

This per se is easily done with a CustomDouble type for which you define func encode(to encoder: Encoder) throws.

The server is acting funny by insisting on non-standard JSON formatting; there is no int/float split in the JSON spec, hence your problems. Your options are limited:

  • change your server
  • accept the "0.0000001" or whatever is the tolerance.
  • modify your data type to output { "num_int": 2 } when the number about to be encoded is close to an integer, or { "num_float": 2.3 } when it is not.
  • write your own JSON encoder. I don't believe Apple's is customisable enough to achieve this non-standard behaviour, so you'll have to write everything yourself. It's not that complicated; it would fit on one or two pages.
  • ask Apple to introduce new Int/Float encoding strategies and wait until that is done (which might never happen).

And if you can't tolerate 2.0000002 for 2, the complexity of num_int / num_float, or changing your server, your only real option is to write your own JSON encoder.
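For illustration, the num_int / num_float option might be sketched with a custom encode(to:) like this (the key names and the integrality check here are assumptions):

```swift
import Foundation

struct TaggedNumber: Encodable {
    let value: Double

    enum CodingKeys: String, CodingKey {
        case numInt = "num_int"
        case numFloat = "num_float"
    }

    func encode(to encoder: Encoder) throws {
        var c = encoder.container(keyedBy: CodingKeys.self)
        // Emit {"num_int": 2} for integral values,
        // {"num_float": 2.3} otherwise.
        if value == value.rounded(), let i = Int(exactly: value.rounded()) {
            try c.encode(i, forKey: .numInt)
        } else {
            try c.encode(value, forKey: .numFloat)
        }
    }
}

let data = try! JSONEncoder().encode(TaggedNumber(value: 2))
print(String(data: data, encoding: .utf8)!) // {"num_int":2}
```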

I've never encountered it and it does seem overly strict, given there's no difference in value between 1 and 1.0, so I'm not sure what the designers think the rule is preventing.

There are other Swift JSON encoders (like swift-extras/swift-extras-json on GitHub: JSON encoding and decoding in pure Swift, without Foundation), but I'm not sure if any do what you want. Since the one I mention doesn't depend on Foundation, it can't parse Dates as flexibly. I also don't know if its performance improvements hold up in newer versions of Swift.

Edit: Looking at it, it does look like the encoder I linked will return 1.0 for Double values, but it depends on the description of the value. Seems like something to enhance in that library.

quick & dirty one page JSON encoder
func jsonEncoded(_ val: Any?) -> String {
    guard let val = val else { return "null" }
    switch val {
    case let v as String:   return v.quoted
    case let v as Bool:     return v ? "true" : "false"
    case let v as Int:      return String(v)
    case let v as Double:   return String(v)
    case let v as [Any]:
        let elements = v.map { element in
            jsonEncoded(element)
        }
        return "[" + elements.joined(separator: ",") + "]"
    case let v as [AnyHashable: Any]:
        let elements = v.map { kv in
            kv.key.description.quoted + ":" + jsonEncoded(kv.value)
        }
        // JSON objects use braces, not brackets
        return "{" + elements.joined(separator: ",") + "}"
    // TODO: handle other types like Int8, Float, etc
    default:
        let m = Mirror(reflecting: val)
        switch m.displayStyle {
        case .struct:
            let elements = m.children.map { child in
                child.label!.quoted + ":" + jsonEncoded(child.value)
            }
            return "{" + elements.joined(separator: ",") + "}"
        case .optional:
            return "null"
        default:
            fatalError("TODO")
        }
    }
}

extension String {
    var escaped: String {
        let components = map { ch -> String in
            let s = String(ch)
            switch s {
            case "\\": return "\\\\"
            case "/": return "\\/"
            case "\n": return "\\n"
            case "\r": return "\\r"
            case "\t": return "\\t"
            case "\u{8}": return "\\b"
            case "\u{c}": return "\\f"
            case "\"": return "\\\""
            // TODO: uXXXX
            // TODO: check valid range
            default: return s
            }
        }
        return components.joined(separator: "")
    }
    
    var quoted: String {
        "\"" + escaped + "\""
    }
}
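Note that the sketch above emits "2.0" where JSONEncoder emits "2" because Swift's default Double-to-String conversion keeps the trailing ".0":

```swift
let d: Double = 2
// Swift's Double-to-String conversion preserves the ".0" for
// integral values, unlike JSONEncoder/JSONSerialization.
print(String(d)) // 2.0
```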

Writing or using a custom JSON encoder is probably not an option for this project. Writing a proof-of-concept one is fun, but sorting out every possible edge case and potential regression for this app (100k+ LOC) would probably make it grow out of hand.

It's your call. What I would do in this situation: have a custom encoder running alongside the system one, validate its output, run this setup for a couple of weeks or a month, catch all the bugs, and then switch once I'm satisfied with the result.

value -> standardEncoder -> JSON (use that initially)
value -> customEncoder -> JSON -> standardDecoder -> value2, assert(value == value2)
(drop the result of custom encoder for a while until it is tested.)

In the equality check there are some obvious gotchas related to floating point comparison.

After the switch it's worth leaving a similar check in DEBUG builds:

value -> customEncoder -> JSON
#if DEBUG
    JSON -> standardDecoder -> value2, assert(value == value2)
#endif
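The round-trip check could be wrapped in a helper like this (a sketch; the validatedEncode name and the assert-based check are my own choices, and the floating point equality gotchas mentioned above still apply):

```swift
import Foundation

// Encode with the custom encoder, then (in DEBUG builds) decode the
// result with the standard decoder and assert it round-trips unchanged.
func validatedEncode<T: Codable & Equatable>(
    _ value: T,
    using customEncode: (T) throws -> Data
) throws -> Data {
    let data = try customEncode(value)
    #if DEBUG
    let roundTripped = try JSONDecoder().decode(T.self, from: data)
    assert(roundTripped == value, "custom encoder output failed round-trip")
    #endif
    return data
}

// Usage, with the standard encoder standing in for the custom one:
let data = try! validatedEncode([1.5, 2.0]) { try JSONEncoder().encode($0) }
print(String(data: data, encoding: .utf8)!)
```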

My mistake! The assumptions I made about JSONEncoder's inner workings were wrong. Very wrong.

I toyed around a bit more and this is a workaround I could come up with:

import Foundation

struct GameCharacter: Codable {
    var name: String
    var skillLevel: Double

    func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        try container.encode(name, forKey: .name)
        // This will encode a string -- not sure where to extend the
        // container's encode so that you could inject your custom
        // character sequence.
        try container.encode(String(format: "%.1f", skillLevel),
                             forKey: .skillLevel)
    }
}

let character = GameCharacter(name: "Kim", skillLevel: 2)

// Worst case: use a regex replacement here, where you remove the
// quotation marks surrounding numbers.
let encoder = JSONEncoder()
print(String(data: try encoder.encode(character), encoding: .utf8)!)

You would have to provide a custom encoder for all the structs/classes for which you want to output decimal numbers, which is not great. Then, you would have to go through the output with a regex and remove the quotation marks around decimal-number values.

It's not a great solution, but it would get the job done.

No, as you point out it will encode the value as a string:

"skillLevel": "2.0"

but it should be a number:

"skillLevel": 2.0

Also, it should not force exactly one fractional digit; it should just make sure the fractional part is never skipped (which currently happens when the fractional part is zero). It shouldn't change anything else about how floating point values are currently encoded.

As far as I can see there is no trivial solution, except using an alternative encoder. Maybe swizzling some methods in JSONSerialization or something?

In regards to the string, I said:

// Worst case: use a regex replacement here, where you remove the
// quotation marks surrounding numbers.

My solution is also just an example. If you change the format in String(format: "%.1f", skillLevel), you can emit more or fewer digits. Or have an if-then-else ladder that deals with how you would like to encode particular cases.
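The regex de-quoting step could look like this (a sketch; note it would also strip the quotes from any genuine string value that happens to look like a decimal number):

```swift
import Foundation

let json = #"{"name":"Kim","skillLevel":"2.0"}"#
// Remove the quotation marks around values shaped like decimal numbers.
let patched = json.replacingOccurrences(
    of: #":"(-?\d+\.\d+)""#,
    with: ":$1",
    options: .regularExpression
)
print(patched) // {"name":"Kim","skillLevel":2.0}
```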

Compared to a custom encoder, swizzling etc. would be much harder, more fragile, and very dangerous. This is to get you started if you want to go down that rabbit hole:

Really, a custom encoder is the only realistic way to go in this case unless you can change the parameters of your server or change the server altogether.


As far as I can see there is no trivial solution, except using an
alternative encoder.

One option that might work would be to post-process the resulting JSON. Whether this is “trivial” or not depends on the complexity of the JSON you’re working with. In your original example it would be pretty trivial, but I imagine that doing it in a real example would be much harder.

You could potentially combine it with indiedotkim’s suggestion, using custom coding to add a tag that makes it easy to find the numbers that need to be ‘patched’. I’ve included a quick example below.

Share and Enjoy

Quinn “The Eskimo!” @ DTS @ Apple

import Foundation

struct Workaround: Codable {
    var rawValue: Double

    func encode(to encoder: Encoder) throws {
        var c = encoder.singleValueContainer()
        // This assumes that `«` and `»` don’t appear elsewhere in your JSON.
        let s = String(format: "«%.1f»", self.rawValue)
        try c.encode(s)
    }
}

struct S: Codable {
    let a: Workaround
    let b: Workaround
}
let s = S(a: Workaround(rawValue: 1.5), b: Workaround(rawValue: 2.0))
let d = try! JSONEncoder().encode(s)
let js = String(data: d, encoding: .utf8)!
let patched = js
    .replacingOccurrences(of: "\"«", with: "")
    .replacingOccurrences(of: "»\"", with: "")
print(patched)
// {"a":1.5,"b":2.0}

I quickly googled "swagger strict validation", and it seems this is a user setting that's off by default (most results ask how to enable it), or at least configurable to be off (perhaps on the server, but maybe per client request). Is this not the case?