Is there a way to use Decodable to infer properties from position within a JSON array?

I suspect the answer is "no", but I wanted to see if anyone else had a path for this that could leverage Decodable. I'm working with some JSON data that isn't in an object format, just arrays of values, for example [0, 1, "A"]. In this case, I've been told that index positions 0, 1, and 2 each have a specific meaning, so I want to bring the data into a struct, with named properties, after it's been decoded/parsed into Swift values.

I didn't see any way of mapping from an index position within a JSON array and treating that like a property, but is there one?

I know it's a corner case in the JSON world, more about reading dense structured data. Right now I'm using swift-extras-json to convert this into JSONValues, and then writing a little parser to vet the returned structure and convert it into the struct format I'd like to use.
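
For illustration, the hand-vetting step looks something like this sketch (using Foundation's JSONSerialization here as a stand-in for the JSONValue step, and with made-up property names for the three positions):

import Foundation

// Invented placeholder type; the property names stand in for whatever
// positions 0, 1, and 2 actually mean in the real data.
struct Record {
    let first: Int
    let second: Int
    let label: String
}

func parseRecord(from data: Data) throws -> Record {
    // Parse to untyped values first, then vet the shape by hand.
    let raw = try JSONSerialization.jsonObject(with: data)
    guard let values = raw as? [Any],
          values.count == 3,
          let first = values[0] as? Int,
          let second = values[1] as? Int,
          let label = values[2] as? String
    else {
        throw CocoaError(.coderReadCorrupt)
    }
    return Record(first: first, second: second, label: label)
}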

You can do this by implementing init(from:) manually for your Decodable type and decoding from an unkeyed container.

import Foundation

struct PositionalStruct: Decodable {
    let a: Int
    let b: Int
    let c: String

    init(from decoder: Decoder) throws {
        var container = try decoder.unkeyedContainer()

        if container.count != 3 {
            throw DecodingError.typeMismatch(Self.self, .init(codingPath: decoder.codingPath, debugDescription: "Incorrect number of elements in array"))
        }

        a = try container.decode(Int.self)
        b = try container.decode(Int.self)
        c = try container.decode(String.self)
    }
}

let data = """
[0, 1, "A"]
""".data(using: .utf8)!
let decoder = JSONDecoder()
let decoded = try decoder.decode(PositionalStruct.self, from: data)
print(decoded)
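
Since this is an ordinary Decodable conformance, it also composes with the synthesized machinery, so a payload that is an array of these positional rows decodes with no extra work:

let rowsData = """
[[0, 1, "A"], [2, 3, "B"]]
""".data(using: .utf8)!
let rows = try decoder.decode([PositionalStruct].self, from: rowsData)
print(rows)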

this won't be of much immediate help to you but i've also run into this problem a lot, and i've been working on an API for swift-json that would make this use case a lot easier to express.

the PR is pretty much ready to merge. what i'm actually blocked on is that i don't really have the testing infrastructure to verify that the decoder emits the correct diagnostics, because any Error cannot be compared for equality, and it is very difficult to conform specific error types to Equatable because they often contain nested errors. this is not specific to swift-json, but if you have any advice on how to unit test error throwing in a scalable manner, that would be a big help 🙂
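
(for concreteness, a minimal sketch of this kind of test, reusing the PositionalStruct example from above: XCTAssertThrowsError can pattern-match on the DecodingError case, but the nested context still has to be checked field by field, which is the part that doesn't scale)

import Foundation
import XCTest

final class PositionalStructDecodingTests: XCTestCase {
    func testWrongElementCountThrows() {
        let data = Data("[0, 1]".utf8)
        XCTAssertThrowsError(
            try JSONDecoder().decode(PositionalStruct.self, from: data)
        ) { error in
            // DecodingError is not Equatable, so match on the case...
            guard case DecodingError.typeMismatch(_, let context) = error else {
                return XCTFail("unexpected error: \(error)")
            }
            // ...and check the diagnostics piecemeal.
            XCTAssertEqual(context.debugDescription, "Incorrect number of elements in array")
        }
    }
}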

With the details/reminder from @ksemianov about providing your own Decodable implementations, I was able to do exactly what I was after.

I'd already hand-cobbled a ruthless parser, but it's a hell of a lot easier to read with the decoder implementation structure. Interestingly, the Decodable-based decoders (even optimized libraries like ExtrasJSON and ZippyJSON) are notably slower than my brutalist parser.

I got a little overexcited and ran some new experiments with package-benchmark on this exploration:

Throughput (scaled / s)
 ╒════════════════════════════════════════╤═════╤═════╤══════╤═════╤═════╤═════╤══════╤═════════╕
 │ Test                                   │  p0 │ p25 │  p50 │ p75 │ p90 │ p99 │ p100 │ Samples │
 ╞════════════════════════════════════════╪═════╪═════╪══════╪═════╪═════╪═════╪══════╪═════════╡
 │ Custom parse JSON into trace           │  82 │  79 │   78 │  76 │  75 │  73 │   68 │     300 │
 ├────────────────────────────────────────┼─────┼─────┼──────┼─────┼─────┼─────┼──────┼─────────┤
 │ ExtrasJSON decode JSON into trace      │   6 │   6 │    6 │   6 │   6 │   5 │    5 │      29 │
 ├────────────────────────────────────────┼─────┼─────┼──────┼─────┼─────┼─────┼──────┼─────────┤
 │ Foundation decode JSON into trace      │   1 │   1 │    1 │   1 │   1 │   1 │    1 │       7 │
 ├────────────────────────────────────────┼─────┼─────┼──────┼─────┼─────┼─────┼──────┼─────────┤
 │ ZippyJSON decode JSON into trace       │   8 │   8 │    8 │   7 │   7 │   7 │    7 │      38 │
 ╘════════════════════════════════════════╧═════╧═════╧══════╧═════╧═════╧═════╧══════╧═════════╛
 Time (wall clock)
 ╒════════════════════════════════════════╤═════╤═════╤══════╤═════╤═════╤═════╤══════╤═════════╕
 │ Test                                   │  p0 │ p25 │  p50 │ p75 │ p90 │ p99 │ p100 │ Samples │
 ╞════════════════════════════════════════╪═════╪═════╪══════╪═════╪═════╪═════╪══════╪═════════╡
 │ Custom parse JSON into trace (ms)      │  12 │  13 │   13 │  13 │  13 │  14 │   15 │     300 │
 ├────────────────────────────────────────┼─────┼─────┼──────┼─────┼─────┼─────┼──────┼─────────┤
 │ ExtrasJSON decode JSON into trace (ms) │ 176 │ 176 │  177 │ 178 │ 180 │ 184 │  184 │      29 │
 ├────────────────────────────────────────┼─────┼─────┼──────┼─────┼─────┼─────┼──────┼─────────┤
 │ Foundation decode JSON into trace (ms) │ 823 │ 823 │  825 │ 825 │ 825 │ 825 │  825 │       7 │
 ├────────────────────────────────────────┼─────┼─────┼──────┼─────┼─────┼─────┼──────┼─────────┤
 │ ZippyJSON decode JSON into trace (ms)  │ 132 │ 132 │  133 │ 134 │ 134 │ 136 │  136 │      38 │
 ╘════════════════════════════════════════╧═════╧═════╧══════╧═════╧═════╧═════╧══════╧═════════╛

I don't actually need the speed of that hand-parsed code for my use case, but the comparison was interesting to see, since that's where I started.
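
For anyone curious, one of those benchmark cases has roughly the following shape with package-benchmark (a sketch rather than the exact benchmark source; decodeTrace and the sample data here are placeholders standing in for the real trace decoding):

import Benchmark
import Foundation

// Placeholder for the real decoding work being measured.
func decodeTrace(from data: Data) throws -> [PositionalStruct] {
    try JSONDecoder().decode([PositionalStruct].self, from: data)
}

let benchmarks = {
    Benchmark("Foundation decode JSON into trace") { benchmark in
        let data = Data("[[0, 1, \"A\"], [2, 3, \"B\"]]".utf8)
        for _ in benchmark.scaledIterations {
            // blackHole keeps the optimizer from discarding the result.
            blackHole(try? decodeTrace(from: data))
        }
    }
}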


Yep, Codable has a rather low maximum throughput no matter how good your underlying parser is.
