ExpressibleBy...Literal and Subtyping

Imagine the following code:

protocol Expression {}
struct Number: Expression {
    let value: Int
}
struct Product: Expression {
    let factors: [Expression]
}
extension Number: ExpressibleByIntegerLiteral {
    init(integerLiteral value: Int) {
        self.init(value: value)
    }
}
let p = Product(factors: [1, 2]) // won't compile: the literals default to Int, which doesn't conform to Expression
let p2 = Product(factors: [1 as Number, 2]) // this will compile

func makeNumberProduct(_ factors: Number...) -> Product {
    return Product(factors: factors)
}
let p3 = makeNumberProduct(1, 2) // this will also compile

So it looks like type inference for literals only works when the literal is converted to the exact type an expression expects, and not to a subtype of it. As soon as you add more context with explicit type hints (even if it's just on one element of, say, an array literal), it starts working again.
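
For completeness, these spellings also seem to give the compiler enough context (a quick sketch reusing the Number and Product types from above):

let numbers: [Number] = [1, 2]
let p4 = Product(factors: numbers)            // [Number] converts to [Expression]
let p5 = Product(factors: [1, 2] as [Number]) // same idea, written inline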

Is that a known limitation in the compiler? Is this something that can be worked around?

1 Like

Since the initializer expects an array of Expression existentials, its elements could be of *any* type that conforms to the Expression protocol. In your example only one of them is also ExpressibleByIntegerLiteral, but in general there could be arbitrarily many such types; indeed, the conformances could even be added retroactively to existing types.
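
To make this concrete, imagine a second, purely hypothetical conforming type:

// Hypothetical: another type that is both an Expression and
// expressible by an integer literal.
struct Fraction: Expression, ExpressibleByIntegerLiteral {
    let numerator: Int
    let denominator: Int

    init(numerator: Int, denominator: Int) {
        self.numerator = numerator
        self.denominator = denominator
    }

    init(integerLiteral value: Int) {
        self.init(numerator: value, denominator: 1)
    }
}

With such a type in scope, Product(factors: [1, 2]) would be genuinely ambiguous: should each literal become a Number or a Fraction?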

2 Likes

True, I see how this would be a problem. I guess that's a general problem with existentials?

What seems to work is to do something like

struct Product<T: Expression>: Expression {
    let factors: [T]
}

let p = Product<Number>(factors: [1, 2])

but that of course means that you're much more restricted in the types of products that you can create.
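
For example (a quick sketch with this generic version), factors of different kinds can no longer be mixed:

let inner = Product<Number>(factors: [1, 2])
// let mixed = Product(factors: [Number(3), inner])
// ^ won't compile: there is no single concrete Expression type
//   that covers both Number and Product<Number>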

I think I'm running into the expression problem here.

1 Like

Reconsidering, I do think the problem could in theory be made "solvable" in some way.
E.g. if you could somehow say

extension Expression: ExpressibleByIntegerLiteral {
    init(integerLiteral value: Int) {
        self = Number(value: value)
    }
}

then the compiler wouldn't need to guess the type. But I'm not sure what kind of implications something like that would have.

You might be interested in this previous thread on a similar topic:

1 Like

@yxckjhasdkjh I had the same problem to solve. My expressions are SQL expressions. And I want to let the user mix and match custom expressions with regular Swift values:

Player.select(Column("score") * 2)

Note that I don't use ExpressibleBy...Literal protocols. Instead, I use raw Int, String values, etc.

My solution was to define two protocols: one for specific expressions (columns, etc.), and one for expressions in general:

protocol SQLExpressible { ... }
protocol SpecificSQLExpressible: SQLExpressible { ... }

"Pure" sql types adopt SpecificSQLExpressible:

struct Column: SpecificSQLExpressible { ... }
struct Product: SpecificSQLExpressible { ... }

Regular Swift values adopt SQLExpressible:

extension Int: SQLExpressible { ... }
extension String: SQLExpressible { ... }
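
To give a rough idea of the shape (a simplified sketch, not GRDB's actual definitions; the requirement name below is made up for illustration):

// Hypothetical minimal version of the two protocols.
protocol SQLExpressible {
    // Hypothetical requirement: anything that can be rendered as a SQL fragment.
    var expressionSQL: String { get }
}

protocol SpecificSQLExpressible: SQLExpressible { }

// How a regular Swift type can participate without any
// ExpressibleBy...Literal conformance:
extension Int: SQLExpressible {
    var expressionSQL: String { return String(self) }
}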

When I want to derive an expression from two other ones, I define three variants:

func * (lhs: SpecificSQLExpressible, rhs: SpecificSQLExpressible) -> Product {
    return Product(lhs, rhs)
}

func * (lhs: SpecificSQLExpressible, rhs: SQLExpressible) -> Product {
    return Product(lhs, rhs)
}

func * (lhs: SQLExpressible, rhs: SpecificSQLExpressible) -> Product {
    return Product(lhs, rhs)
}

Note that the (SQLExpressible, SQLExpressible) variant is avoided, because it would pollute the API of regular types (Int, String, etc.).

You get:

Column("score") * 2                     // Product
2 * Column("score")                     // Product
Column("score") * Column("multipliier") // Product
2 * 2                                   // 4

Maybe this could help you,

1 Like

That's interesting. I've thought about doing something similar. The problem is that, for a DSL, I'd like to be able to write something like 2 * 2 in a context where an Expression is expected, but as you said, enabling that would needlessly pollute the standard types. Having to write 2 * Number(2) or Number(2) * 2 somewhat at random would hurt the readability of the DSL; I think it would be strictly worse than Number(2) * Number(2).

I think in your use case it works fine because the SpecificSQLExpressible types are probably used more often than the others.

You are right: our use cases overlap, but are not identical.

And you'll have to decide if Number(1) * Number(2) is better or worse than Product(factors: [1, 2]) :sweat_smile:

As a matter of fact, operators do not come without flaws, because Swift operator precedence may interfere with the DSL. For example, GRDB supported a && b && c very early on, but I eventually had to add [a, b, c].joined(operator: .and) for the cases where precise SQL generation is desired.
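
To illustrate (a rough sketch): Swift's && is left-associative, so a && b && c is always parsed as nested pairs, whereas the collection form hands the whole conjunction to the library at once:

import GRDB

// Sketch: three boolean expressions on hypothetical columns.
let a = Column("score") >= 0
let b = Column("score") < 1000
let c = Column("bonus") > 0

// Swift parses `&&` left-associatively, so this is ((a && b) && c):
// the DSL only ever sees nested pairs of operands.
let nested = a && b && c

// The collection form describes one flat AND chain, which leaves more
// room to control the exact SQL that gets generated.
let flat = [a, b, c].joined(operator: .and)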