Protocols question: how can I achieve covariance on a requirement's implementation?

Why does the following not compile:

class A {}

protocol P {
    var a: A { get set }
}

final class B: A {}

final class X: P {      // Error: Type 'X' does not conform to protocol 'P'
    var a: B               // Error: Candidate has non-matching type 'B'
    init(a: B) {
        self.a = a
    }
}

Class B is a subclass of class A. The Liskov Substitution Principle says that a B can be used anywhere an A is expected. So... why doesn't this compile?

I can "fix" this by adding:

extension P where Self: X {
    var a: A {
        get {
            self.a
        } 
        set {
            self.a = newValue    // compiles, but crashes if it actually gets called
        }
    }
}


var x: P = X(a: B())
x.a = B() // crashes!

Why does this compile? It does allow X to satisfy P, but now if the setter gets called it will crash whenever newValue isn't actually a B.

It can be worked around with a guard statement in the setter that returns early if newValue as? B fails, but since the compiler knows self.a is of type B, it seems like this should fail to compile without that guard.

Note: if P is made an @objc protocol and a is made an optional requirement, then rather than crashing, the compiler states that a is immutable, even though it is declared get set:

import Foundation

class A: NSObject {}

@objc protocol P {
    @objc optional var a: A { get set }
}

class B: A {}

@objcMembers
class X: P {  
    var a: B      
    init(a: B) {
        self.a = a
    }
}


var x: P = X(a: B())
x.a = B() // Compiler error: Cannot assign to property: 'x' is immutable

This line is the problematic one. Because a has a get set requirement in P, your attempted protocol conformance would allow setting X.a (of type B!) to a value of type A, or even of some other subclass class C: A. And that is clearly a type error.

Put differently, you were asking for covariance, but only get-only properties could be covariant; a set requirement makes the property contravariant as well, which is why a must be invariant in P.
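
To make that concrete, here is a sketch of what the set requirement obliges X to accept once it is treated as a P. It builds on the declarations above (including the extension that makes X conform) plus a hypothetical class C that is not in your code:

class C: A {}

var p: P = X(a: B())
p.a = A()   // type-checks against P's interface, but X has no B-typed place to store a plain A
p.a = C()   // likewise: a C is an A, but it is not a B

Both assignments are perfectly legal as far as P is concerned, yet neither value fits X's B-typed property; that is the hole the compiler closes by rejecting the conformance.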


Note that Swift protocol requirements don't support covariance yet. It wouldn't work even if a were get-only; X.a still needs to have the exact same type as P.a. There may be some exceptions around @objc protocols, though I don't know the details.
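
In the meantime, the usual workaround for the get-only case is to store the more specific type under a different name and satisfy the requirement with an exactly-typed computed property. A minimal sketch, with hypothetical names P2, X2, and b, reusing A and B from your post:

protocol P2 {
    var a: A { get }
}

final class X2: P2 {
    var b: B                   // the concrete, more specific value
    var a: A { b }             // exact-type witness for the requirement
    init(b: B) { self.b = b }
}

let x2: P2 = X2(b: B())
print(x2.a)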


I was able to "fix" this with the following extension:

extension P where Self: X {
    var a: A {
        get {
            self.a
        } 
        set {
            guard let a = newValue as? B else { 
                return
            }
            self.a = a  
        }
    }
}

Now, it won't crash when setting a.
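
With that in place, the tradeoff is that invalid sets are now silently dropped instead of trapping:

var x: P = X(a: B())
x.a = B()   // stored as before
x.a = A()   // fails the guard and is silently ignored

Whether silently ignoring a set is better than crashing is a design call, but it shows the conformance can only pretend to accept any A.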

I think this illustrates why covariance can't be supported for the {get set} case. If a consumer of the API who has an instance c of class C: A gets passed an instance p of existential type P whose underlying type is X, and then they call p.a = c, this obviously can't work because X.a is of type B (not C).

However, for the { get }-only case, I feel that Swift should allow covariance over the a: A requirement because of the Liskov Substitution Principle. I.e., this should compile (but doesn't):

class A {}

protocol P {
    var a: A { get }
}

class B: A {}

final class X: P {    
    var a: B           
    init(a: B) {
        self.a = a
    }
}


var x: P = X(a: B())
print(x.a) 

Of course there are a number of easy workarounds to this limitation, but I'm not sure why Swift cares here. If there were a visibility or access-level difference between A and B it would make sense, but since they're all at the same level, I'm curious why it has to be this way. (I'm sure there's a good reason, but it eludes my powers of inference :D)
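
One such workaround (with made-up names P3, AValue, X3) is to express the covariance through a constrained associated type rather than a fixed property type, and to go through generics instead of the plain existential:

protocol P3 {
    associatedtype AValue: A
    var a: AValue { get }
}

final class X3: P3 {
    var a: B                   // AValue is inferred as B, which satisfies AValue: A
    init(a: B) { self.a = a }
}

func printA<T: P3>(_ p: T) {
    let value: A = p.a         // still usable as an A at the use site
    print(value)
}

printA(X3(a: B()))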

It should. IIRC, it's just that nobody has gotten around to implementing it.


…and it would now be a breaking change to do so. :sob: Something to consider for Swift 6! (The tracking issue is SR-522.)


Curious: why would it be a breaking change? It seems like the only effect would be to make code compile that previously couldn't, but I'm sure, as always, there's a hidden gotcha that I'm not seeing. What is it?

It’s possible that a protocol requirement could be provided by a default implementation, which in turn is used to infer an associated type from one of its arguments. If that requirement now matches the covariant implementation in the adopting type, the associated type could be inferred differently.
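
Something like this sketch, with hypothetical names (Q, Value, Y) and reusing A and B from above:

protocol Q {
    associatedtype Value
    func make(from value: Value) -> A
}

extension Q {
    // Default implementation; today it is the only valid witness for types
    // like Y below, so Value is inferred as String.
    func make(from value: String) -> A { A() }
}

struct Y: Q {
    // Today this is not an exact match for the requirement (it returns B, not A),
    // so the default above is used and Y.Value == String.
    // If covariant witnesses were accepted, this method would satisfy the
    // requirement instead and Y.Value would silently become Int, breaking any
    // code that relies on Y.Value == String.
    func make(from value: Int) -> B { B() }
}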

I feel like there was a simpler example too but it’s escaping me at the moment.


As always, the devil is in the details. Thanks, Jordan.
