I've been taking a stab at SR-5982, and so far I've been able to detect whether or not `oldValue` is used. Whenever it isn't, I change the synthesised `didSet` signature to remove the `oldValue` parameter, and remove the corresponding argument list and the call to the getter/property access.
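For concreteness, here are the two observer shapes being distinguished (a minimal sketch; the class and property names are made up for illustration):

```swift
// Hypothetical example type; names are invented for illustration.
class Counter {
    // This observer reads oldValue, so the synthesised setter
    // must call the getter first to capture the previous value.
    var total: Int = 0 {
        didSet {
            print("total changed from \(oldValue) to \(total)")
        }
    }

    // This observer never mentions oldValue, so the extra getter
    // call (and the parameter itself) could be dropped.
    var count: Int = 0 {
        didSet {
            print("count is now \(count)")
        }
    }
}
```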
However, I'm now starting to wonder whether this would affect ABI stability. I'm thinking of a case like this:
```swift
// Module A
class Foo {
    var a: Int = 0 {
        didSet {
            print("I am using oldValue: \(oldValue)")
        }
    }
}

// Module B
class Bar: Foo {
    override var a: Int {
        didSet {
            print("Didn't use oldValue")
        }
    }
}
```
If module A is compiled before the optimisation and module B after it, the SIL generated for B wouldn't pass `oldValue` to the `didSet` in module A.
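To spell out why B's generated code interacts with A's observer at all: an override that only adds observers still routes the assignment through the superclass setter, so both `didSet`s fire on a single store, and the superclass observer is the one that needs `oldValue`. A runnable sketch with hypothetical names:

```swift
// Records observer firing order for demonstration purposes.
var events: [String] = []

class Base {
    // Uses oldValue, like Foo in module A.
    var value: Int = 0 {
        didSet { events.append("Base saw old \(oldValue)") }
    }
}

class Derived: Base {
    // Adds an observer without using oldValue, like Bar in module B.
    override var value: Int {
        didSet { events.append("Derived fired") }
    }
}

let d = Derived()
d.value = 42
// Base's observer runs first (inside the super setter),
// then Derived's observer runs.
print(events)
```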
Would it be possible instead to just pass a null value for `oldValue`? Is it possible to do this in the type checker at the stage where property observers are synthesised?