As before, I generally support the careful thought of people who understand the problem far better than I do, and ultimately defer to their judgement with an uncertain +1. As before, I tend to like the `some T` syntax (especially in relation to `any T`); I find that terminology useful as I puzzle my way through this.

As before, I have the nagging feeling that something doesn't quite sit right, and a second nagging feeling that it's only my short lookahead on the whole generics adventure that gives me the first feeling. I hope you don't mind me asking a bunch of muddled questions about where this is heading…
The big “Improving the UI of Generics” post was tremendously helpful. (Thank you, Joe!) This example in particular (paraphrased) was compelling:
```swift
func bar(x: Collection, y: Collection) {
    // not type-safe, since the existential Comparable might not match
    // the existential Collection's Index type
    var start = x.startIndex
    // somebody could do this and change `start`'s dynamic type:
    // start = y.startIndex
    var firstValue = x[start] // error
}
```
That snapped into focus for me why existentials lead to pitfalls with associated types, and why the “same type” guarantee of opaque matters in practice for tasks that aren’t contrived or esoteric.
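To test my own understanding of that guarantee, here are two small counterpoint sketches (my code, my names, in shipping syntax rather than the hypothetical `any`/`some`): a "forward generics" version where distinct type parameters turn the bad assignment into a compile-time error, and an opaque-result version where the compiler knows the hidden type is one fixed type, so collection and index stay mutually consistent:

```swift
// 1. Forward generics: each collection has its own type parameter, so each
//    startIndex is statically pinned to the collection it came from.
func bar<C: Collection, D: Collection>(x: C, y: D) -> C.Element {
    var start = x.startIndex
    // start = y.startIndex  // error: cannot assign value of type 'D.Index' to 'C.Index'
    return x[start]          // type-safe: `start` can only index x
}

// 2. Opaque result type (Swift 5.1): callers never learn the concrete type,
//    but the compiler knows it is ONE fixed type, so indices still line up.
func makeNumbers() -> some Collection {
    return [1, 2, 3]  // concretely [Int]; hidden from callers
}

let numbers = makeNumbers()
let second = numbers.index(numbers.startIndex, offsetBy: 1)
let value = numbers[second]  // type-safe: `second` is known to index `numbers`
```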
That `bar` example also raises many more questions! I wonder, for example, about this code (using the hypothetical `any`/`some` syntax), which, if I understand correctly, would not compile:
```swift
func nthOfEach(
    _ n: Int,
    from heterogeneousCollections: [any Collection]
) -> [Any] {
    return heterogeneousCollections.map { collection in  // collection is `any Collection`
        let nthIndex = collection.index(collection.startIndex, offsetBy: n)
        // error: the inferred static type of nthIndex is something like `any Collection.Index`,
        // so the compiler can't guarantee it indexes `collection` … correct?
        return collection[nthIndex]
    }
}
```
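For contrast, the homogeneous version should compile even today, I believe, because the single type parameter ties every computed index to its own collection, at the cost of the heterogeneity the question is about (again, my sketch, with a distinct name to avoid confusion):

```swift
// Homogeneous: one type parameter C covers every element of the array, so
// each computed index provably belongs to the collection it was derived from.
func nthOfEachHomogeneous<C: Collection>(_ n: Int, from collections: [C]) -> [C.Element] {
    return collections.map { collection in
        collection[collection.index(collection.startIndex, offsetBy: n)]
    }
}

// nthOfEachHomogeneous(1, from: [[1, 2, 3], [4, 5, 6]]) == [2, 5]
```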
Would this then also not compile, for the same reasons?
```swift
return heterogeneousCollections.map { collection in
    return collection[collection.index(collection.startIndex, offsetBy: n)]
}
```
…or would it be reasonable for the compiler to infer types something like this so that it did compile?
```swift
return heterogeneousCollections.map { (collection: some Collection) in
    return collection[collection.index(collection.startIndex, offsetBy: n)]
}
```
Could the compiler then infer the type of `nthIndex` to make even the first example compile?
```swift
return heterogeneousCollections.map { (collection: some Collection) in
    let nthIndex: some Collection.Index<…inferred constraints to match collection…> =
        collection.index(collection.startIndex, offsetBy: n)
    return collection[nthIndex]
}
```
Should it make that inference even if `nthIndex` were a `var`?
I ask these questions for two reasons. First, the focus on opaque result types alone seems to miss the big picture: the result is still a dead end in the type system, just shifted back one step in the great chain of value passing. Despite the manifesto's big picture, I'm nervous that this proposal is hill-climbing instead of optimizing globally.
Second, even with the manifesto's big picture in view, my last two examples seem to fit the intuitive spirit of opaque types, but not the particular "reverse generics" model. They do seem like cases that opaque `some T` types should cover: there is a particular collection type, it has a particular index type, we don't know what those types are, but they are still stable and mutually consistent. Does the "reverse generics" model still fit, though? Consider:
```swift
for collection in heterogeneousCollections {
    print(collection[collection.index(collection.startIndex, offsetBy: n)])
}
```
This is correct code, and it would be nice if it type-checked without fuss or muss, but `collection` needs to take on a different `some Collection` type for each iteration of the loop. It would be intuitively reasonable for the language to do this, but I can't imagine what it looks like under the hood for the compiler!
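For what it's worth, the one workaround I can see is to push the index math into a protocol extension, where `Self` is a single concrete type per call, so the loop body never handles a loose index. (A sketch, assuming `any Collection` existentials are allowed at all, which they are in Swift 5.7 and later; `element(atOffset:)` is my own helper name, not standard library API.)

```swift
extension Collection {
    // Inside a protocol extension, Self is one concrete type, so startIndex,
    // the offset index, and the subscript are guaranteed mutually consistent.
    func element(atOffset n: Int) -> Element {
        return self[index(startIndex, offsetBy: n)]
    }
}

// Swift 5.7+ accepts this: the associated types never escape individually,
// so each iteration's index math happens inside one concrete type.
let heterogeneousCollections: [any Collection] = [[1, 2, 3], ["a", "b", "c"]]
for collection in heterogeneousCollections {
    print(collection.element(atOffset: 1))  // prints 2, then b
}
```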