Status of generalized existentials

Here's a concrete use case demonstrating that the design of generalized machine learning APIs is blocked by the lack of generalized existentials (there are other language limitations, but GE is a big one).

Given a data structure representing a parameterized model, we need a collection of all its parameters, where each element is partially type-erased yet still supports math operations.

// Hypothetical generalized-existential syntax; not expressible in Swift today.
typealias AnyFloatingPointTensor = TensorProtocol where Scalar : FloatingPoint

struct MyMLModel {
  var w1, b1, w2, b2: Tensor<Float>
  var w3, b3: Tensor<BFloat16>
  var allParameters: [AnyFloatingPointTensor] {
    return [w1, b1, w2, b2, w3, b3]
  }
}
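Since the typealias above is not expressible today, the closest workaround is a hand-written type-erasing wrapper. Below is a minimal sketch: `TensorProtocol` and `Tensor` are toy stand-ins (a tensor is just an array of scalars), `Double` substitutes for `BFloat16` (which is not in the standard library), and every operation the erased value must support has to be captured as a closure at construction time, which is exactly the boilerplate generalized existentials would eliminate.

```swift
// Toy stand-ins for the post's tensor types, for illustration only.
protocol TensorProtocol {
    associatedtype Scalar: FloatingPoint
    var scalars: [Scalar] { get set }
}

struct Tensor<Scalar: FloatingPoint>: TensorProtocol {
    var scalars: [Scalar]
}

// Hand-rolled erasure: one stored closure per supported operation.
struct AnyFloatingPointTensor {
    private var base: Any
    private let step: (Any, Double) -> Any

    init<T: TensorProtocol>(_ tensor: T) where T.Scalar: BinaryFloatingPoint {
        base = tensor
        step = { erased, rate in
            var t = erased as! T
            let r = T.Scalar(rate)
            for i in t.scalars.indices {
                t.scalars[i] -= t.scalars[i] * r  // θ -= θ * learningRate
            }
            return t
        }
    }

    mutating func applyStep(learningRate: Double) {
        base = step(base, learningRate)
    }

    func unwrapped<T: TensorProtocol>(as type: T.Type) -> T? { base as? T }
}

var parameters: [AnyFloatingPointTensor] = [
    AnyFloatingPointTensor(Tensor<Float>(scalars: [2, 4])),
    AnyFloatingPointTensor(Tensor<Double>(scalars: [10])),
]
for i in parameters.indices {
    parameters[i].applyStep(learningRate: 0.5)
}
print(parameters[0].unwrapped(as: Tensor<Float>.self)!.scalars)   // [1.0, 2.0]
print(parameters[1].unwrapped(as: Tensor<Double>.self)!.scalars)  // [5.0]
```

Note that the wrapper can only expose the fixed menu of operations baked in at `init`; a real math API would need dozens of such closures, which is why this approach doesn't scale.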

Then, in the optimizer code, the user or library designer could write:

// Hypothetical: 'for inout' (in-place mutation of elements) is not valid Swift today.
for inout θ in model.allParameters {
  θ -= θ * learningRate
}
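Lacking `for inout`, the usual pattern today is to mutate through indices or writable key paths. Here is a sketch under a simplifying assumption: all parameters share one scalar type (`Float`), with `SimpleModel` and `parameterPaths` as hypothetical names. The heterogeneous scalar types in the model above are precisely what this pattern cannot handle without generalized existentials.

```swift
// Key-path-based in-place update, valid in today's Swift, but only for a
// model whose parameters all have the same concrete type.
struct SimpleModel {
    var w1: [Float] = [2, 4]
    var b1: [Float] = [6]

    // Writable key paths give mutable access to each parameter.
    static let parameterPaths: [WritableKeyPath<SimpleModel, [Float]>] = [\.w1, \.b1]
}

var model = SimpleModel()
let learningRate: Float = 0.5
for path in SimpleModel.parameterPaths {
    for i in model[keyPath: path].indices {
        model[keyPath: path][i] -= model[keyPath: path][i] * learningRate
    }
}
print(model.w1)  // [1.0, 2.0]
print(model.b1)  // [3.0]
```

Writable key paths sidestep the value-semantics problem (the loop mutates the model's stored properties directly rather than copies), but the element type of `parameterPaths` forces homogeneity, which brings us back to needing a constrained existential.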
