Swift pointers to value types

In traditional metaprogramming ML frameworks, optimizer APIs have never been a hard problem, because an optimizer builder always runs a host-language loop to build graph components for each parameter update at runtime. In Swift for TensorFlow, however, there is currently no way to build generalized ML optimizers, because the TensorFlow extension is a "non-metaprogramming ML framework". Beyond synthesizing a computed property getter/setter and supporting the `inout` pattern, we also need the compiler to reliably unroll loops like `for inout theta in parameters`. Here are some of the problems:

  1. If we replace floats with tensors in the snippet I gave above, we'd expect Graph Program Extraction to turn each parameter-update statement into TensorFlow graph operators without generating any send/receive communication with the host Swift program. This requires the compiler to unroll the loop before the partitioning phase of Graph Program Extraction, and the foundation for that is compile-time-evaluable constant expressions.

  2. Not all parameter tensors have the same element type. So if we had computed property synthesis via `@parameter`:

    struct Model : Parameterized {
      @parameter var v1: Tensor<Float32>
      @parameter var v2: Tensor<BFloat16>
    }
    // ...
    for (inout theta, g) in zip(model.parameters, gradients) {
      theta -= g * learningRate
    }
    

    We would expect the type of each parameter to be a protocol existential type, generalized over math and other tensor operations. This alone requires generalized existentials, and to support existentials we would also need to extend the constant expression model further.
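
To make the goal concrete, here is a minimal, hedged sketch in today's Swift of what a generalized update step could look like. The `Parameter` protocol and `update` function below are hypothetical (they are not part of Swift for TensorFlow), and plain `Float`/`Double` stand in for `Tensor<Float32>`/`Tensor<BFloat16>`; with generalized existentials, a model's parameter collection could instead hold heterogeneous existential values conforming to such a protocol:

```swift
// Hypothetical protocol generalizing the math operations an optimizer
// needs; the name and requirements are illustrative only.
protocol Parameter {
  static func -= (lhs: inout Self, rhs: Self)
  static func * (lhs: Self, rhs: Self) -> Self
  init(_ value: Double)
}

// Plain floating-point types stand in for tensors of different
// element types (e.g. Tensor<Float32>, Tensor<BFloat16>).
extension Float: Parameter {}
extension Double: Parameter {}

// A generic SGD-style update step. With generalized existentials,
// this could iterate a heterogeneous collection of `Parameter` values.
func update<P: Parameter>(_ theta: inout P, gradient: P, learningRate: Double) {
  theta -= gradient * P(learningRate)
}

var w: Float = 1.0
update(&w, gradient: 0.5, learningRate: 0.1)
// w is now approximately 1.0 - 0.5 * 0.1 = 0.95
```

This only works today because generics are resolved per concrete type at each call site; expressing the same loop over a mixed collection of parameters is exactly what requires generalized existentials.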

Therefore, high-level ML APIs, including optimizers, are non-trivial in Swift for TensorFlow. We decided to remove the optimizer prototype to make clear that we have not yet solved the ML optimizer problem, and that the project must focus on the fundamental technology, including the building blocks mentioned above, to make generalized high-level APIs possible in the future.
