As someone who spends his evenings toying with Rust, one aspect of the language I marvel at is its robust macro facilities, which streamline significant boilerplate (e.g. trait derivations, type generation) while minimizing the number of features that need to be added to the core language.
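To make the boilerplate point concrete, here's a minimal sketch of Rust's built-in derive macros (the struct and values are just illustrative): without `#[derive]`, each of these trait implementations would need to be written out by hand.

```rust
// Derive macros generate the Debug, Clone, and PartialEq impls at
// compile time -- no hand-written boilerplate, no runtime reflection.
#[derive(Debug, Clone, PartialEq)]
struct Point {
    x: i32,
    y: i32,
}

fn main() {
    let p = Point { x: 1, y: 2 };
    let q = p.clone();                 // Clone, generated by derive
    assert_eq!(p, q);                  // PartialEq, generated by derive
    println!("{:?}", p);               // Debug, generated by derive
}
```

Custom derives (via procedural macros) extend the same mechanism to user-defined traits, which is why so much of the ecosystem (serde, thiserror, etc.) is built on it.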
With certain aspects of the Swift ecosystem now maturing (e.g. ABI stability, SwiftSyntax, SourceKit-LSP), have there been any recent discussions amongst the Core Team on whether to introduce a native metaprogramming model into Swift + what it might look like?
While I have no idea if it'll become "the way", here's some interesting ongoing work by Eugene Burmako (swift4tensorflow, google) that you might find relevant: Swift as syntactic sugar for MLIR (see the attached doc for details).
Hi Konrad! In mid-February, I joined a different team at Google Brain (https://twitter.com/eugene_burmako/status/1229825664032763904), and I haven't been working on Swift metaprogramming since then. Please reach out to @saeta for the current status of Swift for TensorFlow initiatives.