One of the efforts of the Swift for TensorFlow project has been to explore adding features like Automatic Differentiation to the Swift language. This is a powerful capability that can significantly enrich Swift’s potential as a programming language for scientific computing, numerics, and machine learning.
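To give a concrete flavor of the capability, here is a minimal sketch of what differentiable Swift code can look like under the experimental implementation. The exact spellings shown (the `_Differentiation` module name, the `@differentiable(reverse)` attribute, and the `gradient(at:of:)` helper) are illustrative and may well change as the design goes through evolution.

```swift
import _Differentiation  // experimental module; name and availability may change

// A plain Swift function marked as differentiable with respect to its input.
@differentiable(reverse)
func f(_ x: Double) -> Double {
    return x * x * x + 2 * x
}

// Ask for the compiler-generated derivative at a point:
// d/dx (x^3 + 2x) evaluated at x = 3 is 3 * 9 + 2 = 29.
let dfdx = gradient(at: 3.0, of: f)
print(dfdx)  // 29.0
```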
Speaking on behalf of the Swift Core Team, we are interested in evaluating the incorporation of this capability directly into the Swift language.
As usual, changes to Swift must go through the Swift Evolution process. Further, any proposed change requires an implementation that can be evaluated and understood, including its full impact on both (a) the feel of the Swift language and (b) the language’s implementation.
There are various ways to incubate an implementation of proposed changes to Swift, including using branches or landing the implementation directly on master behind various kinds of feature gates. Both approaches have been used in the past. For Automatic Differentiation, the Core Team is supportive of the second approach given the deep tie-ins with the rest of the compiler, the likely rapid iteration of the work, and the fact that overall development will benefit from testing and core development being done on master rather than on a long-lived parallel branch.
To summarize:
- An implementation of Automatic Differentiation will be added to the compiler, guarded under a flag (or flags) to indicate it is an experimental feature. This will be done directly on master.
- Once implementations are ready, each component of the feature will go through the Swift Evolution process. Until that time, the experimental feature will not be included in official Swift releases.
The implementation of Automatic Differentiation is likely to touch the compiler, the runtime, and Standard Library types. As the experimental implementation is staged in, we’ll figure out how best to factor the experimental nature of this feature into the impacted components.
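As one example of the Standard Library surface area involved, the sketch below shows a user-defined type conforming to a `Differentiable` protocol, with the compiler synthesizing the associated tangent vector type. Again, the module name, the protocol, and the synthesis behavior shown here reflect the experimental implementation and remain subject to change through the evolution process.

```swift
import _Differentiation  // experimental module; name and availability may change

// A user-defined type whose differentiable structure the compiler can
// synthesize, because all stored properties are themselves Differentiable.
struct Perceptron: Differentiable {
    var weight: Double
    var bias: Double

    @differentiable(reverse)
    func callAsFunction(_ x: Double) -> Double {
        return weight * x + bias
    }
}

// Gradient of the output with respect to the model's parameters.
let model = Perceptron(weight: 2.0, bias: 1.0)
let grads = gradient(at: model) { m in m(4.0) }
print(grads.weight, grads.bias)  // 4.0 1.0
```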