Trajectory for evaluating adding Automatic Differentiation to Swift

Update: nearly all code related to differentiable programming in Swift has been upstreamed to the apple/swift master branch. Differentiable programming features are available in master toolchains, gated behind import _Differentiation:

import _Differentiation
import Darwin

func foo(_ x: Float) -> Float {
  cos(x) + sin(x)
}
print(gradient(at: 10, in: foo)) // -0.29505038
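The same API surface also exposes valueWithGradient, which returns the function's value and its derivative together. A minimal sketch, assuming a master toolchain with the _Differentiation module (the square function below is our own illustration, not from the post):

```swift
import _Differentiation

// A user-defined function; differentiation composes with ordinary Swift code.
func square(_ x: Float) -> Float {
  x * x
}

// valueWithGradient(at:in:) evaluates square(3) and d/dx x^2 = 2x at x = 3.
let (value, grad) = valueWithGradient(at: 3, in: square)
print(value) // 9.0
print(grad)  // 6.0
```

This avoids recomputing the forward pass when both the value and the gradient are needed, which is the common case in optimization loops.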

Projects like Swift for TensorFlow, apple/swift-numerics, and borglab/SwiftFusion are using and experimenting with these features.

We're continuing differentiable programming development on apple/swift master branch, and we'll upstream remaining AutoDiff tests over time.


It took us a few months to upstream all our code from the apple/swift tensorflow branch to the master branch.

Thank you to everyone who helped review our code! Folks like @codafi and @Michael_Gottesman gave much-appreciated suggestions for various code improvements; we'll stay in touch and work on addressing them.

Currently, the Swift Differentiable Programming Manifesto remains our main documentation. We plan to start writing usage documentation soon, as well as posting monthly progress updates with the #autodiff tag. As @rxwei says, maybe differentiable programming will be ready for Swift Evolution next year.

Let us know if you have any questions. Cheers!
