Differentiable programming for gradient-based machine learning

I'm going to say something that isn't directly relevant to the discussion in this thread, so we should probably make a new thread under Development to discuss it further. It would be helpful to plan for its implementation now instead of waiting until this SE proposal ships in release toolchains.

Pitch for a new thread

Making Array, Optional, Float, etc. differentiable is not part of this SE proposal, but I imagine it will come in a subsequent one. There is still a lot to decide:

- What convention we should adopt for differentiating functions that have jump discontinuities. I discussed with @scanon the idea of pretending we're differentiating the raw assembly instructions; see the sketch after this list.
- Beyond Array, there are many more collection types that could be differentiated. There has been progress on a DifferentiableCollection protocol, which might be possible to merge once its blocking compiler crashers are fixed.
- Many functions of floating-point types still do not have derivatives, and there is a hole in their test suite that lets a manually written derivative fail to propagate the pullback without being caught.
- The unofficial Differentiation Swift package, and its role in allowing differentiation of the standard library before a release toolchain officially supports it.
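To make the jump-discontinuity question concrete, here is a minimal sketch of registering a manual derivative for a non-differentiable floating-point operation, assuming the `_Differentiation` module available on development toolchains. The `roundedDown` wrapper and the zero-derivative convention are purely illustrative assumptions on my part, not anything that has been decided.

```swift
// A minimal sketch, assuming the `_Differentiation` module from a
// development toolchain. `roundedDown` is a hypothetical wrapper used
// only for illustration.
import _Differentiation

// `floor`-style rounding has a jump discontinuity at every integer.
// One possible convention (the "differentiate the raw assembly" view)
// is to treat the derivative as 0 almost everywhere.
func roundedDown(_ x: Float) -> Float {
    x.rounded(.down)
}

// Registering a manual derivative: the pullback must still be written
// and returned, even though it always produces 0 under this convention.
@derivative(of: roundedDown)
func vjpRoundedDown(_ x: Float) -> (value: Float, pullback: (Float) -> Float) {
    (value: roundedDown(x), pullback: { _ in 0 })
}

// Under the zero-derivative convention, the gradient is 0 everywhere.
let g = gradient(at: 2.7 as Float, of: roundedDown)  // 0.0
```

This is also the kind of place where the test-suite hole above matters: nothing forces a manual derivative like this to actually thread the pullback through, so a derivative that silently drops it can pass the existing tests.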

With my current skill set, I'm much more inclined to work on the standard library than to help push ABI stability forward. So I would be very interested in another thread dedicated exclusively to differentiation of standard library types. Is that discussion viable at the moment?