Differentiable programming for gradient-based machine learning

I don’t know if there’s a new thread now, but I just wanted everyone to know what @philipturner already read elsewhere:

There might actually be a way to move forward with ML/DL without a special autodiff toolchain. I'm not sure about the performance of my higher-level solution, but I'm going to assess that.
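(The post doesn't specify the approach, but one well-known way to do autodiff as a plain library, with no compiler support, is forward-mode differentiation via dual numbers. The sketch below is purely illustrative; the `Dual` type and its names are invented for this example and are not from the post.)

```swift
// Hypothetical sketch: forward-mode autodiff as an ordinary library type,
// requiring no special toolchain support.
struct Dual {
    var value: Double      // f(x)
    var derivative: Double // f'(x)

    static func + (a: Dual, b: Dual) -> Dual {
        Dual(value: a.value + b.value,
             derivative: a.derivative + b.derivative)
    }
    static func * (a: Dual, b: Dual) -> Dual {
        // Product rule: (fg)' = f'g + fg'
        Dual(value: a.value * b.value,
             derivative: a.derivative * b.value + a.value * b.derivative)
    }
}

// Differentiate f(x) = x*x + x at x = 3; f'(x) = 2x + 1, so f'(3) = 7.
let x = Dual(value: 3, derivative: 1) // seed dx/dx = 1
let y = x * x + x
print(y.value, y.derivative) // 12.0 7.0
```

A library approach like this trades the ergonomics and optimization opportunities of compiler-integrated differentiation for portability, which is presumably why the performance question the post raises matters.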

Regarding apples-to-apples comparisons, I would hope that a good library in a good language can identify the hardware it needs and then be written against that hardware. If we're going to assess the performance of a library, we should be using the hardware that the leaders of the field are using. I haven't worked much with Swift outside the Apple ecosystem and don't know how well that works by now, but I think it will be a requirement here.

It's been a little while, but to follow up on John's suggestion, I've created a separate Development thread in an attempt to aggregate information about ongoing implementation work for differentiable Swift. Hopefully that will help keep the discussion here focused on the pitch itself.