Found Swift through Swift for TensorFlow. Programming background: Python, C++, Haskell.

I am going to tag this as "off-topic" with hopes that people can steer me in the right direction.

I quickly read through the syntax guide and figured out how to deal with higher-order functions, a Functor protocol, simple generics, pattern matching, etc. The language looks really fun.
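
For example, something along these lines (a toy sketch combining a generic higher-order function with pattern matching):

enum Shape {
    case circle(radius: Double)
    case rectangle(width: Double, height: Double)
}

// Generic higher-order function: map each shape's area through `transform`.
func mapAreas<T>(_ shapes: [Shape], _ transform: (Double) -> T) -> [T] {
    return shapes.map { (shape) -> T in
        switch shape {
        case .circle(let radius):
            return transform(Double.pi * radius * radius)
        case .rectangle(let width, let height):
            return transform(width * height)
        }
    }
}

let labels = mapAreas([.circle(radius: 1), .rectangle(width: 2, height: 3)]) { "area = \($0)" }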

I am curious about folks doing data science with Swift who have come from Python. What libraries are you using to replace numpy, scipy, pandas? Even if they are incomplete. A big deal for me is that they don't depend on Accelerate (on Linux). I found Nifty, but development looks stalled.

Lastly, is anyone thinking about a cross-platform implementation of Accelerate? I don't think this is something I could do myself, but it would be really useful from a usability standpoint. For me personally, the vDSP library would be immensely useful.

4 Likes

This would be quite an undertaking. It's not impossible, but (parts of) Accelerate predates OS X; it represents 20+ years of development by a dedicated team. Pieces of it have good open-source analogues: OpenBLAS is a pretty mature implementation of the BLAS. The open-source LAPACK project is high-quality and has C bindings (and here you have a small advantage over Accelerate, which can't take breaking API changes and so is fixed on an older version). FFTW is an option for FFTs if the license is compatible with your projects. Another alternative is Intel's MKL libraries, which are available for multiple platforms.

There are also some pieces of Accelerate, like vImage, that don't have a real open-source analogue and which would be a significant challenge to re-implement.

All of these things have C APIs, which import easily, if not idiomatically, into Swift. Defining Swift interfaces for these types of operations is something that a number of people on the forums here are interested in.
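
For instance, calling into a CBLAS-style interface looks roughly like this (a rough sketch; "CBLAS" is just a placeholder for however you expose the header to Swift, e.g. a system-library target wrapping OpenBLAS on Linux, or Accelerate on macOS):

import CBLAS  // placeholder module exposing the cblas_* symbols

let x: [Double] = [1, 2, 3]
let y: [Double] = [4, 5, 6]

// double cblas_ddot(const int N, const double *X, const int incX,
//                   const double *Y, const int incY);
let dot = cblas_ddot(Int32(x.count), x, 1, y, 1)
print(dot)  // 32.0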

You can also fall back on using NumPy via Python imported into Swift; if you're coming from a Python background, that should feel pretty comfortable.
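
With the Swift for TensorFlow toolchain that looks roughly like this (a quick sketch using its Python interop module):

import Python

let np = Python.import("numpy")
let a = np.array([[1.0, 2.0], [3.0, 4.0]])
print(np.linalg.inv(a))  // NumPy does the work; Swift just drives it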

Looking forward, the Swift roadmap for these computational needs should have two complementary pieces:

  • Define Swift abstractions that let us wrap existing libraries to provide useful functionality in Swift, with approachable interfaces and without unnecessary abstraction costs.
  • Define the building blocks that allow us to write these types of libraries in Swift itself.
10 Likes

Hi Steve,
This is an off-topic reply to an off-topic thread, but I'm wondering whether you can elaborate on the history of Accelerate that predates OS X. I'm just interested for my own curiosity. I of course understand if it's not possible for you to say more.
Thanks for the detailed explanation,
Dave

Yes, I've implemented it.

1 Like

Accelerate grew out of vecLib, which originally shipped on Mac OS 9.something. Some of the APIs defined by the component libraries are much older than that, of course; vDSP evolved from Mercury Systems' SAL, and parts of the BLAS API date back to the 1970s.

3 Likes

This is really helpful. I think I would like something like Surge (GitHub - mattt/Surge) with MKL in the back-end. I am familiar enough with MKL to be able to port it (I think).
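
Roughly the kind of thin wrapper I mean (a hypothetical sketch; "CBLAS" stands in for whatever module ends up exposing the MKL/CBLAS symbols):

import CBLAS  // placeholder for an MKL- or OpenBLAS-backed module

// Idiomatic Swift entry points over the C calls, Surge-style.
public func absSum(_ x: [Double]) -> Double {
    return cblas_dasum(Int32(x.count), x, 1)  // sum of |x[i]|
}

public func dot(_ x: [Double], _ y: [Double]) -> Double {
    precondition(x.count == y.count, "vectors must have the same length")
    return cblas_ddot(Int32(x.count), x, 1, y, 1)
}

public func scaled(_ x: [Double], by alpha: Double) -> [Double] {
    var result = x
    cblas_dscal(Int32(result.count), alpha, &result, 1)  // scale the copy in place
    return result
}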

You can also fall back on using NumPy via Python imported into Swift; if you're coming from a Python background, that should feel pretty comfortable.

Part of my curiosity is what it would look like to do all my work in Swift if Swift for TensorFlow takes off.

This is interesting. Have you done benchmarks comparing this to Surge (GitHub - mattt/Surge)?

There is a Benchmark playground that compares my FFT implementations with Apple's vDSP framework.

These "playgrounds" are Xcode only right? I don't own a Mac unfortunately.

The results from my Mac mini:

Real to complex FFT:

let t1 = HalfRadix2CooleyTukey_OutPlace_Benchmark(n)
let t2 = HalfRadix2CooleyTukey_InPlace_Benchmark(n)
let t3 = RealRadix2CooleyTukey_Benchmark(n)
let t4 = vDSP_fft_zropD_Benchmark(n)
let t5 = vDSP_fft_zripD_Benchmark(n)

n | t1 | t2 | t3 | t4 | t5
2 | 1.7e-06 | 3e-07 | 4e-07 | 1.2e-06 | 1e-06
3 | 1.5e-06 | 9e-07 | 1.9e-06 | 1.9e-06 | 1.7e-06
4 | 9e-07 | 1.5e-06 | 1.2e-06 | 2.4e-06 | 1.8e-06
5 | 1.7e-06 | 1.9e-06 | 2.2e-06 | 2.7e-06 | 1.7e-06
6 | 2.4e-06 | 2.3e-06 | 2.4e-06 | 2.2e-06 | 1.8e-06
7 | 3.3e-06 | 3.6e-06 | 3.4e-06 | 2.9e-06 | 1.6e-06
8 | 5.4e-06 | 5.7e-06 | 5.8e-06 | 3.8e-06 | 2.5e-06
9 | 1.03e-05 | 1.02e-05 | 1.09e-05 | 6.3e-06 | 2.9e-06
10 | 2.09e-05 | 2.09e-05 | 2.21e-05 | 1.14e-05 | 4.6e-06
11 | 4.08e-05 | 4.18e-05 | 4.33e-05 | 2.29e-05 | 7.7e-06
12 | 7.64e-05 | 6.91e-05 | 6.71e-05 | 3.53e-05 | 1.07e-05
13 | 0.0001656 | 0.0001245 | 0.000131 | 0.0001217 | 4.48e-05
14 | 0.0003213 | 0.0002793 | 0.000299 | 0.0002231 | 5.74e-05
15 | 0.000617 | 0.000587 | 0.0006297 | 0.0004151 | 0.0001393
16 | 0.0013534 | 0.001573 | 0.001642 | 0.0014319 | 0.0003993
17 | 0.0035048 | 0.0034715 | 0.0039751 | 0.002855 | 0.0007274
18 | 0.0072657 | 0.006917 | 0.0077349 | 0.0087304 | 0.0026859
19 | 0.0201585 | 0.0213407 | 0.0208329 | 0.0202306 | 0.0063031
20 | 0.0444022 | 0.0405487 | 0.0452625 | 0.048863 | 0.0182632

Complex to complex FFT:

let t1 = Radix2CooleyTukey_OutPlace_Benchmark(n)
let t2 = Radix2CooleyTukey_InPlace_Benchmark(n)
let t3 = vDSP_fft_zopD_Benchmark(n)
let t4 = vDSP_fft_zipD_Benchmark(n)

n | t1 | t2 | t3 | t4
2 | 1.8e-06 | 1.5e-06 | 1.6e-06 | 1.5e-06
3 | 1.4e-06 | 1.3e-06 | 1.6e-06 | 1.5e-06
4 | 1.5e-06 | 1.5e-06 | 5.3e-06 | 1.8e-06
5 | 2.1e-06 | 2.1e-06 | 1.7e-06 | 1.7e-06
6 | 3.6e-06 | 2.9e-06 | 2.2e-06 | 2.2e-06
7 | 4.7e-06 | 4.9e-06 | 2.5e-06 | 2.1e-06
8 | 9.5e-06 | 9.3e-06 | 3.5e-06 | 2.7e-06
9 | 1.99e-05 | 1.79e-05 | 6.1e-06 | 3.7e-06
10 | 3.68e-05 | 3.44e-05 | 6.4e-06 | 5.9e-06
11 | 7.07e-05 | 6.89e-05 | 1.25e-05 | 1.17e-05
12 | 0.0001497 | 0.0001383 | 2.68e-05 | 2.32e-05
13 | 0.0003311 | 0.0002754 | 5.26e-05 | 4.64e-05
14 | 0.0005857 | 0.000644 | 0.0001233 | 0.0001098
15 | 0.0014172 | 0.0015102 | 0.0003206 | 0.0002916
16 | 0.0031202 | 0.0031418 | 0.0008572 | 0.000752
17 | 0.0073284 | 0.0078984 | 0.0021658 | 0.002151
18 | 0.0189358 | 0.0184516 | 0.0060933 | 0.0069644
19 | 0.040065 | 0.0430286 | 0.0156583 | 0.0146951
20 | 0.0918737 | 0.0861769 | 0.0341369 | 0.0365635
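
For anyone curious where numbers like these come from, the general shape of such a timing loop is roughly this (a sketch, not the actual playground code):

import Dispatch

// Time a closure and report the best of a few runs, in seconds.
func benchmark(runs: Int = 10, _ body: () -> Void) -> Double {
    var best = Double.infinity
    for _ in 0..<runs {
        let start = DispatchTime.now().uptimeNanoseconds
        body()
        let end = DispatchTime.now().uptimeNanoseconds
        best = min(best, Double(end - start) / 1e9)
    }
    return best
}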

Earlier tonight I heard that Jupyter has Swift support now (or will in the near future, I don’t recall which). You might be able to do something with that.

1 Like

Googling "jupyter swift" brought up this: GitHub - google/swift-jupyter

2 Likes

I am waiting on a PR for Swift for TensorFlow. There is an issue with their Python 3 scripts for setting up Swift. But when that is available, it should be useful.

There is also an LSP implementation (GitHub - apple/sourcekit-lsp: Language Server Protocol implementation for Swift and C-based languages), which I am going to try with VS Code (likely this evening).

1 Like