Is Swift really not performant enough for realtime audio?

I wanted to create plugins for Renoise on macOS, and planned to use Audio Units, written in Swift, but Apple says nope:

...the plug-in’s digital signal processing (DSP) logic [...] is written in C++ to ensure real-time safety. Because Swift can’t talk directly to C++, the sample project also includes an Objective-C++ adapter class [...] to act as an intermediary.

This pattern of using C++ on the audio thread, with some kind of glue code, so we can still use Swift for the GUI, is repeated elsewhere. Apple doesn't want us to use Swift for realtime audio.

I don't understand why this is. If Swift has deterministic memory management and high throughput, writing DSP logic in Swift should be perfectly workable.

Am I missing something?


It’s tricky. Swift is deterministic, and can generate code equivalent to C++ realtime code…but unlike C++, there are a bunch of ways to implicitly do something that isn't real-time-safe (a short sketch follows the list):

  • All generics (including opaque "some" types) are effectively virtual dispatch that an optimized build may or may not devirtualize
  • All "any" existential types (plus Any itself) formally require use of the runtime
  • Anything that uses classes, or is implemented using classes (like Array or String), is likely to implicitly do reference counting unless the compiler can prove the value can’t be deallocated. This isn’t as bad as heap allocation or arbitrary runtime calls, but it’s still an atomic operation on the CPU.
  • All thrown errors are boxed on the heap
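
To make those pitfalls concrete, here is a minimal sketch (mine, not from the post above) of a render routine that looks harmless but trips several of the items in the list:

    // Innocent-looking Swift that is not real-time-safe.
    final class Reverb {
        var mix: Float = 0.3
    }

    func render(_ buffer: UnsafeMutableBufferPointer<Float>, reverb: Reverb) {
        // Touching a class instance can emit retain/release traffic (atomic ops)
        // unless the optimizer proves the object can't be deallocated here.
        let mix = reverb.mix

        // Growing an Array inside the callback heap-allocates (and reallocates).
        var wet: [Float] = []
        for sample in buffer {
            wet.append(sample * mix)
        }

        // String interpolation and print also allocate; fine elsewhere,
        // but not on the audio thread.
        print("rendered \(wet.count) samples")
    }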

So while you can get real-time code out of the Swift compiler, the language is almost actively working against that actually happening, because that’s the right trade-off in other circumstances. Therefore, Apple formally considers Swift not real-time safe, and we hope some day there will be a way to mark certain functions or ranges of code as “real-time safe only”, to protect against mistakes and also instruct the compiler to change what code it emits.


It's not about performance, it's about latency (delay) in the worst-case scenario. For example, if you were to use the quicksort algorithm in a real-time system, you'd have to assume quicksort's worst-case time complexity, which is O(N^2), even though on average quicksort is O(N log N).

Strictly speaking, true realtime can't be achieved on a system with virtual memory or caches unless you assume the worst-case scenario for those subsystems, effectively assuming a cache hit never happens. Thankfully, for audio, strict realtime isn't needed: an occasional glitch, e.g. because you've launched too many apps and virtual memory starts struggling, is typically acceptable.
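
For a sense of the budget involved (my own numbers, not from the posts above): at a typical buffer size the render callback only has a couple of milliseconds, and the worst-case path, not the average one, has to fit inside it:

    // Back-of-the-envelope callback deadline.
    let sampleRate = 48_000.0       // Hz
    let bufferFrames = 128.0        // frames per render callback
    let deadlineMs = bufferFrames / sampleRate * 1000   // ≈ 2.67 ms
    // Every worst-case path through the callback must finish within this budget,
    // or the output glitches.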

You can use Swift for realtime audio: Delivering an Exceptional Audio Experience, WWDC 2016 (relevant bits: 29:00 - 32:40, 37:50 - 42:20), but you are very limited in what you can do, and it's a bit like walking through a minefield. More details in this thread.
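
The restricted style usually boils down to doing all setup off the audio thread and keeping the render path free of classes, growth, and runtime machinery. A rough sketch (mine, not from the session):

    // Pre-allocate everything on a non-realtime thread...
    struct GainState {
        var gain: Float
        var scratch: UnsafeMutableBufferPointer<Float>   // allocated up front
    }

    func makeState(maxFrames: Int) -> GainState {
        let scratch = UnsafeMutableBufferPointer<Float>.allocate(capacity: maxFrames)
        scratch.initialize(repeating: 0)
        return GainState(gain: 0.5, scratch: scratch)
    }

    // ...so the render path only does arithmetic over memory that already exists:
    // no classes, no Array growth, no String, no throwing.
    func renderBlock(_ out: UnsafeMutableBufferPointer<Float>, state: GainState) {
        for i in out.indices {
            out[i] = state.scratch[i] * state.gain
        }
    }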


Thank you @jrose and @tera. That's a shame, but it makes a lot more sense now. Much appreciated.

I came to Swift from web development, so I still have a lot to learn.


If you want to live on the cutting edge, there are experimental performance annotations you can put on a function. @_noAllocation func foo(...) will cause the compiler to raise an error if foo does anything that may potentially lead to a heap allocation. One of the intentions of this feature is that it can be used to annotate realtime media-processing code in order to make it safe to write in Swift, with a strong guarantee that it doesn't allocate.
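
For illustration, a minimal sketch (mine, using the underscored, unstable spelling from this thread) of what that looks like on a DSP-style function; it needs the experimental flag discussed a few posts down:

    @_noAllocation
    func applyGain(_ buffer: UnsafeMutableBufferPointer<Float>, gain: Float) {
        // Plain loops over raw buffers compile fine here; appending to an Array,
        // interpolating a String, or throwing would be rejected at compile time.
        for i in buffer.indices {
            buffer[i] *= gain
        }
    }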


@Joe_Groff - That's very interesting. Thank you.

If you want to live on the cutting edge...

I'm looking to write audio plugins that basically personalize a DAW. The DAW supports Lua scripting, but I want to do some DSP, which requires a VST or AU plugin that can run on the audio thread. I could see myself blogging about it and sharing code, but not shipping my own plugins as a product (they're essentially performant scripts), so it's an ideal place to try out experimental features. I'll look into it some more. Thanks again.

@Joe_Groff - Sorry to be a pain. Have you used -experimental-performance-annotations inside Xcode before? I added the flag to the Compiler Flags column in Build Phases | Compile Sources (for the file containing the @_noLocks and @_noAllocation annotations), but Xcode keeps complaining:

Use -experimental-performance-annotations to enable performance annotations

That column may be for Clang compiler flags. Additional Swift flags go in Build Settings, under Other Swift Flags.
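
For example, an xcconfig-style sketch (mine, not from the thread; adjust to your setup):

    // In an .xcconfig, or equivalently in the target's Other Swift Flags:
    OTHER_SWIFT_FLAGS = -experimental-performance-annotations
    // If your toolchain's driver rejects the flag, pass it to the frontend instead:
    // OTHER_SWIFT_FLAGS = -Xfrontend -experimental-performance-annotations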


Ahh. Thank you. You're awesome!

Edit: Got it working now. Thanks again, @Joe_Groff.