Realtime threads with Swift

Not to overwhelm you with choices, but there might also be a middle approach here: if we emit the diagnostics during IRGen, at the point we try to emit a call to a realtime-unsafe runtime function, then we should be close enough to the SIL instruction that triggered the emission to get diagnostic location info from it.


@liscio ... coming late to the party and not adding much except a +1 to your comment. I use Swift to program microcontrollers (Swift for Arduino) and have experienced exactly this pain. MCUs are basically hard realtime environments. The 'microswift' I made has a super trimmed stdlib and almost no runtime; it has (almost) no heap allocated types, no ARC, no classes, no closures (except convention(c) function callbacks). It was the only way I could get it to work, really. I still struggle with Swift unexpectedly emitting loads of unwanted RTTI-style 'metadata'. All that said, it's been rewarding and it's usable... but the complaint I keep hearing is basically a polite version of what you said... 'this doesn't even closely resemble the Swift I know!'


actually that's not true now, that was official Apple policy until 2016.

compare this:
to this:

i don't know if anything material changed between 2015 and 2016, but my understanding is that for the last 5 years apple no longer recommends against swift for realtime audio. (just remember not to do anything beyond reading memory, writing memory and math.)


Creating a new AU target with the latest version of Xcode uses C++ for the DSP code.

Anyway, I think we're all in agreement that Swift isn't currently good for realtime because of its dynamic allocation behavior.

really, watch that WWDC 2016 Session 507 video if you can still find it (or ask someone to send you the relevant snapshot): i remember it very well - a square wave generator with a render proc written in pure swift. if Apple's Doug Wyatt says it's good - that sounds good to me.

also this: apple systems are not hard realtime. if a page fault occurs, nothing will stop the audio glitch. and this can happen regardless of whether the language you use is C or swift.

you can measure how many glitches, say, per hour you are actually getting - the inputProc/renderProc has a timestamp parameter, and if the previous timestamp's mSampleTime + numSamples doesn't match the current mSampleTime - that's a glitch. do two barebones implementations of, say, a square wave generator, one in C, another in swift, and compare the actual results on a couple of platforms. i wouldn't be surprised if you get no glitches in either implementation. or if you get a few glitches per hour - again, in both implementations.
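A minimal sketch of that glitch check, assuming the real `mSampleTime` field of `AudioTimeStamp`; the type and method names here are otherwise invented for illustration:

```swift
import AudioToolbox

// Sketch: count dropouts by checking that consecutive render callbacks are
// contiguous in sample time. check() does no allocation or locking, so it
// should be safe to call from the realtime thread.
final class GlitchCounter {
    private var expectedSampleTime: Float64?
    private(set) var glitchCount = 0

    // Call at the top of the input/render proc with the callback's
    // timestamp and frame count.
    func check(_ timestamp: UnsafePointer<AudioTimeStamp>,
               _ frameCount: AUAudioFrameCount) {
        let current = timestamp.pointee.mSampleTime
        if let expected = expectedSampleTime, expected != current {
            glitchCount += 1 // previous sample time + frames didn't line up
        }
        expectedSampleTime = current + Float64(frameCount)
    }
}
```

Reading `glitchCount` once an hour (from a non-realtime thread) gives the comparison figure for the C vs. Swift versions.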

take Apple's advice of 6-10 years ago with a grain of salt, especially given they stopped recommending against swift for realtime audio 5 years ago. obviously i am not saying it's good to use swift containers, or async dispatch, or mutexes, etc... just read memory / write memory and math.

Swift would be fine for your single oscillator example which will not put much stress on the system. In my app, for example, users are often pushing it to the limit, and if some dynamic allocations snuck into the audio thread, that would increase the probability of a glitch.

so, in those bare bones tests just put additional processing to push processing to its limits:

func renderProc(...) {
    for i in 0 ... N {
        // some silly no-op here
    }
}

and choose N so you spend, say, 70% of the allowed time, which is IOSize / sampleRate.


That's also not a good test, because doing that no-op will not allocate. Part of the problem here is that it's harder to predict in Swift what will allocate, hence the impetus to have some validation.

allocations are easy to avoid: just do not call anything but memmove and math.
and it is equally easy to "validate" by counting those "glitches per hour", if any.

one practical problem you may encounter - it looks like you already have all that massive amount of kernel code in C... yes, you can call that code from swift's input/render proc, but... is that important?

found the link, interesting bits are around 0:38:

compare and contrast to 2015 video, around 0:49: Audio Unit Extensions - WWDC15 - Videos - Apple Developer


ideally there must be an ability to denote functions / closures / etc with a marker, say "@realtime". then it's no longer guesswork or a question of following or not following wwdc guidelines - the compiler will do the relevant check and either issue an error or refrain from using unsafe constructs in realtime functions. i wonder if there are precedents for this in other languages.
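To illustrate, here is what such a marker might look like - purely hypothetical, since no `@realtime` attribute exists in Swift, and the diagnostics in the comments are invented:

```swift
// HYPOTHETICAL: @realtime is not a real Swift attribute. The idea is that
// inside a @realtime function the compiler would reject anything that can
// allocate, lock, or otherwise block for an unbounded time.
@realtime
func render(_ buffer: UnsafeMutablePointer<Int16>, _ frameCount: Int) {
    for i in 0 ..< frameCount {
        buffer[i] = 0            // ok: plain memory write + math
    }
    // let samples = [Int16]()   // would be an error: heap allocation
    // lock.lock()               // would be an error: blocking call
}
```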

(btw, using C or C++ doesn't automatically mean it's realtime safe. there are many things in there that are not (malloc, mutexes, semaphores, STL containers, smart pointers, etc); it's just that the language itself is smaller / simpler and its runtime is more simple and predictable.)


See above my attempts to implement @realtime as a LLVM pass: Realtime threads with Swift - #34 by audulus


Right, and that's the point: have the compiler validate since it's harder to do it manually in Swift.

I've seen so much realtime unsafe audio code out in the field that I think having compiler validation would be very helpful.


yep, a @realtime keyword looks like a step in the right direction to me. once/if this is baked into the compiler, swift can become a truly realtime-safe language indeed. until this is done, the situation is on par with how realtime programming is done in other languages: carefully avoid certain api calls and language constructs. in the case of swift, things to avoid would be classes (arc), containers, escaping closures, and many other things. and carefully check the resulting asm for anything suspicious. here is an example of a simple square wave generator and the corresponding intel asm (i used @inline(never) to avoid unwanted loop unrolling, to keep the asm simple):

import Foundation
import AudioToolbox
import AVFoundation

func main() {
    let unitDesc = AudioComponentDescription(componentType: kAudioUnitType_Output, componentSubType: kAudioUnitSubType_HALOutput, componentManufacturer: kAudioUnitManufacturer_Apple, componentFlags: 0, componentFlagsMask: 0)
    let unit = try! AUAudioUnit(componentDescription: unitDesc, options: [])
    let hardwareFormat = unit.outputBusses[0].format
    let renderFormat = AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: hardwareFormat.sampleRate, channels: 1, interleaved: false)!
    try! unit.inputBusses[0].setFormat(renderFormat)
    unit.outputProvider = renderProc
    try! unit.allocateRenderResources()
    try! unit.startHardware()
}

func renderProc(actionFlags: UnsafeMutablePointer<AudioUnitRenderActionFlags>, timestamp: UnsafePointer<AudioTimeStamp>, frameCount: AUAudioFrameCount, inputBusNumber: Int, inputData: UnsafeMutablePointer<AudioBufferList>) -> AUAudioUnitStatus {
    let ptr = inputData.pointee.mBuffers.mData.unsafelyUnwrapped.assumingMemoryBound(to: Int16.self)
    var i: Int = 0
    while i < Int(frameCount) {
        i = proc(ptr, i)
    }
    return 0
}
//    0x100003c30 <+0>:  pushq  %rbp
//    0x100003c31 <+1>:  movq   %rsp, %rbp
//    0x100003c34 <+4>:  pushq  %r15
//    0x100003c36 <+6>:  pushq  %r14
//    0x100003c38 <+8>:  pushq  %rbx
//    0x100003c39 <+9>:  pushq  %rax
//    0x100003c3a <+10>: movl   %edx, %r14d
//    0x100003c3d <+13>: movq   0x10(%r8), %rbx
//    0x100003c41 <+17>: movl   %edx, %r15d
//    0x100003c44 <+20>: xorl   %eax, %eax
//    0x100003c46 <+22>: jmp    0x100003c5b               ; <+43> at main.swift:21:13
//    0x100003c48 <+24>: nopl   (%rax,%rax)
//    0x100003c50 <+32>: movq   %rbx, %rdi
//    0x100003c53 <+35>: movq   %rax, %rsi
//    0x100003c56 <+38>: callq  0x100003cf0               ; audio.proc(...) -> Swift.Int at main.swift:29
//    0x100003c5b <+43>: testq  %rax, %rax
//    0x100003c5e <+46>: js     0x100003c50               ; <+32> at main.swift:22:13
//    0x100003c60 <+48>: testl  %r14d, %r14d
//    0x100003c63 <+51>: setne  %cl
//    0x100003c66 <+54>: testq  %rax, %rax
//    0x100003c69 <+57>: setne  %dl
//    0x100003c6c <+60>: cmpq   %r15, %rax
//    0x100003c6f <+63>: jge    0x100003c75               ; <+69> at main.swift:24:5
//    0x100003c71 <+65>: orb    %dl, %cl
//    0x100003c73 <+67>: jne    0x100003c50               ; <+32> at main.swift:22:13
//    0x100003c75 <+69>: xorl   %eax, %eax
//    0x100003c77 <+71>: addq   $0x8, %rsp
//    0x100003c7b <+75>: popq   %rbx
//    0x100003c7c <+76>: popq   %r14
//    0x100003c7e <+78>: popq   %r15
//    0x100003c80 <+80>: popq   %rbp
//    0x100003c81 <+81>: retq

func proc(_ ptr: UnsafeMutablePointer<Int16>, _ i: Int) -> Int {
    let n = (((i >> 7) & 1) << 12) - 0x800
    ptr[i] = Int16(truncatingIfNeeded: n)
    return i &+ 1
}
//    0x100003cf0 <+0>:  pushq  %rbp
//    0x100003cf1 <+1>:  movq   %rsp, %rbp
//    0x100003cf4 <+4>:  movl   %esi, %eax
//    0x100003cf6 <+6>:  shll   $0x5, %eax
//    0x100003cf9 <+9>:  andl   $0x1000, %eax             ; imm = 0x1000
//    0x100003cfe <+14>: addl   $0xfffff800, %eax         ; imm = 0xFFFFF800
//    0x100003d03 <+19>: movw   %ax, (%rdi,%rsi,2)
//    0x100003d07 <+23>: leaq   0x1(%rsi), %rax
//    0x100003d0b <+27>: popq   %rbp
//    0x100003d0c <+28>: retq


Some interesting discussion. Chipping in as the voice of someone a bit at the coalface of real-time (I think): adding an attribute appeals to me, a bit like @convention(c), which I often use in my API. I think I'll have more to say after I've had time to play with adding this pass in my compiler (having trouble building on my Mac M1). It still feels like a big ask to add to the mainline compiler at this early stage, though. It's OK for custom compiler builds like my forks. I think the need is real, but we might find a more complete engineering approach in due course that makes more sense to add to mainline. I want something the WWDC engineers would be comfortable introducing in "what's new in Swift"! :slight_smile:

Another approach might be adding a command line flag to the compiler like --enforce-real-time=true that indicates intent. At first it could add an LLVM pass similar to the @realtime above, to make sure only the subset of the runtime marked as "real-time safe" (i.e. predictable, stable and proportionate in execution time... O(1)-ish) was allowed, and anything else caused a compiler error. Later that flag could be expanded to disable COW somehow, maybe load a slightly different standard library for your target, etc. It would have the advantage of being a catch-all, and you would know when you're compiling whether a file is intended to produce hard realtime code or not.

For me, I think the fundamental question is "how good is Swift at writing bare-metal, systems-type code?"... where you're comparing to C or Rust. I would love the answer to be "just as good, in realistic use". So if someone wants to write Linux device drivers in Swift and they have never touched iOS or Macs or even UI code themselves, it's still a natural and painless choice... they gain the defined behaviour and modern syntax of Swift and much of the clean architecture of the Swift standard library... and their code is more accessible to a new generation of coders.

I meant this post to be short... typical me. :slight_smile:


with @realtime or a similar construct it would not be just "as good as C"... it would be better! e.g. one would be able to use malloc, or a mutex lock, or dispatch async, etc. in C, but not in a realtime swift function...

(footgun is always possible though)
while someCondition {
   // ... some long loop here where you spend more than the allowed time ...
}

@tera late comer to this thread... .

I believe the "no Swift for Audio" is still policy. Probably the most recent sign of this is the compiler error you get if you want to use the new (since iOS 14) Audio Workgroups in Swift.

Yeah. I also have it on good authority that it's still the case. Wish I could say more.

Hi @audulus. I'm patching my local tree with your llvm pass, as it's useful info. Great work!

Looking at Comparing apple:main...audulus:realtime · apple/swift · GitHub, I saw commits from 4f8f6b9 to 3ef8bd8. I put them into a diff, then tried to apply it to my tree. They were for 5.5 or newer and I'm still on 5.3, so I had to do a bit of tweaking, but I got it working.

I'm getting output like...

main.swift:26:9: warning: variable 'timer' was never mutated; consider changing to 'let' constant
    var timer = timers[timerNumber]
    ~~~ ^
Validating function main.testAllocations() -> () for realtime safety.
Function main.testAllocations() -> () contains swift runtime calls.

Which is a great start. Your question to @Joe_Groff on June 4th was: "How can I report nice errors/warnings to the user from my LLVM pass (including source line numbers)?"

Did you make any progress with his suggestions for adding it to diag so I can report a proper warning or error to the user/developer?

I couldn't see that in the tree I'm looking at?



I'm trying to work out how to output diagnostics, e.g.

F.getParent()->getContext().Diags.diagnose(SourceLoc(), diag::realtime_detected, name);

but I can't make progress. Is it possible to output diagnostics from LLVM passes, or is it too degenerate?

(specifically, in other places I think it's normal to use the ASTContext but I suspect it's not available in LLVM passes?)