Compiler crash when capturing C++ function parameter in a block

The following code crashes the compiler:

C++ part:

typedef int (^SambleBlock)(AURenderPullInputBlock pullInputBlock);

SambleBlock makeRenderBlock(int y) {
    return ^int(AURenderPullInputBlock pullInputBlock) {
        return y;
    };
}

Swift part:

func make() {
    _ = makeRenderBlock(1)
}

On the other hand, this code works fine:

SambleBlock makeRenderBlock(int y) {
    return ^int(AURenderPullInputBlock pullInputBlock) {
        return 1;
    };
}

Interestingly, this also works:

typedef int (^SambleBlock)(int a);

SambleBlock makeRenderBlock(int y) {
    return ^int(int a) {
        return y;
    };
}

I am not an experienced C++ programmer, so I might be missing something obvious here.

This compiles fine for me (Xcode 15.1, Swift 5.9.2).

Interesting, I have the same toolchain.

Can you try compiling this project:

I also see now that the behaviour is somewhat random. Without any code changes, it sometimes compiles and sometimes does not:

This also crashes for me.

Then I created a .m (or .mm) file and moved "makeRenderBlock" there; with this change it compiles fine.
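A sketch of that workaround, assuming hypothetical file names SampleBlocks.h and SampleBlocks.mm: only the typedef and a plain function declaration stay in the header that Swift imports, while the block-returning definition moves into the .mm, out of reach of the C++ importer.

```objc
// SampleBlocks.h - visible to Swift; declaration only
typedef int (^SambleBlock)(AURenderPullInputBlock pullInputBlock);
SambleBlock makeRenderBlock(int y);

// SampleBlocks.mm - the definition lives here
#import "SampleBlocks.h"

SambleBlock makeRenderBlock(int y) {
    return ^int(AURenderPullInputBlock pullInputBlock) {
        return y;
    };
}
```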

BTW, are you using C++ "significantly" (with Swift having to call some C++ functionality)? This is not the case here (note that ^blocks are Obj-C), but as I understand it, this is simplified code. On many of my previous projects I used Obj-C interop only, keeping the C++ in .mm files.

Thanks! That sounds like a reasonable workaround.

note that ^blocks are Obj-C
Aren't they technically C, as mentioned here?

BTW, are you using C++ "significantly" (with Swift having to call some C++ functionality)?

I was interested in the feasibility of using C++ interoperability to avoid the AudioUnit Objective-C boilerplate, but also as an intermediate step towards theoretical Swift-only audio development.

Blocks are a "non-standard" Apple extension to C, a feature that was ported back from Objective-C.

From my past experience, even if you don't do audio completely in Swift, the interop required is basically Swift <-> C (perhaps with blocks), and obviously C has no problem talking to C++. I'd still put that C++ into a .mm file to make life easier, but otherwise you don't have to use Obj-C "@interface" or "@implementation", or call objects via Obj-C bracket syntax, etc., if that's what you wanted to avoid.
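A rough sketch of that arrangement (all names here are hypothetical, not from the thread): the C++ stays private to the .mm/.cpp file, and a single extern "C" function is exposed in the header, which Swift can then call without ever seeing C++.

```cpp
#include <numeric>
#include <vector>

// C++ implementation detail: stays inside the .mm/.cpp file.
static float sum_vec(const std::vector<float>& v) {
    return std::accumulate(v.begin(), v.end(), 0.0f);
}

// C-callable wrapper: this signature is all that the header
// (and therefore Swift) needs to see.
extern "C" float process_samples(const float* samples, int count) {
    std::vector<float> v(samples, samples + count);
    return sum_vec(v);
}
```

From Swift this would surface through the bridging header as an ordinary C function taking an `UnsafePointer<Float>` and a count.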

For that you may find this thread and this WWDC video (37:50 ... 42:25) interesting.

Thank you. Yeah, it is possible, but it requires an additional level of indirection, since you can only call C functions from Swift.
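That indirection can be sketched like this (hypothetical names, not from the thread): a C function-pointer type that a Swift `@convention(c)` closure can satisfy, plus a C entry point that invokes it with a context pointer.

```cpp
// C-style callback type; on the Swift side a @convention(c)
// closure converts to exactly this shape.
typedef int (*RenderCallback)(void* context, int frameCount);

// C entry point that the C/C++ side calls back through.
extern "C" int driveRender(RenderCallback cb, void* context, int frames) {
    return cb(context, frames);
}
```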

Out of interest, do you use Swift for realtime code yourself?

Yes, one of my audio engines was in pure Swift, including the "realtime" portion (inputProc / renderProc and whatever they call in turn). I tend to minimise the amount of code spent within the realtime part; typically it's just filling a ring buffer with the samples received from the mic, or reading samples out of a ring buffer to send to the speaker, although depending on the task at hand you'd have to do more, e.g. in another audio engine we put the filtering pipeline inside the realtime part.

While there could be modern "formal" ways to ensure the code doesn't do anything funny like allocating memory or taking a lock (@noAllocations, @noLocks), in my case I just "manually" avoided using anything prohibited. As the wise man in the referenced WWDC video tells you, you can only read memory, write memory and do math †, and that's regardless of the language you use.

(† Bear in mind that macOS or iOS are not hard realtime systems, so there could be audio glitches if you, say, read memory in your renderProc, the memory that happens to be paged out.)
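The ring-buffer pattern described above can be sketched as a single-producer/single-consumer queue (illustrative C++, not code from the thread): push and pop never allocate or take a lock, which is what keeps them usable from a realtime callback.

```cpp
#include <atomic>
#include <cstddef>

// Minimal SPSC ring buffer sketch: one producer thread (e.g. the mic
// input callback) pushes, one consumer thread (the render callback)
// pops. Capacity is N - 1; when full, push drops the sample rather
// than blocking.
template <size_t N>
struct RingBuffer {
    float data[N];
    std::atomic<size_t> head{0}, tail{0};

    bool push(float x) {  // producer thread only
        size_t h = head.load(std::memory_order_relaxed);
        size_t next = (h + 1) % N;
        if (next == tail.load(std::memory_order_acquire)) return false;  // full
        data[h] = x;
        head.store(next, std::memory_order_release);
        return true;
    }

    bool pop(float& x) {  // consumer (render) thread only
        size_t t = tail.load(std::memory_order_relaxed);
        if (t == head.load(std::memory_order_acquire)) return false;  // empty
        x = data[t];
        tail.store((t + 1) % N, std::memory_order_release);
        return true;
    }
};
```

The same structure translates directly to Swift using raw buffers and atomics, as long as the realtime side sticks to reading, writing and math.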
